Paper Title


RankNEAT: Outperforming Stochastic Gradient Search in Preference Learning Tasks

Authors

Kosmas Pinitas, Konstantinos Makantasis, Antonios Liapis, Georgios N. Yannakakis

Abstract


Stochastic gradient descent (SGD) is a premium optimization method for training neural networks, especially for learning objectively defined labels such as image objects and events. When a neural network is instead faced with subjectively defined labels--such as human demonstrations or annotations--SGD may struggle to explore the deceptive and noisy loss landscapes caused by the inherent bias and subjectivity of humans. While neural networks are often trained via preference learning algorithms in an effort to eliminate such data noise, the de facto training methods rely on gradient descent. Motivated by the lack of empirical studies on the impact of evolutionary search on the training of preference learners, we introduce the RankNEAT algorithm, which learns to rank through neuroevolution of augmenting topologies. We test the hypothesis that RankNEAT outperforms traditional gradient-based preference learning within the affective computing domain, in particular predicting annotated player arousal from the game footage of three dissimilar games. RankNEAT yields superior performances compared to the gradient-based preference learner (RankNet) in the majority of experiments, since its architecture optimization capacity acts as an efficient feature selection mechanism, thereby eliminating overfitting. Results suggest that RankNEAT is a viable and highly efficient evolutionary alternative to preference learning.
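To make the pairwise preference-learning setup concrete, the sketch below shows a RankNet-style pairwise logistic loss: given model scores for a preferred and a non-preferred item, the probability that the preferred item outranks the other is modeled as a sigmoid of the score difference. This is a minimal illustration under our own assumptions, not the paper's implementation; the scores here are hypothetical placeholders for any model's output (in RankNEAT's case, the score would come from an evolved network and the loss would serve as a fitness signal rather than a gradient target).

```python
import numpy as np

def pairwise_logistic_loss(score_preferred, score_other):
    """RankNet-style pairwise loss.

    Models P(preferred beats other) = sigmoid(score_preferred - score_other)
    and returns the negative log-likelihood of that preference:
        -log(sigmoid(d)) = log(1 + exp(-d)),  where d = score_preferred - score_other.
    A large positive margin d gives a loss near 0; a violated preference
    (d < 0) is penalized increasingly heavily.
    """
    d = score_preferred - score_other
    return np.log1p(np.exp(-d))

# Hypothetical scores for one annotated preference pair:
loss_respected = pairwise_logistic_loss(2.0, 0.5)  # model agrees with the annotator
loss_violated = pairwise_logistic_loss(0.5, 2.0)   # model contradicts the annotator
```

Because the loss depends only on score *differences*, it is invariant to the absolute scale of the annotations, which is precisely why preference (ordinal) learning is used to absorb subjective rating noise.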
