Paper Title

Meta-Wrapper: Differentiable Wrapping Operator for User Interest Selection in CTR Prediction

Paper Authors

Tianwei Cao, Qianqian Xu, Zhiyong Yang, Qingming Huang

Paper Abstract

Click-through rate (CTR) prediction, whose goal is to predict the probability that a user will click on an item, has become increasingly significant in recommender systems. Recently, deep learning models that automatically extract a user's interest from his/her behaviors have achieved great success. In these works, an attention mechanism is used to select the items the user is interested in from the historical behaviors, improving the performance of the CTR predictor. Normally, these attentive modules can be jointly trained with the base predictor by gradient descent. In this paper, we regard user interest modeling as a feature selection problem, which we call user interest selection. For this problem, we propose a novel approach under the framework of the wrapper method, named Meta-Wrapper. More specifically, we use a differentiable module as our wrapping operator and then recast its learning problem as a continuous bilevel optimization. Moreover, we use a meta-learning algorithm to solve this optimization and theoretically prove its convergence. Meanwhile, we provide theoretical analysis showing that our proposed method 1) makes wrapper-based feature selection more efficient and 2) achieves better resistance to overfitting. Finally, extensive experiments on three public datasets demonstrate the superiority of our method in boosting the performance of CTR prediction.
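
The abstract names two technical ingredients: a differentiable wrapping operator for selecting behaviors of interest, and a bilevel (meta-learning) formulation of its training. As a reading aid, the generic bilevel problem it alludes to can be written as follows; the notation here is ours and may differ from the paper's:

```latex
\min_{\phi}\ \mathcal{L}_{\mathrm{val}}\bigl(\theta^{*}(\phi),\ \phi\bigr)
\quad\text{s.t.}\quad
\theta^{*}(\phi) \in \arg\min_{\theta}\ \mathcal{L}_{\mathrm{train}}(\theta,\ \phi),
```

where \theta denotes the base CTR predictor's parameters and \phi those of the wrapping operator. The sketch below, assuming PyTorch, shows one plausible shape of such an operator together with a first-order meta-update; all class and function names (DifferentiableWrapper, meta_step) are illustrative and are not the authors' released code.

```python
import torch
import torch.nn as nn

class DifferentiableWrapper(nn.Module):
    """Soft (differentiable) selection over historical behaviors:
    scores each behavior against the target item and pools the
    behaviors with sigmoid weights in [0, 1]."""
    def __init__(self, emb_dim: int):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * emb_dim, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, behaviors: torch.Tensor, target: torch.Tensor):
        # behaviors: (B, T, D) embeddings of past items; target: (B, D)
        t = target.unsqueeze(1).expand_as(behaviors)
        logits = self.scorer(torch.cat([behaviors, t], dim=-1)).squeeze(-1)
        weights = torch.sigmoid(logits)                    # (B, T) soft mask
        return (weights.unsqueeze(-1) * behaviors).sum(1)  # (B, D) interest

def meta_step(wrapper, predictor, inner_opt, outer_opt, train_batch, val_batch):
    """One first-order bilevel update: fit the predictor (inner, theta)
    on a training batch, then update the wrapper (outer, phi) on a
    held-out batch. inner_opt optimizes predictor parameters only;
    outer_opt optimizes wrapper parameters only."""
    bce = nn.BCEWithLogitsLoss()

    beh, tgt, y = train_batch  # inner step on theta
    inner_opt.zero_grad()
    bce(predictor(torch.cat([wrapper(beh, tgt), tgt], -1)).squeeze(-1), y).backward()
    inner_opt.step()

    beh, tgt, y = val_batch    # outer step on phi
    outer_opt.zero_grad()
    bce(predictor(torch.cat([wrapper(beh, tgt), tgt], -1)).squeeze(-1), y).backward()
    outer_opt.step()
```

Note that this first-order variant takes the inner step literally rather than differentiating through it, a common simplification; the paper's convergence guarantees apply to its own meta-learning algorithm, not to this sketch.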
