Paper Title


Provably Training Overparameterized Neural Network Classifiers with Non-convex Constraints

Paper Authors

You-Lin Chen, Zhaoran Wang, Mladen Kolar

Paper Abstract


Training a classifier under non-convex constraints has received increasing attention in the machine learning community thanks to its wide range of applications, such as algorithmic fairness and class-imbalanced classification. However, several recent works addressing non-convex constraints have focused only on simple models such as logistic regression or support vector machines. Neural networks, one of the most popular models for classification nowadays, have been excluded from these analyses and lack theoretical guarantees. In this work, we show that overparameterized neural networks can achieve a near-optimal and near-feasible solution to non-convex constrained optimization problems via projected stochastic gradient descent. Our key ingredient is a no-regret analysis of online learning for neural networks in the overparameterization regime, which may be of independent interest in online learning applications.
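The abstract's core algorithmic tool is projected stochastic gradient descent: take a stochastic gradient step, then project the iterate back onto the feasible set. The sketch below is not the paper's method; it is a generic, minimal illustration on a hypothetical toy problem (least-squares loss with an L2-norm-ball constraint standing in for the paper's non-convex constraints), with all names and parameters invented for this example.

```python
import numpy as np

# Hypothetical toy setup: linear regression data with a norm-ball constraint
# ||w||_2 <= radius, used here only to illustrate the project-after-step pattern.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ w_true + 0.1 * rng.normal(size=200)

def project_onto_ball(w, radius):
    """Euclidean projection onto the L2 ball of the given radius."""
    norm = np.linalg.norm(w)
    return w if norm <= radius else w * (radius / norm)

def projected_sgd(X, y, radius=2.0, lr=0.01, epochs=50):
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            # Stochastic gradient of the per-sample loss 0.5 * (x_i . w - y_i)^2
            grad = (X[i] @ w - y[i]) * X[i]
            # Gradient step followed by projection keeps every iterate feasible
            w = project_onto_ball(w - lr * grad, radius)
    return w

w_hat = projected_sgd(X, y, radius=2.0)
print(np.linalg.norm(w_hat) <= 2.0 + 1e-9)  # feasibility of the final iterate
```

In the paper's setting, the loss comes from an overparameterized neural network and the constraints are non-convex, so the projection step and the convergence analysis are substantially more involved than this convex stand-in suggests.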
