Paper Title

From Deep to Shallow: Equivalent Forms of Deep Networks in Reproducing Kernel Krein Space and Indefinite Support Vector Machines

Paper Authors

Alistair Shilton, Sunil Gupta, Santu Rana, Svetha Venkatesh

Paper Abstract

In this paper we explore a connection between deep networks and learning in reproducing kernel Krein space. Our approach is based on the concept of push-forward - that is, taking a fixed non-linear transform on a linear projection and converting it to a linear projection on the output of a fixed non-linear transform, pushing the weights forward through the non-linearity. Applying this repeatedly from the input to the output of a deep network, the weights can be progressively "pushed" to the output layer, resulting in a flat network that has the form of a fixed non-linear map (whose form is determined by the structure of the deep network) followed by a linear projection determined by the weight matrices - that is, we take a deep network and convert it to an equivalent (indefinite) kernel machine. We then investigate the implications of this transformation for capacity control and uniform convergence, and provide a Rademacher complexity bound on the deep network in terms of Rademacher complexity in reproducing kernel Krein space. Finally, we analyse the sparsity properties of the flat representation, showing that the flat weights are (effectively) Lp-"norm" regularised with 0<p<1 (bridge regression).
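As a worked illustration of the push-forward idea, here is a minimal numerical sketch assuming a single hidden layer with a quadratic activation sigma(u) = u^2 (the activation choice, dimensions, and variable names are illustrative assumptions, not taken from the paper). Since (w_i^T x)^2 = (w_i ⊗ w_i)^T (x ⊗ x), the weights can be pushed through the non-linearity, leaving a fixed non-linear map phi(x) = x ⊗ x followed by a linear projection:

```python
import numpy as np

# Minimal push-forward sketch (assumptions: one hidden layer, quadratic
# activation sigma(u) = u**2; all names and sizes are illustrative).
rng = np.random.default_rng(0)
d, m = 3, 4                      # input dimension, hidden width
W = rng.standard_normal((m, d))  # hidden-layer weight matrix
v = rng.standard_normal(m)       # output-layer weights
x = rng.standard_normal(d)       # a single input

# Deep form: linear projection, then fixed non-linearity, then projection.
deep = v @ (W @ x) ** 2

# Flat form: push the weights forward through the non-linearity.
# (w_i^T x)^2 = (w_i kron w_i)^T (x kron x), so the network becomes a fixed
# non-linear map phi(x) = x kron x followed by a linear projection whose
# flat weights theta are determined by the weight matrices W and v.
phi = np.kron(x, x)
theta = sum(v[i] * np.kron(W[i], W[i]) for i in range(m))
flat = theta @ phi

print(np.allclose(deep, flat))  # True: the two forms agree
```

In this toy case the induced kernel <phi(x), phi(x')> = (x^T x')^2 happens to be positive definite; with more general activations the flat-weight expansion can carry negative coefficients, which is (roughly) how the indefinite kernels and the Krein-space setting of the paper arise.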
