Paper Title


Controlling the Complexity and Lipschitz Constant improves polynomial nets

Authors

Zhenyu Zhu, Fabian Latorre, Grigorios G. Chrysos, Volkan Cevher

Abstract


While the class of Polynomial Nets demonstrates performance comparable to neural networks (NN), it currently has neither a theoretical generalization characterization nor robustness guarantees. To this end, we derive new complexity bounds for the set of Coupled CP-Decomposition (CCP) and Nested Coupled CP-Decomposition (NCP) models of Polynomial Nets in terms of the $\ell_\infty$-operator-norm and the $\ell_2$-operator-norm. In addition, we derive bounds on the Lipschitz constant for both models to establish a theoretical certificate for their robustness. The theoretical results enable us to propose a principled regularization scheme, which we evaluate experimentally on six datasets, showing that it improves both the accuracy and the robustness of the models to adversarial perturbations. We also showcase how this regularization can be combined with adversarial training, resulting in further improvements.
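The abstract mentions operator-norm-based regularization of CCP polynomial nets motivated by the complexity and Lipschitz bounds. Below is a minimal, hedged sketch (not the authors' code) showing one way such a penalty could be attached to the training loss of a simplified degree-N CCP-style model; the class and function names (SimpleCCP, operator_norm_penalty) and the penalty weight are hypothetical, and the recursion follows the standard CCP form x_1 = U_1^T z, x_n = (U_n^T z) * x_{n-1} + x_{n-1}.

```python
# Minimal sketch: a CCP-style polynomial net with an l2-operator-norm
# (spectral norm) penalty on its factor matrices. Hypothetical code, not
# the paper's implementation.
import torch
import torch.nn as nn


class SimpleCCP(nn.Module):
    """Degree-N CCP recursion: x_1 = U_1^T z, x_n = (U_n^T z) * x_{n-1} + x_{n-1}, y = C x_N."""

    def __init__(self, in_dim: int, hidden: int, out_dim: int, degree: int = 3):
        super().__init__()
        self.factors = nn.ParameterList(
            [nn.Parameter(torch.randn(in_dim, hidden) * 0.02) for _ in range(degree)]
        )
        self.C = nn.Linear(hidden, out_dim)

    def forward(self, z: torch.Tensor) -> torch.Tensor:
        x = z @ self.factors[0]              # x_1 = U_1^T z
        for U in self.factors[1:]:
            x = (z @ U) * x + x              # x_n = (U_n^T z) * x_{n-1} + x_{n-1}
        return self.C(x)

    def operator_norm_penalty(self) -> torch.Tensor:
        # Sum of spectral (l2-operator) norms of the factor matrices; such norms
        # appear in the paper's complexity and Lipschitz bounds.
        return sum(torch.linalg.matrix_norm(U, ord=2) for U in self.factors)


# Usage: add the penalty to the task loss with a small weight (here 1e-3, arbitrary).
model = SimpleCCP(in_dim=32, hidden=64, out_dim=10)
z, target = torch.randn(8, 32), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(z), target) + 1e-3 * model.operator_norm_penalty()
loss.backward()
```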
