Paper Title

Interpolation and Learning with Scale Dependent Kernels

Authors

Pagliana, Nicolò, Rudi, Alessandro, De Vito, Ernesto, Rosasco, Lorenzo

Abstract


We study the learning properties of nonparametric ridgeless least squares. In particular, we consider the common case of estimators defined by scale-dependent kernels, and focus on the role of the scale. These estimators interpolate the data, and the scale can be shown to control their stability through the condition number. Our analysis shows that there are different regimes depending on the interplay between the sample size, its dimension, and the smoothness of the problem. Indeed, when the sample size is less than exponential in the data dimension, the scale can be chosen so that the learning error decreases. As the sample size becomes larger, the overall error stops decreasing, but interestingly the scale can be chosen in such a way that the variance due to noise remains bounded. Our analysis combines probabilistic results with a number of analytic techniques from interpolation theory.
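The estimator class discussed in the abstract can be illustrated with a minimal sketch (this is not the authors' code; the Gaussian kernel, the toy data, and all function names are assumptions for illustration): a ridgeless, minimum-norm kernel interpolant whose bandwidth `sigma` plays the role of the scale, with the condition number of the Gram matrix showing how the scale controls stability.

```python
import numpy as np

# Illustrative sketch (not the paper's code): a ridgeless, minimum-norm
# kernel interpolant with a Gaussian kernel whose scale `sigma` is free,
# mirroring the scale-dependent estimators discussed in the abstract.

def gaussian_kernel(X, Z, sigma):
    """Gram matrix k(x, z) = exp(-||x - z||^2 / (2 sigma^2))."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def ridgeless_fit(X, y, sigma):
    """Minimum-norm coefficients solving K c = y (no ridge penalty)."""
    K = gaussian_kernel(X, X, sigma)
    return np.linalg.lstsq(K, y, rcond=None)[0]

def predict(X_train, coef, X_test, sigma):
    return gaussian_kernel(X_test, X_train, sigma) @ coef

# Toy 1-D data with additive noise.
rng = np.random.default_rng(0)
X = np.linspace(-1.0, 1.0, 15)[:, None]
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(15)

for sigma in (0.05, 0.5):
    K = gaussian_kernel(X, X, sigma)
    coef = ridgeless_fit(X, y, sigma)
    resid = np.abs(predict(X, coef, X, sigma) - y).max()
    # A small scale keeps K well conditioned (stable interpolation of the
    # noisy labels); a large scale drives the condition number up.
    print(f"sigma={sigma}: cond(K)={np.linalg.cond(K):.2e}, "
          f"max train residual={resid:.2e}")
```

Running the loop shows the trade-off the abstract analyzes: both scales interpolate (near-zero training residual), but the condition number of the Gram matrix, and hence the stability of the estimator, varies sharply with the scale.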
