Paper Title
Vector-Valued Least-Squares Regression under Output Regularity Assumptions
Paper Authors
Paper Abstract
We propose and analyse a reduced-rank method for solving least-squares regression problems with infinite-dimensional output. We derive learning bounds for our method and study under which settings statistical performance is improved in comparison to the full-rank method. Our analysis extends the interest of reduced-rank regression beyond the standard low-rank setting to more general output regularity assumptions. We illustrate our theoretical insights on synthetic least-squares problems. Then, we propose a surrogate structured prediction method derived from this reduced-rank method. We assess its benefits on three different problems: image reconstruction, multi-label classification, and metabolite identification.
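For readers unfamiliar with the baseline technique the abstract builds on, the following is a minimal sketch of classical reduced-rank least-squares regression on finite-dimensional synthetic data (rank-constrained multivariate least squares); it is not the paper's exact estimator, and all names, dimensions, and the ridge parameter below are illustrative assumptions.

```python
# Sketch of classical reduced-rank regression: fit full-rank least squares,
# then project the coefficient matrix onto a rank-r output subspace.
import numpy as np

def reduced_rank_regression(X, Y, rank, ridge=1e-8):
    """Approximately minimize ||Y - X B||_F^2 subject to rank(B) <= rank."""
    d = X.shape[1]
    # Full-rank (lightly ridge-regularised) least-squares solution.
    B_full = np.linalg.solve(X.T @ X + ridge * np.eye(d), X.T @ Y)
    # Top-`rank` right singular directions of the fitted outputs X @ B_full.
    _, _, Vt = np.linalg.svd(X @ B_full, full_matrices=False)
    P = Vt[:rank].T @ Vt[:rank]   # rank-r projector in output space
    return B_full @ P             # reduced-rank coefficient matrix

# Synthetic example with a rank-2 ground-truth coefficient matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
B_true = rng.normal(size=(10, 2)) @ rng.normal(size=(2, 30))
Y = X @ B_true + 0.1 * rng.normal(size=(200, 30))
B_hat = reduced_rank_regression(X, Y, rank=2)
print("relative error:", np.linalg.norm(B_hat - B_true) / np.linalg.norm(B_true))
```

The paper's contribution, per the abstract, is to analyse when such a rank reduction helps statistically under output regularity assumptions beyond the exactly low-rank case, and to extend the idea to infinite-dimensional outputs and surrogate structured prediction.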