Paper Title

Approximation Theory of Tree Tensor Networks: Tensorized Univariate Functions -- Part II

Authors

Mazen Ali and Anthony Nouy

Abstract

We study the approximation by tensor networks (TNs) of functions from classical smoothness classes. The considered approximation tool combines a tensorization of functions in $L^p([0,1))$, which allows one to identify a univariate function with a multivariate function (or tensor), and the use of tree tensor networks (the tensor train format) for exploiting low-rank structures of multivariate functions. The resulting tool can be interpreted as a feed-forward neural network, with the first layers implementing the tensorization, interpreted as a particular featuring step, followed by a sum-product network with a sparse architecture. In Part I of this work, we presented several approximation classes associated with different measures of complexity of tensor networks and studied their properties. In this work (Part II), we show how classical approximation tools, such as polynomials or splines (with fixed or free knots), can be encoded as tensor networks with controlled complexity. We use this to derive direct (Jackson) inequalities for the approximation spaces of tensor networks. This is then used to show that Besov spaces are continuously embedded into these approximation spaces; in other words, arbitrary Besov functions can be approximated at an optimal or near-optimal rate. We also show that an arbitrary function in the approximation class need not possess any Besov smoothness, unless one limits the depth of the tensor network.
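As a minimal sketch of the tensorization step, in the spirit of Part I (the base $b$, level $d$, and the symbols $T_{b,d}$, $A_k$, $g$, $r_k$ are illustrative notation, not quoted from the paper): a point $x \in [0,1)$ is split into its first $d$ base-$b$ digits $i_1,\dots,i_d$ and a remainder $\bar{x}$, which identifies a univariate function $f$ with a tensor of order $d+1$,

$$ x = \sum_{k=1}^{d} i_k\, b^{-k} + b^{-d}\,\bar{x}, \qquad i_k \in \{0,\dots,b-1\}, \quad \bar{x} \in [0,1), $$

$$ (T_{b,d} f)(i_1,\dots,i_d,\bar{x}) := f\Big(\sum_{k=1}^{d} i_k\, b^{-k} + b^{-d}\,\bar{x}\Big). $$

The tensor train format then represents this tensor through products of low-rank factors,

$$ (T_{b,d} f)(i_1,\dots,i_d,\bar{x}) = A_1(i_1)\, A_2(i_2) \cdots A_d(i_d)\, g(\bar{x}), \qquad A_k(i_k) \in \mathbb{R}^{r_{k-1} \times r_k},\ r_0 = 1,\ g(\bar{x}) \in \mathbb{R}^{r_d}, $$

and the ranks $(r_1,\dots,r_d)$, together with the depth $d$, enter the complexity measures for which the direct (Jackson) inequalities are stated.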
