Paper Title

Generic bounds on the approximation error for physics-informed (and) operator learning

Authors

Tim De Ryck, Siddhartha Mishra

Abstract

We propose a very general framework for deriving rigorous bounds on the approximation error for physics-informed neural networks (PINNs) and operator learning architectures such as DeepONets and FNOs as well as for physics-informed operator learning. These bounds guarantee that PINNs and (physics-informed) DeepONets or FNOs will efficiently approximate the underlying solution or solution operator of generic partial differential equations (PDEs). Our framework utilizes existing neural network approximation results to obtain bounds on more involved learning architectures for PDEs. We illustrate the general framework by deriving the first rigorous bounds on the approximation error of physics-informed operator learning and by showing that PINNs (and physics-informed DeepONets and FNOs) mitigate the curse of dimensionality in approximating nonlinear parabolic PDEs.
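As background to the abstract: the approximation-error bounds discussed above concern how well a neural network ansatz can drive the physics-informed (PDE residual) loss to zero. Below is a minimal NumPy sketch of that loss for the 1D model problem u''(x) = f(x) on [0, 1] with zero boundary values, using a tiny one-hidden-layer tanh network whose second derivative is computed analytically. The ansatz size, collocation points, and source term are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny PINN ansatz: u(x) = sum_k a_k * tanh(w_k * x + b_k)
K = 8
a = rng.normal(size=K)
w = rng.normal(size=K)
b = rng.normal(size=K)

def u(x):
    # network output at points x, shape (N,)
    z = np.outer(x, w) + b          # (N, K)
    return np.tanh(z) @ a

def u_xx(x):
    # second derivative in x, computed analytically:
    # d^2/dx^2 tanh(w x + b) = w^2 * (-2 tanh(z) (1 - tanh(z)^2))
    z = np.outer(x, w) + b
    t = np.tanh(z)
    return (-2.0 * t * (1.0 - t**2) * w**2) @ a

def pinn_loss(x_int, x_bdy, f):
    # physics-informed loss: mean-squared PDE residual for u'' = f
    # plus a penalty enforcing the boundary condition u = 0
    residual = u_xx(x_int) - f(x_int)
    return np.mean(residual**2) + np.mean(u(x_bdy)**2)

x_int = rng.uniform(0.0, 1.0, size=128)   # interior collocation points
x_bdy = np.array([0.0, 1.0])              # boundary points
f = lambda x: np.sin(np.pi * x)           # example source term

loss = pinn_loss(x_int, x_bdy, f)
```

The paper's bounds address exactly this quantity: showing that for suitable network sizes the minimum of such a loss (and hence the error to the true PDE solution or solution operator) can be made small, without the cost growing exponentially in the dimension.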
