Title
Incorporating Prior Knowledge into Neural Networks through an Implicit Composite Kernel
Authors
Abstract
It is challenging to guide neural network (NN) learning with prior knowledge. In contrast, many known properties, such as spatial smoothness or seasonality, are straightforward to model by choosing an appropriate kernel in a Gaussian process (GP). Many deep learning applications could be enhanced by modeling such known properties. For example, convolutional neural networks (CNNs) are frequently used in remote sensing, which is subject to strong seasonal effects. We propose to blend the strengths of deep learning and the clear modeling capabilities of GPs by using a composite kernel that combines a kernel implicitly defined by a neural network with a second kernel function chosen to model known properties (e.g., seasonality). We implement this idea by combining a deep network with an efficient mapping based on the Nyström approximation, which we call the Implicit Composite Kernel (ICK). We then adopt a sample-then-optimize approach to approximate the full GP posterior distribution. We demonstrate that ICK has superior performance and flexibility on both synthetic and real-world data sets. We believe the ICK framework can be used to incorporate prior information into neural networks in many applications.
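To make the Nyström-based mapping in the abstract concrete, below is a minimal numpy sketch of the standard Nyström feature map for a chosen kernel (an RBF kernel here, as a stand-in for a prior such as smoothness or seasonality). It is an illustration under our own assumptions, not the paper's implementation: the landmark points, lengthscale, and truncation threshold are all hypothetical choices. In an ICK-style model, such finite-dimensional features of the prior kernel would be combined (e.g., by inner product) with latent features produced by a neural network, so the resulting composite kernel reflects both the learned and the known structure.

```python
import numpy as np

def rbf(a, b, ls=1.0):
    # RBF (squared-exponential) kernel matrix between rows of a and b
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def nystrom_features(x, landmarks, ls=1.0):
    """Finite-dimensional feature map phi with phi(x) @ phi(y) ~= k(x, y).

    Uses the eigendecomposition of the landmark kernel matrix:
    phi(x) = k(x, Z) U diag(lambda)^{-1/2}, truncating tiny eigenvalues
    for numerical stability.
    """
    K_mm = rbf(landmarks, landmarks, ls)
    vals, vecs = np.linalg.eigh(K_mm)
    keep = vals > 1e-8 * vals.max()          # drop near-null directions
    vals, vecs = vals[keep], vecs[:, keep]
    return rbf(x, landmarks, ls) @ vecs / np.sqrt(vals)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 1))                  # query points
z = rng.normal(size=(50, 1))                 # hypothetical landmark points
phi = nystrom_features(x, z)
K_approx = phi @ phi.T                       # Nystrom approximation of k(x, x)
K_exact = rbf(x, x)
print(np.abs(K_approx - K_exact).max())      # approximation error is small
```

Because the map is deterministic and differentiable in `x`, it can sit alongside a neural network branch and be trained end to end, which is what makes this style of approximation attractive for composite kernels.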