Paper title
Sequential Changepoint Detection in Neural Networks with Checkpoints
Paper authors
Paper abstract
We introduce a framework for online changepoint detection and simultaneous model learning which is applicable to highly parametrized models, such as deep neural networks. It is based on detecting changepoints across time by sequentially performing generalized likelihood ratio tests that require only evaluations of simple prediction score functions. This procedure makes use of checkpoints, consisting of early versions of the actual model parameters, that allow distributional changes to be detected by performing predictions on future data. We define an algorithm that bounds the Type I error in the sequential testing procedure. We demonstrate the efficiency of our method in challenging continual learning applications with unknown task changepoints, and show improved performance compared to online Bayesian changepoint detection.
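As a rough illustration of the general idea (not the paper's exact procedure), the sketch below accumulates a CUSUM-style statistic from per-observation prediction scores of the data under a stored checkpoint versus the current model; a sustained positive drift suggests the data distribution has moved away from what the checkpoint captured. The function name, score inputs, and fixed threshold are illustrative assumptions; the paper instead calibrates its sequential generalized likelihood ratio test so that the Type I error is bounded.

```python
import numpy as np


def cusum_score_test(scores_checkpoint, scores_current, threshold=15.0):
    """Sketch of a checkpoint-based sequential score test (simplified, assumed form).

    scores_checkpoint : per-observation prediction scores (e.g. log-likelihoods)
        of incoming data under a stored checkpoint of the model parameters.
    scores_current    : per-observation prediction scores under the current model.
    threshold         : detection threshold; here a user-supplied constant, whereas
        the paper's algorithm chooses it to bound the Type I error.

    Returns the index at which a change is declared, or None if no detection.
    """
    stat = 0.0
    for t, (s_old, s_new) in enumerate(zip(scores_checkpoint, scores_current)):
        # Accumulate the score difference between current model and checkpoint,
        # resetting at zero (CUSUM-style); persistent positive drift indicates
        # that the checkpoint no longer explains the data well.
        stat = max(0.0, stat + (s_new - s_old))
        if stat > threshold:
            return t
    return None


if __name__ == "__main__":
    # Toy usage: after index 150 the current model scores the data noticeably
    # better than the checkpoint, mimicking a distribution change.
    rng = np.random.default_rng(0)
    s_ckpt = rng.normal(0.0, 1.0, 300)
    s_curr = np.concatenate([rng.normal(0.0, 1.0, 150),
                             rng.normal(0.7, 1.0, 150)])
    print(cusum_score_test(s_ckpt, s_curr, threshold=15.0))
```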