Paper Title

BayesAdapter: Being Bayesian, Inexpensively and Reliably, via Bayesian Fine-tuning

Paper Authors

Zhijie Deng, Jun Zhu

Abstract

Despite their theoretical appeal, Bayesian neural networks (BNNs) lag behind in real-world adoption, mainly due to persistent concerns about their scalability, accessibility, and reliability. In this work, we develop the BayesAdapter framework to relieve these concerns. In particular, we propose to adapt pre-trained deterministic NNs into variational BNNs via cost-effective Bayesian fine-tuning. Technically, we develop a modularized implementation for learning variational BNNs, and refurbish the generally applicable exemplar reparameterization trick through exemplar parallelization to efficiently reduce the gradient variance in stochastic variational inference. Based on this lightweight Bayesian learning paradigm, we conduct extensive experiments on a variety of benchmarks and show that our method consistently induces higher-quality posteriors than competitive baselines while significantly reducing training overhead. Code is available at https://github.com/thudzj/ScalableBDL.
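To make the core idea concrete, the sketch below illustrates exemplar reparameterization in a mean-field Gaussian variational linear layer: each exemplar in a batch gets its own independent weight sample, and all samples are applied in parallel with a single batched contraction. This is a minimal illustration under our own assumptions, not the authors' ScalableBDL implementation; the class name `BayesLinear`, the softplus parameterization, and the initialization constants are hypothetical choices for exposition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    """Mean-field Gaussian variational linear layer (bias omitted for
    brevity) with per-exemplar weight sampling, parallelized over the
    batch. Illustrative sketch only, not the ScalableBDL code."""

    def __init__(self, in_features: int, out_features: int):
        super().__init__()
        # Variational posterior q(W) = N(mu, softplus(rho)^2), elementwise.
        self.weight_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.weight_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, in_features)
        batch = x.size(0)
        std = F.softplus(self.weight_rho)
        # Exemplar reparameterization: draw an independent weight sample
        # for every exemplar in the batch, reducing gradient variance
        # relative to sharing one weight sample across the batch.
        eps = torch.randn(batch, *self.weight_mu.shape, device=x.device)
        weight = self.weight_mu + std * eps  # (batch, out, in)
        # Apply all per-exemplar weights in parallel (exemplar parallelization).
        return torch.einsum('boi,bi->bo', weight, x)
```

Under this reading of the abstract, Bayesian fine-tuning would amount to copying a pre-trained deterministic layer's weights into `weight_mu` and then training the variational parameters with the usual stochastic variational objective, rather than learning the BNN from scratch.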
