Paper Title

Prune and Tune Ensembles: Low-Cost Ensemble Learning With Sparse Independent Subnetworks

Authors

Tim Whitaker, Darrell Whitley

Abstract

Ensemble learning is an effective method for improving generalization in machine learning. However, as state-of-the-art neural networks grow larger, the computational cost associated with training several independent networks becomes expensive. We introduce a fast, low-cost method for creating diverse ensembles of neural networks without needing to train multiple models from scratch. We do this by first training a single parent network. We then create child networks by cloning the parent and dramatically pruning the parameters of each child, yielding an ensemble of members with unique and diverse topologies. Each child network is then briefly trained for a small number of epochs, converging significantly faster than a model trained from scratch. We explore various ways to maximize diversity in the child networks, including the use of anti-random pruning and one-cycle tuning. This diversity enables "Prune and Tune" ensembles to achieve results competitive with traditional ensembles at a fraction of the training cost. We benchmark our approach against state-of-the-art low-cost ensemble methods and show marked improvements in both accuracy and uncertainty estimation on CIFAR-10 and CIFAR-100.
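The abstract outlines a four-step recipe: train one parent network, clone it, prune each clone (pairing each random mask with its "anti-random" complement so siblings share no surviving weights), then briefly tune each child with a one-cycle schedule before averaging predictions. Below is a minimal PyTorch sketch of that recipe under stated assumptions: the helper names, the 50% unstructured prune fraction, and the use of gradient hooks to keep pruned weights at zero are illustrative choices of ours, not the authors' implementation; `OneCycleLR` is PyTorch's built-in one-cycle learning-rate schedule.

```python
import copy
import torch
import torch.nn.functional as F

def train(model, loader, epochs, max_lr=0.1, one_cycle=False):
    """Plain SGD training; optionally with PyTorch's one-cycle LR schedule."""
    opt = torch.optim.SGD(model.parameters(), lr=max_lr, momentum=0.9)
    sched = (torch.optim.lr_scheduler.OneCycleLR(
                 opt, max_lr=max_lr, total_steps=epochs * len(loader))
             if one_cycle else None)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            F.cross_entropy(model(x), y).backward()
            opt.step()
            if sched is not None:
                sched.step()

def random_masks(model, prune_frac=0.5):
    """One random binary mask per weight tensor (biases left dense);
    1 = keep, 0 = prune."""
    return {name: (torch.rand_like(p) >= prune_frac).float()
            for name, p in model.named_parameters() if p.dim() > 1}

def apply_mask(model, masks):
    """Zero the pruned weights and mask their gradients so brief tuning
    only updates the surviving subnetwork."""
    for name, p in model.named_parameters():
        if name in masks:
            p.data.mul_(masks[name])
            p.register_hook(lambda g, m=masks[name]: g * m)

def make_children(parent, n_pairs, prune_frac=0.5):
    """Clone the parent into pairs of children; the second child of each
    pair keeps exactly the weights its sibling pruned (anti-random)."""
    children = []
    for _ in range(n_pairs):
        masks = random_masks(parent, prune_frac)
        for m in (masks, {k: 1.0 - v for k, v in masks.items()}):
            child = copy.deepcopy(parent)
            apply_mask(child, m)
            children.append(child)
    return children

def ensemble_predict(children, x):
    """Average the softmax outputs of all tuned children."""
    with torch.no_grad():
        probs = [F.softmax(c.eval()(x), dim=-1) for c in children]
    return torch.stack(probs).mean(dim=0)
```

With these pieces, the full pipeline is: `train(parent, loader, epochs=many)` once, `children = make_children(parent, n_pairs=3)` to get six sparse subnetworks, `train(child, loader, epochs=few, one_cycle=True)` for each child, and `ensemble_predict(children, x)` at test time. Masking gradients rather than storing weights in a sparse format is a common simplification: it preserves the ensemble logic, but not the compute savings that true sparsity could offer.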
