Paper Title

Self-Paced Imbalance Rectification for Class Incremental Learning

Paper Authors

Zhiheng Liu, Kai Zhu, Yang Cao

Paper Abstract

Exemplar-based class-incremental learning aims to recognize new classes while not forgetting old ones, whose samples can only be saved in limited memory. The ratio of new samples to old exemplars fluctuates with the memory capacity available in different environments, which makes it challenging to stabilize the incremental optimization process. To address this problem, we propose a novel self-paced imbalance rectification scheme, which dynamically maintains the incremental balance during the representation learning phase. Specifically, our proposed scheme consists of a frequency compensation strategy that adjusts the logit margin between old and new classes according to the corresponding number ratio to strengthen the expression ability of the old classes, and an inheritance transfer strategy that reduces representation confusion by estimating the similarity of different classes in the old embedding space. Furthermore, a chronological attenuation mechanism is proposed to mitigate the repetitive optimization of the older classes over multiple step-wise increments. Extensive experiments on three benchmarks demonstrate stable incremental performance, significantly outperforming the state-of-the-art methods.
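The frequency compensation idea (shifting the logit margin between old and new classes by their sample-number ratio) resembles a balanced-softmax-style logit adjustment. The sketch below is only an assumption-based illustration of that general technique, not the paper's actual formulation; the function name `frequency_compensated_loss` and the parameters `class_counts` and `tau` are hypothetical.

```python
import torch
import torch.nn.functional as F

def frequency_compensated_loss(logits, targets, class_counts, tau=1.0):
    """Cross-entropy with a per-class logit offset proportional to the log
    of each class's sample frequency (balanced-softmax-style sketch).

    logits:       (batch, num_classes) raw classifier scores
    targets:      (batch,) integer class labels
    class_counts: (num_classes,) samples currently available per class
                  (old exemplars in memory + new-class samples)
    tau:          temperature controlling the strength of the margin
    """
    counts = class_counts.float()
    prior = counts / counts.sum()
    # Classes with few stored exemplars receive a smaller (more negative)
    # offset subtracted implicitly: abundant new classes are pushed down
    # relative to scarce old classes, enlarging the old classes' margin.
    adjusted = logits + tau * torch.log(prior + 1e-12)
    return F.cross_entropy(adjusted, targets)
```

In an incremental setting, `class_counts` at step t would presumably combine the per-class exemplar budget of the old classes with the full counts of the newly observed classes, so the offset tracks the memory-induced imbalance the abstract describes.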
