Paper Title


Nested Collaborative Learning for Long-Tailed Visual Recognition

Authors

Jun Li, Zichang Tan, Jun Wan, Zhen Lei, Guodong Guo

Abstract


Networks trained on long-tailed datasets vary remarkably despite identical training settings, which reveals great uncertainty in long-tailed learning. To alleviate this uncertainty, we propose Nested Collaborative Learning (NCL), which tackles the problem by collaboratively learning multiple experts together. NCL consists of two core components, namely Nested Individual Learning (NIL) and Nested Balanced Online Distillation (NBOD), which focus on the individual supervised learning of each single expert and the knowledge transfer among multiple experts, respectively. To learn representations more thoroughly, both NIL and NBOD are formulated in a nested way, in which learning is conducted not just on all categories from a full perspective but also on some hard categories from a partial perspective. For the learning in the partial perspective, we specifically select the negative categories with high predicted scores as the hard categories, using the proposed Hard Category Mining (HCM). In NCL, the learning from the two perspectives is nested, highly related, and complementary, and it helps the network capture not only global and robust features but also meticulous distinguishing ability. Moreover, self-supervision is further utilized for feature enhancement. Extensive experiments demonstrate the superiority of our method, which outperforms the state of the art whether using a single model or an ensemble.
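The abstract describes HCM as selecting, for each sample, the negative categories (i.e., all but the ground-truth class) with the highest predicted scores as hard categories. A minimal sketch of that selection step is below; the function name, the NumPy formulation, and the fixed `k` are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def hard_category_mining(logits, labels, k):
    """Sketch of HCM: for each sample, pick the k negative categories
    with the highest predicted scores as the hard categories.

    logits: (N, C) array of per-class scores
    labels: (N,) array of ground-truth class indices
    returns: (N, k) array of hard (negative) category indices
    """
    scores = logits.copy()
    # Mask out each sample's positive (ground-truth) category so that
    # only negative categories can be selected.
    scores[np.arange(len(labels)), labels] = -np.inf
    # The k highest-scoring remaining categories are the hard negatives,
    # ordered by descending score.
    return np.argsort(-scores, axis=1)[:, :k]

logits = np.array([[0.1, 0.9, 0.5, 0.3]])
labels = np.array([1])
print(hard_category_mining(logits, labels, k=2))  # → [[2 3]]
```

These hard-category indices would then define the "partial perspective": the nested losses in NIL and NBOD are computed over this subset of classes in addition to the full set.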
