Paper Title


Adversarial Contrastive Self-Supervised Learning

Paper Authors

Zhu, Wentao; Shang, Hang; Lv, Tingxun; Liao, Chao; Yang, Sen; Liu, Ji

Abstract


Recently, learning from vast unlabeled data, especially self-supervised learning, has been emerging and has attracted widespread attention. Self-supervised learning followed by supervised fine-tuning on a few labeled examples can significantly improve label efficiency and outperform standard supervised training that uses fully annotated data. In this work, we present a novel self-supervised deep learning paradigm based on online hard negative pair mining. Specifically, we design a student-teacher network to generate multiple views of the data for self-supervised learning and integrate hard negative pair mining into the training. We then derive a new triplet-like loss that considers both positive sample pairs and mined hard negative sample pairs. Extensive experiments on ILSVRC-2012 demonstrate the effectiveness of the proposed method and its components.
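The triplet-like loss described above could be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the function name, the margin value, and the use of cosine similarity with diagonal positive pairs are all assumptions. Each anchor from the student view is paired with its matching teacher-view embedding as the positive, and the most similar non-matching teacher embedding is mined online as the hard negative.

```python
import numpy as np

def triplet_loss_with_hard_negatives(student, teacher, margin=0.5):
    """Triplet-like loss over a batch of paired view embeddings.

    Positives are matching row indices across the two views; the hard
    negative for each anchor is the most similar non-matching teacher
    embedding (online hard negative pair mining).
    """
    # L2-normalize so dot products are cosine similarities.
    s = student / np.linalg.norm(student, axis=1, keepdims=True)
    t = teacher / np.linalg.norm(teacher, axis=1, keepdims=True)
    sim = s @ t.T                            # (B, B) similarity matrix
    pos = np.diag(sim)                       # positive-pair similarities
    masked = sim - 2.0 * np.eye(len(sim))    # exclude positives from mining
    hard_neg = masked.max(axis=1)            # hardest negative per anchor
    # Hinge: push positives above hard negatives by at least `margin`.
    return np.maximum(0.0, margin - pos + hard_neg).mean()
```

With perfectly aligned, mutually orthogonal view embeddings, every positive similarity is 1 and every mined negative similarity is 0, so the hinge is inactive and the loss is zero.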
