Paper Title

Rethinking the Role of Pre-Trained Networks in Source-Free Domain Adaptation

Paper Authors

Wenyu Zhang, Li Shen, Chuan-Sheng Foo

Paper Abstract

Source-free domain adaptation (SFDA) aims to adapt a source model trained on a fully-labeled source domain to an unlabeled target domain. Large-data pre-trained networks are used to initialize source models during source training, and subsequently discarded. However, source training can cause the model to overfit to the source data distribution and lose applicable target domain knowledge. We propose to integrate the pre-trained network into the target adaptation process as it has diversified features important for generalization and provides an alternate view of features and classification decisions different from the source model. We propose to distil useful target domain information through a co-learning strategy to improve target pseudolabel quality for fine-tuning the source model. Evaluation on 4 benchmark datasets shows that our proposed strategy improves adaptation performance and can be successfully integrated with existing SFDA methods. Leveraging modern pre-trained networks that have stronger representation learning ability in the co-learning strategy further boosts performance.
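To make the co-learning idea concrete, below is a minimal sketch of how pseudo-labels could be produced from the two views and used to fine-tune the source model. It assumes PyTorch, that both networks output logits over the same label space, and that the two views are fused by simple probability averaging with a confidence threshold; the function names, the averaging rule, and the threshold are illustrative assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only; the paper's actual co-learning objective may differ.
import torch
import torch.nn.functional as F

@torch.no_grad()
def co_learning_pseudo_labels(source_model, pretrained_model, images, threshold=0.9):
    """Fuse predictions from the adapted source model and the (frozen) pre-trained
    network; keep a pseudo-label only when the averaged prediction is confident."""
    source_model.eval()
    pretrained_model.eval()
    p_src = F.softmax(source_model(images), dim=1)
    p_pre = F.softmax(pretrained_model(images), dim=1)
    p_avg = 0.5 * (p_src + p_pre)      # combine the two alternate views
    conf, pseudo = p_avg.max(dim=1)
    mask = conf >= threshold           # discard low-confidence pseudo-labels
    return pseudo, mask

def finetune_step(source_model, optimizer, images, pseudo, mask):
    """One fine-tuning step of the source model on the retained pseudo-labels."""
    source_model.train()
    logits = source_model(images)
    if mask.any():
        loss = F.cross_entropy(logits[mask], pseudo[mask])
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

In this sketch the pre-trained network stays frozen and only supplies an alternate view for pseudo-labeling, while the source model is the one being fine-tuned on the target data.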
