Paper Title

Domain Adaptation via Bidirectional Cross-Attention Transformer

Paper Authors

Xiyu Wang, Pengxin Guo, Yu Zhang

Paper Abstract

Domain Adaptation (DA) aims to transfer the knowledge learned from a source domain with ample labeled data to a target domain with only unlabeled data. Most existing studies on DA learn domain-invariant feature representations for both domains by minimizing the domain gap with convolution-based neural networks. Recently, vision transformers have significantly improved performance in multiple vision tasks. Built on vision transformers, in this paper we propose a Bidirectional Cross-Attention Transformer (BCAT) for DA to improve performance. In the proposed BCAT, the attention mechanism extracts implicit source and target mixup feature representations to narrow the domain discrepancy. Specifically, in BCAT, we design a weight-sharing quadruple-branch transformer with a bidirectional cross-attention mechanism to learn domain-invariant feature representations. Extensive experiments demonstrate that the proposed BCAT model achieves superior performance on four benchmark datasets over existing state-of-the-art DA methods based on convolutions or transformers.
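To make the core idea concrete, below is a minimal sketch of the bidirectional cross-attention described in the abstract: queries from one domain attend to keys and values from the other domain, in both directions, through a single shared set of projection weights. This is an illustrative reconstruction, not the authors' implementation; it assumes standard single-head scaled dot-product attention, and all names here (`BidirectionalCrossAttention`, `d_model`, `attend`) are hypothetical.

```python
# Illustrative sketch of bidirectional cross-attention with shared weights;
# assumes single-head attention. Names are hypothetical, not from BCAT's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BidirectionalCrossAttention(nn.Module):
    def __init__(self, d_model: int):
        super().__init__()
        # One set of Q/K/V projections used by both attention directions,
        # mirroring the weight-sharing design mentioned in the abstract.
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def attend(self, x_q: torch.Tensor, x_kv: torch.Tensor) -> torch.Tensor:
        # Queries from one domain attend to keys/values from the other,
        # yielding an implicit mixup of source and target features.
        scores = self.q(x_q) @ self.k(x_kv).transpose(-2, -1) * self.scale
        return F.softmax(scores, dim=-1) @ self.v(x_kv)

    def forward(self, src_tokens: torch.Tensor, tgt_tokens: torch.Tensor):
        src_mix = self.attend(src_tokens, tgt_tokens)  # source -> target
        tgt_mix = self.attend(tgt_tokens, src_tokens)  # target -> source
        return src_mix, tgt_mix

# Example: mix two batches of 196 tokens (e.g., 14x14 patches) of width 768.
bca = BidirectionalCrossAttention(d_model=768)
src = torch.randn(2, 196, 768)
tgt = torch.randn(2, 196, 768)
src_mix, tgt_mix = bca(src, tgt)  # each of shape (2, 196, 768)
```

In the full model, such cross-attention branches would run alongside ordinary self-attention branches (the quadruple-branch design), so that the mixed features can pull the two domains' representations toward each other while per-domain features are preserved.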
