Paper Title
End-to-End Object Detection with Adaptive Clustering Transformer
Paper Authors
Paper Abstract
End-to-End Object Detection with Transformer (DETR) proposes to perform object detection with a Transformer and achieves performance comparable to two-stage object detectors such as Faster R-CNN. However, DETR needs huge computational resources for training and inference due to its high-resolution spatial input. In this paper, a novel Transformer variant named Adaptive Clustering Transformer (ACT) is proposed to reduce the computation cost for high-resolution input. ACT clusters the query features adaptively using Locality Sensitive Hashing (LSH) and approximates the query-key interaction with a prototype-key interaction. ACT reduces the quadratic O(N^2) complexity inside self-attention to O(NK), where K is the number of prototypes in each layer. ACT can serve as a drop-in replacement for the original self-attention module without any retraining. ACT achieves a good balance between accuracy and computation cost (FLOPs). The code is available as supplementary material for ease of experiment replication and verification. Code is released at \url{https://github.com/gaopengcuhk/SMCA-DETR/}
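To make the complexity reduction concrete, the following is a minimal NumPy sketch of the idea the abstract describes: queries are bucketed by hyperplane LSH, each bucket is summarized by a mean prototype, attention is computed between the K prototypes and the keys, and each prototype's output is broadcast back to its member queries. All names (`lsh_prototype_attention`, `num_bits`, the mean-prototype choice, single-head softmax attention) are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

def lsh_prototype_attention(Q, K, V, num_bits=8, seed=0):
    """Sketch of ACT-style attention: O(K_proto * N) prototype-key
    interaction instead of the O(N^2) query-key interaction.
    Assumptions: hyperplane LSH, mean prototypes, softmax attention."""
    rng = np.random.default_rng(seed)
    d = Q.shape[1]
    # Hyperplane LSH: the sign pattern of random projections gives
    # each query an integer bucket id.
    planes = rng.standard_normal((d, num_bits))
    codes = (Q @ planes > 0) @ (1 << np.arange(num_bits))
    # Cluster queries by bucket; one prototype (mean query) per bucket.
    buckets, inverse = np.unique(codes, return_inverse=True)
    P = np.stack([Q[inverse == i].mean(axis=0) for i in range(len(buckets))])
    # Prototype-key attention (K_proto x N logits, not N x N).
    logits = P @ K.T / np.sqrt(d)
    attn = np.exp(logits - logits.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)
    out_proto = attn @ V
    # Broadcast each prototype's output back to its member queries.
    return out_proto[inverse]

N, d = 64, 32
rng = np.random.default_rng(1)
Q = rng.standard_normal((N, d))
K = rng.standard_normal((N, d))
V = rng.standard_normal((N, d))
out = lsh_prototype_attention(Q, K, V)
print(out.shape)  # one output row per query, as in standard attention
```

Because nearby queries tend to share a bucket, the number of prototypes adapts to the input: redundant high-resolution queries collapse into few prototypes, which is why the approximation can replace trained self-attention without retraining.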