Paper Title
Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation
Authors
Abstract
In this paper, to remedy this deficiency, we propose a Linear Attention Mechanism that approximates dot-product attention with much lower memory and computational costs. The efficient design makes the incorporation of attention mechanisms into neural networks more flexible and versatile. Experiments conducted on semantic segmentation demonstrate the effectiveness of the linear attention mechanism. Code is available at https://github.com/lironui/Linear-Attention-Mechanism.
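The core idea the abstract describes — approximating dot-product attention at much lower memory cost — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a generic positive feature map (`elu(x) + 1`, one common choice in the linear-attention literature) rather than the specific approximation from the paper, and the function names are hypothetical. The saving comes from associativity: computing `Q'(K'ᵀV)` avoids materializing the N×N attention matrix that `(Q'K'ᵀ)V` requires.

```python
import numpy as np

def dot_product_attention(Q, K, V):
    # Standard attention: softmax(QK^T / sqrt(d)) V.
    # The N x N score matrix costs O(N^2) memory in sequence length N.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, eps=1e-6):
    # Linear-attention sketch: replace softmax with a positive feature
    # map phi (here elu(x) + 1), then reorder the products so the
    # sequence dimension is contracted first, giving O(N) memory.
    phi = lambda x: np.where(x > 0, x + 1.0, np.exp(x))  # elu(x) + 1
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                    # (d, d_v): no N x N matrix formed
    Z = Qp @ Kp.sum(axis=0) + eps    # (N,): per-query normalizer
    return (Qp @ KV) / Z[:, None]

rng = np.random.default_rng(0)
N, d = 128, 16
Q, K, V = rng.normal(size=(3, N, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (128, 16), same shape as dot-product attention
```

Both functions return an (N, d) output, but the linear variant never allocates the (N, N) weight matrix, which is what makes attention affordable on the large feature maps typical of semantic segmentation.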