Paper Title

Deeper or Wider Networks of Point Clouds with Self-attention?

Authors

Haoxi Ran, Li Lu

Abstract

The prevalence of deeper networks driven by self-attention stands in stark contrast to the underexplored state of point-based methods. In this paper, we propose groupwise self-attention as the basic block for constructing our network, SepNet. The proposed module effectively captures both local and global dependencies: it computes the features of a group as the summation of the weighted features of every point within that group. For convenience, we generalize groupwise operations to assemble this module. To further strengthen our networks, we deepen and widen SepNet for the tasks of segmentation and classification, respectively, and verify the practicality of each variant. Specifically, SepNet achieves state-of-the-art results for classification and segmentation on most datasets. We present empirical evidence that SepNet gains extra accuracy in classification from increased width, and in segmentation from increased depth.
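The abstract does not give the module's exact equations, but the core idea it states (a group's features computed as a weighted sum of the features of the points within that group) can be illustrated as standard scaled dot-product attention applied independently per group. The sketch below is an assumption-laden toy, not the paper's implementation: the function name, the randomly drawn projection matrices, and the fixed group shape are all invented for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def groupwise_self_attention(points, d_out, rng=None):
    """Toy groupwise self-attention over point groups.

    points: (G, K, C) array -- G groups of K points with C channels each.
    Returns (G, K, d_out): each point's output is a weighted sum of the
    (projected) features of all K points in its own group, with weights
    from softmax-normalized dot products. Groups never attend to each
    other, which is the "groupwise" restriction.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    G, K, C = points.shape
    # Hypothetical learned projections, drawn randomly for illustration.
    Wq = rng.standard_normal((C, d_out))
    Wk = rng.standard_normal((C, d_out))
    Wv = rng.standard_normal((C, d_out))
    q, k, v = points @ Wq, points @ Wk, points @ Wv      # each (G, K, d_out)
    # (G, K, K) attention weights within each group, scaled dot-product.
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(d_out))
    return attn @ v  # weighted sum over the points of each group
```

Because the attention is computed per group, local structure dominates each output; stacking such blocks (or enlarging the groups) is what would propagate information more globally.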
