Paper Title

Global Hierarchical Attention for 3D Point Cloud Analysis

Paper Authors

Dan Jia, Alexander Hermans, Bastian Leibe

Paper Abstract

We propose a new attention mechanism, called Global Hierarchical Attention (GHA), for 3D point cloud analysis. GHA approximates the regular global dot-product attention via a series of coarsening and interpolation operations over multiple hierarchy levels. The advantage of GHA is two-fold. First, it has linear complexity with respect to the number of points, enabling the processing of large point clouds. Second, GHA inherently possesses the inductive bias to focus on spatially close points, while retaining the global connectivity among all points. Combined with a feedforward network, GHA can be inserted into many existing network architectures. We experiment with multiple baseline networks and show that adding GHA consistently improves performance across different tasks and datasets. For the task of semantic segmentation, GHA gives a +1.7% mIoU increase to the MinkowskiEngine baseline on ScanNet. For the 3D object detection task, GHA improves the CenterPoint baseline by +0.5% mAP on the nuScenes dataset, and the 3DETR baseline by +2.1% mAP25 and +1.5% mAP50 on ScanNet.
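The abstract describes GHA only at a high level. As a rough illustration of why attending to a coarsened point set gives linear rather than quadratic cost, below is a minimal PyTorch sketch, not the authors' implementation: it uses a single grid-coarsening level and lets every point attend to the coarse points before a feedforward network, whereas the paper's GHA uses a series of coarsening and interpolation operations over multiple hierarchy levels. All names (grid_coarsen, CoarseAttentionBlock, voxel_size) and hyperparameters are illustrative assumptions.

```python
# Conceptual sketch only (NOT the authors' GHA implementation): every point
# attends to a coarsened set of M << N points, so the attention cost is O(N*M)
# instead of the O(N^2) of full global dot-product attention.
import torch
import torch.nn as nn


def grid_coarsen(xyz, feats, voxel_size):
    """One coarsening level: average points that fall into the same voxel.

    xyz:   (N, 3) point coordinates
    feats: (N, C) point features
    Returns (M, 3) coarse coordinates and (M, C) coarse features, M <= N.
    """
    keys = torch.floor(xyz / voxel_size).long()                 # (N, 3) voxel indices
    _, inverse = torch.unique(keys, dim=0, return_inverse=True)  # voxel id per point
    M = int(inverse.max().item()) + 1
    counts = torch.zeros(M, 1).index_add_(0, inverse, torch.ones(len(xyz), 1))
    coarse_xyz = torch.zeros(M, 3).index_add_(0, inverse, xyz) / counts
    coarse_feats = torch.zeros(M, feats.shape[1]).index_add_(0, inverse, feats) / counts
    return coarse_xyz, coarse_feats


class CoarseAttentionBlock(nn.Module):
    """All N points attend to the M coarse points, followed by a feedforward
    network; both sub-layers use residual connections, so the block can be
    dropped into an existing point-cloud backbone."""

    def __init__(self, dim, num_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)
        self.ffn = nn.Sequential(nn.Linear(dim, 2 * dim), nn.GELU(), nn.Linear(2 * dim, dim))

    def forward(self, feats, coarse_feats):
        q = self.norm1(feats).unsqueeze(0)           # (1, N, C) queries: all points
        kv = coarse_feats.unsqueeze(0)               # (1, M, C) keys/values: coarse points
        attn_out, _ = self.attn(q, kv, kv)
        feats = feats + attn_out.squeeze(0)          # attention residual
        feats = feats + self.ffn(self.norm2(feats))  # feedforward residual
        return feats


if __name__ == "__main__":
    N, C = 2048, 64
    xyz = torch.rand(N, 3) * 4.0        # toy point cloud in a 4m cube
    feats = torch.randn(N, C)
    coarse_xyz, coarse_feats = grid_coarsen(xyz, feats, voxel_size=0.5)
    out = CoarseAttentionBlock(C)(feats, coarse_feats)
    print(out.shape, coarse_xyz.shape)  # torch.Size([2048, 64]), (M, 3)
```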
