Paper Title

3D Human Mesh Regression with Dense Correspondence

Authors

Wang Zeng, Wanli Ouyang, Ping Luo, Wentao Liu, Xiaogang Wang

Abstract

Estimating the 3D mesh of the human body from a single 2D image is an important task with many applications, such as augmented reality and human-robot interaction. However, prior works reconstruct the 3D mesh from a global image feature extracted by a convolutional neural network (CNN), in which the dense correspondences between the mesh surface and the image pixels are missing, leading to suboptimal solutions. This paper proposes a model-free 3D human mesh estimation framework, named DecoMR, which explicitly establishes dense correspondence between the mesh and local image features in the UV space (i.e., a 2D space used for the texture mapping of 3D meshes). DecoMR first predicts a pixel-to-surface dense correspondence map (i.e., an IUV image), with which we transfer local features from the image space to the UV space. The transferred local image features are then processed in the UV space to regress a location map, which is well aligned with the transferred features. Finally, we reconstruct the 3D human mesh from the regressed location map with a predefined mapping function. We also observe that the existing discontinuous UV map is unfriendly to network learning. Therefore, we propose a novel UV map that maintains most of the neighboring relations on the original mesh surface. Experiments demonstrate that our proposed local feature alignment and continuous UV map outperform existing 3D-mesh-based methods on multiple public benchmarks. Code will be made available at https://github.com/zengwang430521/DecoMR
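
The abstract describes a data flow with three stages: transfer local image features into UV space using the predicted IUV image, regress a location map in UV space, and read vertex positions from that map via a predefined mapping. The sketch below is a minimal NumPy illustration of that data flow, not the authors' implementation; the array shapes, the linear projection standing in for the UV-space network, and the per-vertex UV table `vert_uv` are assumptions made here for clarity.

```python
# Minimal sketch of the DecoMR-style pipeline (assumed shapes, random placeholders).
import numpy as np

H, W, C = 64, 64, 32          # image feature map size and channels (assumed)
UV = 128                      # UV-space map resolution (assumed)
V = 6890                      # number of mesh vertices (SMPL topology)

feat = np.random.rand(H, W, C)      # local CNN features in image space
iuv = np.random.rand(H, W, 3)       # predicted IUV image: foreground score + (u, v) per pixel
vert_uv = np.random.rand(V, 2)      # predefined UV coordinate of every mesh vertex (assumed)

# 1) Transfer local features from image space to UV space: each foreground pixel
#    writes its feature vector into the UV cell it corresponds to.
uv_feat = np.zeros((UV, UV, C))
mask = iuv[..., 0] > 0.5
u = np.clip((iuv[..., 1] * (UV - 1)).astype(int), 0, UV - 1)
v = np.clip((iuv[..., 2] * (UV - 1)).astype(int), 0, UV - 1)
uv_feat[v[mask], u[mask]] = feat[mask]

# 2) A UV-space network regresses a location map (per-texel 3D coordinates).
#    A random linear projection stands in for that network here.
W_proj = np.random.rand(C, 3)
location_map = uv_feat @ W_proj                    # (UV, UV, 3)

# 3) Reconstruct the mesh by sampling the location map at each vertex's
#    predefined UV coordinate (nearest-neighbour lookup for brevity).
vu = np.clip((vert_uv[:, 0] * (UV - 1)).astype(int), 0, UV - 1)
vv = np.clip((vert_uv[:, 1] * (UV - 1)).astype(int), 0, UV - 1)
vertices = location_map[vv, vu]                    # (V, 3) reconstructed vertex positions
print(vertices.shape)
```

In the actual method, the feature transfer and the location-map lookup would presumably use differentiable (e.g., bilinear) sampling and learned networks rather than the nearest-neighbour scatter and random projection used above.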
