Paper Title


Ego2HandsPose: A Dataset for Egocentric Two-hand 3D Global Pose Estimation

Paper Authors

Fanqing Lin, Tony Martinez

Paper Abstract


Color-based two-hand 3D pose estimation in the global coordinate system is essential in many applications. However, very few datasets are dedicated to this task, and no existing dataset supports estimation in a non-laboratory environment. This is largely attributed to the sophisticated data collection process required for 3D hand pose annotations, which also makes it difficult to obtain instances with the level of visual diversity needed for estimation in the wild. Progressing towards this goal, a large-scale dataset, Ego2Hands, was recently proposed to address the task of two-hand segmentation and detection in the wild. Its composition-based data generation technique can create two-hand instances with the quality, quantity, and diversity needed to generalize well to unseen domains. In this work, we present Ego2HandsPose, an extension of Ego2Hands that contains 3D hand pose annotations and is the first dataset that enables color-based two-hand 3D tracking in unseen domains. To this end, we develop a set of parametric fitting algorithms that enable 1) 3D hand pose annotation using a single image, 2) automatic conversion from 2D to 3D hand poses, and 3) accurate two-hand tracking with temporal consistency. We provide an incremental quantitative analysis of the multi-stage pipeline and show that training on our dataset achieves state-of-the-art results that significantly outperform other datasets for the task of egocentric two-hand global 3D pose estimation.
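The parametric fitting the abstract refers to can be illustrated in spirit (this is a toy sketch, not the authors' actual algorithm): given a hand's 2D keypoints and its relative 3D pose, the global position can be recovered by minimizing pinhole reprojection error. The function names and camera intrinsics below are hypothetical assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import least_squares

def project(points_3d, f=500.0, cx=320.0, cy=240.0):
    """Pinhole projection of Nx3 camera-space points to Nx2 pixel coordinates.
    Intrinsics (f, cx, cy) are arbitrary example values."""
    x = f * points_3d[:, 0] / points_3d[:, 2] + cx
    y = f * points_3d[:, 1] / points_3d[:, 2] + cy
    return np.stack([x, y], axis=1)

def fit_global_translation(kpts_2d, kpts_3d_rel, init_t=(0.0, 0.0, 0.5)):
    """Fit a global translation t (in meters) so that the root-relative 3D
    keypoints, shifted by t, reproject onto the observed 2D keypoints."""
    def residual(t):
        return (project(kpts_3d_rel + t) - kpts_2d).ravel()
    return least_squares(residual, np.asarray(init_t)).x
```

A full 2D-to-3D conversion would additionally optimize articulation parameters of a hand model against the 2D keypoints; this sketch isolates only the global-position component of such a fit.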
