Paper Title

DIG: Draping Implicit Garment over the Human Body

Paper Authors

Ren Li, Benoît Guillard, Edoardo Remelli, Pascal Fua

Paper Abstract

Existing data-driven methods for draping garments over human bodies, despite being effective, cannot handle garments of arbitrary topology and are typically not end-to-end differentiable. To address these limitations, we propose an end-to-end differentiable pipeline that represents garments using implicit surfaces and learns a skinning field conditioned on the shape and pose parameters of an articulated body model. To limit body-garment interpenetrations and artifacts, we propose an interpenetration-aware pre-processing strategy for the training data and a novel training loss that penalizes self-intersections while draping garments. We demonstrate that our method yields more accurate results for garment reconstruction and deformation than state-of-the-art methods. Furthermore, we show that our method, thanks to its end-to-end differentiability, allows us to recover body and garment parameters jointly from image observations, something that previous work could not do.
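
To make the two ingredients named in the abstract concrete, the snippet below is a minimal, hypothetical PyTorch-style sketch of (a) a learned skinning field that deforms points sampled on the implicit garment surface via linear blend skinning, and (b) a penalty for garment points that end up inside the body. The callables `skinning_net` and `body_sdf`, and the tensor shapes, are illustrative assumptions, not the authors' implementation.

```python
# Minimal, hypothetical sketch (not the authors' code) of the two ideas the
# abstract mentions: a learned skinning field that deforms points sampled on
# the implicit garment surface, and a penalty for garment points that end up
# inside the body. `skinning_net` and `body_sdf` are placeholder callables.
import torch


def drape(points, skinning_net, bone_transforms, beta, theta):
    """Deform canonical garment points with linear blend skinning.

    points:          (N, 3) points on the canonical (unposed) garment surface
    skinning_net:    callable -> (N, J) skinning logits over J body joints
    bone_transforms: (J, 4, 4) rigid transforms of the posed body joints
    beta, theta:     body shape and pose parameters the field is conditioned on
    """
    weights = torch.softmax(skinning_net(points, beta, theta), dim=-1)   # (N, J)
    homog = torch.cat([points, torch.ones_like(points[:, :1])], dim=-1)  # (N, 4)
    blended = torch.einsum("nj,jab->nab", weights, bone_transforms)      # (N, 4, 4)
    return torch.einsum("nab,nb->na", blended, homog)[:, :3]             # (N, 3)


def interpenetration_loss(draped_points, body_sdf, margin=0.0):
    """Penalize garment points whose signed distance to the body is negative."""
    d = body_sdf(draped_points)          # (N,) signed distances, negative = inside
    return torch.relu(margin - d).mean()
```

Because every step in such a sketch is differentiable, gradients of the draping and interpenetration terms can flow back to the body shape and pose parameters, which is the property the abstract exploits to fit body and garment jointly to image observations.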
