Paper Title


EMOCA: Emotion Driven Monocular Face Capture and Animation

Authors

Radek Danecek, Michael J. Black, Timo Bolkart

Abstract


As 3D facial avatars become more widely used for communication, it is critical that they faithfully convey emotion. Unfortunately, the best recent methods that regress parametric 3D face models from monocular images are unable to capture the full spectrum of facial expression, such as subtle or extreme emotions. We find the standard reconstruction metrics used for training (landmark reprojection error, photometric error, and face recognition loss) are insufficient to capture high-fidelity expressions. The result is facial geometries that do not match the emotional content of the input image. We address this with EMOCA (EMOtion Capture and Animation), by introducing a novel deep perceptual emotion consistency loss during training, which helps ensure that the reconstructed 3D expression matches the expression depicted in the input image. While EMOCA achieves 3D reconstruction errors that are on par with the current best methods, it significantly outperforms them in terms of the quality of the reconstructed expression and the perceived emotional content. We also directly regress levels of valence and arousal and classify basic expressions from the estimated 3D face parameters. On the task of in-the-wild emotion recognition, our purely geometric approach is on par with the best image-based methods, highlighting the value of 3D geometry in analyzing human behavior. The model and code are publicly available at https://emoca.is.tue.mpg.de.
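The core idea of the emotion consistency loss described above is to compare the input image and the rendered 3D reconstruction not in pixel space, but in the feature space of an emotion recognition network. The following is a minimal sketch of that structure; the `emotion_features` function is a hypothetical stand-in (a fixed linear projection) for the pretrained emotion network's embedding, which the abstract does not specify.

```python
import numpy as np

def emotion_features(img: np.ndarray, W: np.ndarray) -> np.ndarray:
    # Hypothetical stand-in for a pretrained emotion network's embedding:
    # a fixed linear projection of the flattened image.
    return W @ img.ravel()

def emotion_consistency_loss(input_img: np.ndarray,
                             rendered_img: np.ndarray,
                             W: np.ndarray) -> float:
    # Squared L2 distance between the emotion embeddings of the input
    # image and of the rendered 3D reconstruction.
    diff = emotion_features(input_img, W) - emotion_features(rendered_img, W)
    return float(np.dot(diff, diff))
```

During training, this term would be added to the standard reconstruction losses (landmark reprojection, photometric, and identity losses), pushing the regressed expression parameters toward renderings whose perceived emotion matches the input.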
