Paper Title
Style Your Hair: Latent Optimization for Pose-Invariant Hairstyle Transfer via Local-Style-Aware Hair Alignment
Paper Authors
Paper Abstract
Editing hairstyle is unique and challenging due to the complexity and delicacy of hairstyle. Although recent approaches significantly improved the hair details, these models often produce undesirable outputs when a pose of a source image is considerably different from that of a target hair image, limiting their real-world applications. HairFIT, a pose-invariant hairstyle transfer model, alleviates this limitation yet still shows unsatisfactory quality in preserving delicate hair textures. To solve these limitations, we propose a high-performing pose-invariant hairstyle transfer model equipped with latent optimization and a newly presented local-style-matching loss. In the StyleGAN2 latent space, we first explore a pose-aligned latent code of a target hair with the detailed textures preserved based on local style matching. Then, our model inpaints the occlusions of the source considering the aligned target hair and blends both images to produce a final output. The experimental results demonstrate that our model has strengths in transferring a hairstyle under larger pose differences and preserving local hairstyle textures.
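The abstract describes optimizing a latent code so that the generated hair matches the target hair's local style. The sketch below illustrates that general idea only: a latent vector is refined by gradient descent against a masked, Gram-matrix style loss. The generator, feature extractor, masks, iteration count, and loss form here are placeholders for illustration, not the authors' actual StyleGAN2 pipeline or local-style-matching loss.

```python
# Minimal sketch of latent optimization with a local (masked) style-matching loss.
# All networks and tensors below are hypothetical stand-ins, not the paper's models.
import torch
import torch.nn as nn
import torch.nn.functional as F

def gram_matrix(feat: torch.Tensor) -> torch.Tensor:
    """Channel-wise Gram matrix of a (B, C, H, W) feature map."""
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def local_style_loss(gen_feat, tgt_feat, gen_mask, tgt_mask):
    """Compare style statistics only inside the hair masks."""
    gen_mask = F.interpolate(gen_mask, size=gen_feat.shape[-2:], mode="nearest")
    tgt_mask = F.interpolate(tgt_mask, size=tgt_feat.shape[-2:], mode="nearest")
    return F.mse_loss(gram_matrix(gen_feat * gen_mask),
                      gram_matrix(tgt_feat * tgt_mask))

# Hypothetical stand-ins for a pretrained StyleGAN2 generator and a feature extractor.
generator = nn.Sequential(
    nn.Linear(512, 3 * 64 * 64), nn.Tanh(), nn.Unflatten(1, (3, 64, 64)))
feature_net = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU())

target_img = torch.rand(1, 3, 64, 64)      # target hair image (placeholder)
gen_hair_mask = torch.ones(1, 1, 64, 64)   # hair segmentation masks (placeholders)
tgt_hair_mask = torch.ones(1, 1, 64, 64)

# Optimize the latent code so the generated hair matches the target's local style.
w = torch.randn(1, 512, requires_grad=True)
optimizer = torch.optim.Adam([w], lr=0.01)
with torch.no_grad():
    tgt_feat = feature_net(target_img)

for step in range(200):
    optimizer.zero_grad()
    out = generator(w)
    loss = local_style_loss(feature_net(out), tgt_feat, gen_hair_mask, tgt_hair_mask)
    loss.backward()
    optimizer.step()
```

The subsequent inpainting of source occlusions and blending with the aligned target hair, as described in the abstract, are not shown here.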