Paper Title

Sample-Efficient Optimization in the Latent Space of Deep Generative Models via Weighted Retraining

Authors

Tripp, Austin, Daxberger, Erik, Hernández-Lobato, José Miguel

Abstract


Many important problems in science and engineering, such as drug design, involve optimizing an expensive black-box objective function over a complex, high-dimensional, and structured input space. Although machine learning techniques have shown promise in solving such problems, existing approaches substantially lack sample efficiency. We introduce an improved method for efficient black-box optimization, which performs the optimization in the low-dimensional, continuous latent manifold learned by a deep generative model. In contrast to previous approaches, we actively steer the generative model to maintain a latent manifold that is highly useful for efficiently optimizing the objective. We achieve this by periodically retraining the generative model on the data points queried along the optimization trajectory, as well as weighting those data points according to their objective function value. This weighted retraining can be easily implemented on top of existing methods, and is empirically shown to significantly improve their efficiency and performance on synthetic and real-world optimization problems.
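The core idea above — retraining the generative model on queried points while weighting them by objective value — can be sketched in a few lines. The rank-based weighting below and the loop structure are illustrative assumptions, not the authors' exact implementation; `train_generative_model`, `latent_optimize_step`, and `objective` are hypothetical placeholders:

```python
import numpy as np

def rank_weights(scores, k=1e-3):
    """Rank-based sampling weights for weighted retraining.

    Assigns w_i proportional to 1 / (k*N + rank_i), where rank 0 is the
    best (highest-objective) point, so high-value points dominate
    retraining. The exact functional form is an assumption for
    illustration.
    """
    scores = np.asarray(scores, dtype=float)
    n = len(scores)
    # rank 0 = highest objective value (maximization convention)
    ranks = np.argsort(np.argsort(-scores))
    w = 1.0 / (k * n + ranks)
    return w / w.sum()  # normalize to a probability distribution

# Hypothetical outer loop (placeholders, not a real API):
#
# for t in range(num_iterations):
#     w = rank_weights([objective(x) for x in data])
#     model = train_generative_model(data, sample_weight=w)  # weighted retrain
#     z_new = latent_optimize_step(model, data)              # optimize in latent space
#     data.append(model.decode(z_new))                       # query new point
```

Smaller `k` concentrates weight on the top-ranked points; as `k` grows large, the weights flatten toward uniform and the scheme reduces to ordinary (unweighted) retraining.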
