Paper Title

Low Regret Binary Sampling Method for Efficient Global Optimization of Univariate Functions

Authors

Kaan Gokcesu, Hakan Gokcesu

Abstract

In this work, we propose a computationally efficient algorithm for the global optimization of univariate loss functions. For performance evaluation, we study the cumulative regret of the algorithm instead of the simple regret between our best query and the optimal value of the objective function. Although our approach achieves regret guarantees similar to those of traditional lower-bounding algorithms such as the Piyavskii-Shubert method for Lipschitz continuous or Lipschitz smooth functions, it has a major computational cost advantage. In the Piyavskii-Shubert method, the query points can be hard to determine for certain types of functions, as they are themselves solutions to additional optimization problems. This issue is circumvented in our binary sampling approach, where the sampling set is predetermined irrespective of the function characteristics. For a search space of $[0,1]$, our approach has at most $L\log(3T)$ and $2.25H$ regret for $L$-Lipschitz continuous and $H$-Lipschitz smooth functions, respectively. We also analytically extend our results to a broader class of functions covering more complex regularity conditions.
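The abstract's key computational point is that the query points are predetermined rather than obtained by solving auxiliary optimization problems. The paper's exact sampling rule is not given here, so the following is only a minimal sketch under the assumption that "binary sampling" queries the dyadic midpoints of $[0,1]$ level by level ($1/2$, then $1/4, 3/4$, then $1/8, \dots, 7/8$, and so on); the function names and the example loss are illustrative, not from the paper.

```python
def dyadic_points(depth):
    """Predetermined binary (dyadic) sampling points of [0, 1], level by level:
    level 0 -> 1/2; level 1 -> 1/4, 3/4; level 2 -> 1/8, 3/8, 5/8, 7/8; ...
    The set is fixed in advance and does not depend on the loss function."""
    points = []
    for d in range(depth):
        denom = 2 ** (d + 1)
        # Odd numerators k/denom are the new midpoints at this level.
        points.extend(k / denom for k in range(1, denom, 2))
    return points


def cumulative_regret(f, f_star, depth):
    """Cumulative regret sum_t (f(x_t) - f*) over the predetermined queries,
    where f* is the global minimum of f on [0, 1]."""
    return sum(f(x) - f_star for x in dyadic_points(depth))


# Illustrative 1-Lipschitz loss with minimum f* = 0 at x = 0.3.
f = lambda x: abs(x - 0.3)
print(cumulative_regret(f, 0.0, 4))
```

Because the sampling set is fixed, each query costs O(1) to produce, in contrast to the Piyavskii-Shubert method, where each query point minimizes a piecewise lower-bounding envelope built from all previous evaluations.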
