Paper Title


Expected complexity analysis of stochastic direct-search

Author

Dzahini, Kwassi Joseph

Abstract


This work presents a convergence rate analysis of stochastic variants of the broad class of direct-search methods of directional type. It introduces an algorithm designed to optimize differentiable objective functions $f$ whose values can only be computed through a stochastically noisy blackbox. The proposed stochastic directional direct-search (SDDS) algorithm accepts new iterates by imposing a sufficient decrease condition on so-called probabilistic estimates of the corresponding unavailable objective function values. The accuracy of such estimates is required to hold with a sufficiently large but fixed probability $\beta$. The analysis of this method utilizes an existing supermartingale-based framework proposed for the convergence rate analysis of stochastic optimization methods that use adaptive step sizes. It aims to show that the expected number of iterations required to drive the norm of the gradient of $f$ below a given threshold $\varepsilon$ is bounded by $\mathcal{O}\!\left(\varepsilon^{-p/\min(p-1,1)}/(2\beta-1)\right)$ with $p>1$. Unlike prior analyses that use the same framework, such as those of stochastic trust-region methods and stochastic line-search methods, SDDS does not use any gradient information to find descent directions. However, its convergence rate is similar to those of the latter two methods, with a dependence on $\varepsilon$ that also matches that of the broad class of deterministic directional direct-search methods which accept new iterates by imposing a sufficient decrease condition.
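To make the mechanism in the abstract concrete, the following is a minimal illustrative sketch (not the paper's exact SDDS algorithm) of a directional direct-search iteration that polls a positive spanning set, accepts a trial point only under a sufficient decrease condition on *estimated* function values, and adapts the step size. The noisy blackbox `noisy_f`, the averaging-based estimator, and all parameter names (`c`, `p`, `theta`, `gamma`, `n_samples`) are assumptions for illustration, not the paper's notation; in particular, the paper's probabilistic estimates are characterized by an accuracy probability $\beta$, whereas here accuracy is crudely approximated by averaging a fixed number of samples.

```python
import numpy as np

def noisy_f(x, rng, sigma=1e-3):
    # Hypothetical stochastic blackbox: a smooth objective (here, a
    # simple quadratic) observed with additive Gaussian noise.
    return float(np.sum(x**2)) + rng.normal(scale=sigma)

def stochastic_direct_search(x0, alpha0=1.0, c=1e-2, p=2.0,
                             theta=0.5, gamma=2.0, n_samples=30,
                             max_iter=200, seed=0):
    """Sketch of one stochastic directional direct-search loop.

    Accepts a poll point y only if the estimated objective value drops
    by at least c * alpha**p (sufficient decrease), then expands the
    step size; otherwise the step size is contracted.
    """
    rng = np.random.default_rng(seed)
    x, alpha = np.asarray(x0, dtype=float), alpha0
    n = x.size

    def estimate(y):
        # Crude stand-in for a probabilistic estimate: average several
        # noisy blackbox evaluations to reduce the noise.
        return float(np.mean([noisy_f(y, rng) for _ in range(n_samples)]))

    for _ in range(max_iter):
        f_est = estimate(x)
        improved = False
        # Poll along a positive spanning set: the +/- coordinate axes.
        for d in np.vstack([np.eye(n), -np.eye(n)]):
            trial = x + alpha * d
            if estimate(trial) < f_est - c * alpha**p:
                # Successful iteration: accept and expand the step size.
                x, alpha, improved = trial, gamma * alpha, True
                break
        if not improved:
            alpha *= theta  # unsuccessful iteration: shrink the step size
    return x

x_final = stochastic_direct_search(np.array([2.0, -1.5]))
```

Note that, as the abstract emphasizes, no gradient information is used anywhere: descent is found purely by polling directions and comparing estimated function values.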
