Title
Three-point Step Size Gradient Method with Relaxed Generalized Armijo Step Size Rule
Authors
Abstract
Based on the differences of points and the differences of gradients over the most recent three iterations, together with Taylor's theorem, two forms of the quasi-Newton equation at the current iteration are constructed. Using these two forms of the quasi-Newton equation and the method of least squares, a three-point step size gradient method (TBB) for solving unconstrained optimization problems is proposed. It is proved, using the relaxed generalized Armijo step size rule, that the new method is globally convergent when the gradient function is uniformly continuous. Moreover, when the objective function is pseudo-convex (quasi-convex), the new method enjoys stronger convergence results. In addition, under suitable assumptions, the new method is shown to be superlinearly and linearly convergent. Although multi-point information is used, TBB retains the simplicity, low memory requirements, and reliance on only first-order information of gradient methods, so it is well suited to large-scale optimization problems. Numerical experiments confirm the efficiency and robustness of TBB and support the analysis.
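The abstract describes a gradient method whose step size is obtained by a least-squares fit to secant information from the last three iterates, safeguarded by an Armijo-type line search. The sketch below is one plausible interpretation, not the paper's exact algorithm: the step size minimizes the sum of squared secant residuals over the two most recent difference pairs (a BB-like formula), and a standard Armijo backtracking rule is used in place of the paper's relaxed generalized rule.

```python
import numpy as np

def tbb_gradient_method(f, grad, x0, tol=1e-6, max_iter=1000,
                        rho=0.5, c=1e-4):
    """Hedged sketch of a three-point step size gradient method.

    alpha_k solves  min_alpha  sum_i ||s_i - alpha * y_i||^2
    over the difference pairs s_i = x_{i+1} - x_i, y_i = g_{i+1} - g_i
    from the last three iterates, giving
        alpha = (sum s_i.y_i) / (sum y_i.y_i),
    then a plain Armijo backtracking safeguard is applied.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    pairs = []          # up to two recent (s_i, y_i) secant pairs
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Least-squares step size from the stored secant pairs.
        alpha = 1.0
        if pairs:
            num = sum(s @ y for s, y in pairs)
            den = sum(y @ y for s, y in pairs)
            if den > 0 and num > 0:
                alpha = num / den
        # Armijo backtracking (standard rule, used here as a stand-in
        # for the relaxed generalized Armijo rule of the paper).
        t, fx = alpha, f(x)
        while f(x - t * g) > fx - c * t * (g @ g):
            t *= rho
        x_new = x - t * g
        g_new = grad(x_new)
        pairs.append((x_new - x, g_new - g))
        if len(pairs) > 2:   # keep differences spanning three points
            pairs.pop(0)
        x, g = x_new, g_new
    return x

# Usage on a convex quadratic f(x) = x' A x / 2 with A = diag(1, 10).
A = np.diag([1.0, 10.0])
f = lambda x: 0.5 * (x @ A @ x)
grad = lambda x: A @ x
x_star = tbb_gradient_method(f, grad, np.array([5.0, 2.0]))
```

On this ill-conditioned quadratic the least-squares step size plays the same role as the Barzilai-Borwein step: it injects curvature information without storing a Hessian approximation, which is what keeps the memory cost at a few vectors.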