
A Gradient Sampling Method Based on Ideal Direction for Solving Nonsmooth Optimization Problems

Journal: Journal of Optimization Theory and Applications (Springer)

Abstract
In this paper, a modification of the original gradient sampling method for minimizing nonsmooth, nonconvex functions is presented. One computational component of the gradient sampling method is the need to solve a quadratic optimization problem at each iteration, which can be time-consuming, especially for large-scale objectives. To resolve this difficulty, this study proposes a new descent direction that requires no quadratic or linear subproblem. It is shown that this direction satisfies the Armijo step size condition. We also prove that, under the proposed modifications, the global convergence of the gradient sampling method is preserved. Moreover, under some moderate assumptions, an upper bound on the number of serious iterations is presented. Using this upper bound, we develop a different strategy for studying the convergence of the method. We also demonstrate the efficiency of the proposed method on small-, medium-, and large-scale problems in our numerical experiments.
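To make the setting concrete, the following is a minimal sketch of the classical gradient sampling idea that the paper builds on: sample gradients at random points in a small ball around the current iterate, form a descent direction from them, and take an Armijo backtracking step. The test objective `f` (the l1 norm), the sample count `m`, the sampling radius `eps`, and the use of the negative averaged sampled gradient as a cheap surrogate direction are all illustrative assumptions; the classical method instead solves a quadratic program for the minimum-norm convex combination of the sampled gradients, and the paper's contribution is a different, QP-free "ideal direction" not reproduced here.

```python
import numpy as np

def f(x):
    # nonsmooth test objective (illustrative): the l1 norm, with kinks on the axes
    return np.sum(np.abs(x))

def grad_f(x):
    # gradient of f where it is differentiable (almost everywhere)
    return np.sign(x)

def gradient_sampling_step(x, eps=0.1, m=20, beta=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    # sample gradients at m random points in an eps-ball (here: eps-box) around x
    pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
    G = np.array([grad_f(p) for p in pts])
    # surrogate direction: negative mean of sampled gradients
    # (placeholder for the min-norm QP solution of the classical method,
    #  or the paper's QP-free ideal direction)
    g = G.mean(axis=0)
    d = -g
    # Armijo backtracking line search: halve t until sufficient decrease holds
    t = 1.0
    while f(x + t * d) > f(x) + beta * t * np.dot(g, d) and t > 1e-12:
        t *= 0.5
    return x + t * d

x = np.array([1.5, -2.0])
for _ in range(100):
    x = gradient_sampling_step(x)
```

Averaging sampled gradients smooths the kinks in `f` near the current point, which is why a plain (sub)gradient step would stall there while the sampled direction still makes progress toward the minimizer at the origin.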


Journal Papers
Month/Season: August
Year: 2020
