
A new method based on the proximal bundle idea and gradient sampling technique for minimizing nonsmooth convex functions

Journal: Computational Optimization and Applications (Springer)

Abstract
In this paper, we combine the positive aspects of the gradient sampling (GS) and bundle methods, two of the most effective classes of methods in nonsmooth optimization, to develop a robust method for solving unconstrained nonsmooth convex optimization problems. The main aim of the proposed method is to exploit the advantages of both the GS and bundle methods while avoiding their drawbacks. At each iteration, to find an efficient descent direction, the GS technique is used to construct a local polyhedral model of the objective function. If necessary, this initial polyhedral model is refined through an iterative improvement process using techniques inspired by the bundle and GS methods. A convergence analysis shows that the global convergence of the method is independent of the number of gradient evaluations required to build and improve the initial polyhedral models. Thus, the presented method needs far fewer gradient evaluations than the original GS method. Furthermore, numerical simulations show that the presented method provides promising results in comparison with GS methods, especially for large-scale problems. Moreover, in contrast with some bundle methods, our method is not very sensitive to the accuracy of the supplied gradients.
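
The following is a minimal sketch, not the authors' algorithm, of the basic gradient sampling step that underlies the polyhedral model described above: gradients are sampled in a small ball around the current point, and a descent direction is taken as the negative of the minimum-norm element of their convex hull. The function names, sampling radius eps, and sample count m are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

def gs_descent_direction(grad_f, x, eps=1e-1, m=None, rng=None):
    """Approximate descent direction at x via a single gradient-sampling step (sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    n = x.size
    m = 2 * n if m is None else m
    # Sample gradients at x and at m nearby points drawn from a box of radius eps.
    pts = [x] + [x + eps * rng.uniform(-1.0, 1.0, n) for _ in range(m)]
    G = np.array([grad_f(p) for p in pts])  # rows are sampled gradients
    # Minimum-norm element of conv{g_i}: min 0.5*||G^T lam||^2 s.t. lam >= 0, sum(lam) = 1.
    k = G.shape[0]
    lam0 = np.full(k, 1.0 / k)
    res = minimize(lambda lam: 0.5 * np.dot(G.T @ lam, G.T @ lam),
                   lam0,
                   jac=lambda lam: G @ (G.T @ lam),
                   bounds=[(0.0, 1.0)] * k,
                   constraints={'type': 'eq', 'fun': lambda lam: lam.sum() - 1.0},
                   method='SLSQP')
    g_min = G.T @ res.x
    return -g_min  # descent direction when ||g_min|| > 0

# Usage example on a simple nonsmooth convex function f(x) = max_i |x_i|.
def grad_f(x):
    g = np.zeros_like(x)
    i = np.argmax(np.abs(x))
    g[i] = np.sign(x[i])
    return g

x = np.array([1.0, -0.8, 0.5])
print("direction:", gs_descent_direction(grad_f, x))

The original GS method repeats this sampling at every iteration; the method summarized in the abstract instead reuses and iteratively improves the polyhedral model built from such sampled gradients, which is why it requires far fewer gradient evaluations.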
 

Journal Papers
Month: July
Year: 2020
