Generalized Damped Newton Algorithms in Nonsmooth Optimization via Second-Order Subdifferentials

26 Jan 2021 · Pham Duy Khanh, Boris Mordukhovich, Vo Thanh Phat, Dat Ba Tran

The paper proposes and develops new globally convergent algorithms of the generalized damped Newton type for solving important classes of nonsmooth optimization problems. These algorithms are based on the theory and calculation of second-order subdifferentials of nonsmooth functions, employing the machinery of second-order variational analysis and generalized differentiation. First, we develop a globally superlinearly convergent damped Newton-type algorithm for the class of continuously differentiable functions with Lipschitzian gradients, which may fail to be twice differentiable and are therefore nonsmooth of second order. Then we design a globally convergent algorithm of the same type to solve a structured class of nonsmooth quadratic composite problems with extended-real-valued cost functions, which typically arise in machine learning and statistics. Finally, we present the results of numerical experiments on an important class of Lasso problems and compare the performance of our main algorithm with that achieved by other first-order and second-order optimization algorithms.
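The abstract gives no pseudocode; as a rough illustration of the damped Newton template it describes, here is a minimal sketch in Python. It uses the classical Hessian as a stand-in for the second-order subdifferential (a simplification valid only on twice-differentiable examples) and Armijo backtracking for the damping step. The name damped_newton and the parameters sigma and beta are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def damped_newton(f, grad, hess, x0, tol=1e-8, sigma=1e-4, beta=0.5, max_iter=200):
    """Sketch of a damped Newton method with Armijo backtracking.

    `hess` plays the role of a second-order surrogate; in the paper's
    setting it would be an element of the second-order subdifferential
    of a C^{1,1} function (classical Hessian used here for simplicity).
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        # Newton direction: solve H d = -g with the second-order surrogate.
        d = np.linalg.solve(hess(x), -g)
        # Armijo backtracking: the "damping" that yields global convergence.
        t = 1.0
        while f(x + t * d) > f(x) + sigma * t * (g @ d):
            t *= beta
        x = x + t * d
    return x

# Usage on a smooth strongly convex quadratic (illustration only):
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
hess = lambda x: A
print(damped_newton(f, grad, hess, np.zeros(2)))  # converges to A^{-1} b
```

On this quadratic the full Newton step (t = 1) already satisfies the Armijo test, so the method terminates in one iteration; the backtracking loop matters on functions whose curvature varies, which is the regime the paper's globalization analysis addresses.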


Categories

Optimization and Control