Nonsmooth Convex Optimization using the Specular Gradient Method with Root-Linear Convergence

30 Dec 2024 · Kiyuob Jung, Jehan Oh

In this paper, we identify a special case of the subgradient method for minimizing a one-dimensional real-valued function, which we term the specular gradient method, and show that it converges root-linearly with no assumptions beyond convexity. Furthermore, we suggest a way to implement the specular gradient method without explicitly computing specular derivatives.
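The abstract does not spell out the iteration, but the following minimal Python sketch shows the general shape of a one-dimensional subgradient-style method driven by a derivative built from the two one-sided slopes. The angle-averaging formula in `specular_derivative`, the diminishing step size, and all names here are illustrative assumptions, not the paper's actual construction or its root-linearly convergent step rule.

```python
import math

def specular_derivative(f, x, h=1e-6):
    # One-sided difference quotients approximating f'_+(x) and f'_-(x).
    right = (f(x + h) - f(x)) / h
    left = (f(x) - f(x - h)) / h
    # Assumed combination of the one-sided slopes via their angles;
    # the paper's exact definition of the specular derivative may differ.
    return math.tan((math.atan(right) + math.atan(left)) / 2.0)

def specular_gradient_method(f, x0, step=1.0, tol=1e-8, max_iter=10000):
    # Subgradient-style iteration x_{k+1} = x_k - t_k * g_k with a
    # classical diminishing step t_k = step / (k + 1); this is a generic
    # choice, not the step rule analyzed in the paper.
    x = x0
    for k in range(max_iter):
        g = specular_derivative(f, x)
        if abs(g) < tol:
            break
        x -= (step / (k + 1)) * g
    return x

# Example: minimize the nonsmooth convex function f(x) = |x - 1|;
# the iterates approach the minimizer x = 1.
print(specular_gradient_method(lambda x: abs(x - 1.0), x0=3.0))
```

Note that near the kink of |x - 1| the one-sided difference quotients straddle the minimizer, so the combined slope shrinks and the iteration settles there; a plain forward-difference gradient step would keep oscillating with a fixed step size.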


Categories


Optimization and Control · Numerical Analysis · MSC: 90C25, 49J52, 65K05, 26A27