Nonsmooth Convex Optimization using the Specular Gradient Method with Root-Linear Convergence
In this paper, we identify a special case of the subgradient method for minimizing a one-dimensional real-valued convex function, which we term the specular gradient method, and show that it converges root-linearly with no assumptions beyond convexity. Furthermore, we propose a way to implement the specular gradient method without explicitly computing specular derivatives.
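For orientation, the following is a minimal sketch of the classical one-dimensional subgradient iteration x_{k+1} = x_k - t_k g_k that the paper specializes. The diminishing step size and the subgradient oracle below are illustrative assumptions; the specular-derivative-based step rule that yields root-linear convergence is defined in the paper itself and is not reproduced here.

```python
import math

def subgradient_method_1d(subgrad, x0, steps=100):
    """Classical 1-D subgradient method: x_{k+1} = x_k - t_k * g_k,
    where g_k is any subgradient of f at x_k.

    The step size t_k = 1/sqrt(k) below is an assumed textbook choice,
    not the paper's specular gradient step rule.
    """
    x = x0
    for k in range(1, steps + 1):
        g = subgrad(x)
        if g == 0:               # a zero subgradient certifies a minimizer
            return x
        t = 1.0 / math.sqrt(k)   # assumed diminishing step size
        x = x - t * g
    return x

# Example: minimize f(x) = |x - 2|; a subgradient is sign(x - 2).
x_star = subgradient_method_1d(
    lambda x: math.copysign(1.0, x - 2) if x != 2 else 0.0,
    x0=10.0,
)
print(x_star)  # approaches the minimizer x = 2
```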
Categories
Optimization and Control
Numerical Analysis
90C25, 49J52, 65K05, 26A27