Chebyshev Inertial Iteration for Accelerating Fixed-Point Iterations

10 Jan 2020  ·  Wadayama Tadashi, Takabe Satoshi ·

A novel method called the Chebyshev inertial iteration is presented for accelerating the convergence of fixed-point iterations. The Chebyshev inertial iteration can be regarded as a variant of the successive over-relaxation (SOR) or Krasnosel'skiĭ-Mann iteration that uses the inverses of the roots of a Chebyshev polynomial as iteration-dependent inertial factors. One of the most notable features of the proposed method is that it can be applied to nonlinear fixed-point iterations in addition to linear ones. Linearization around the fixed point is the key to the analysis of the local convergence rate of the proposed method. The proposed method appears particularly effective for accelerating proximal gradient methods such as ISTA. It is also proved that the proposed method can successfully accelerate almost any fixed-point iteration provided that all the eigenvalues of the Jacobian at the fixed point are real.
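As a rough illustration of the update rule sketched in the abstract, the following minimal NumPy snippet applies iteration-dependent inertial factors taken as inverses of the roots of a degree-T Chebyshev polynomial rescaled to an interval [lam_min, lam_max]. The function name, the eigenvalue bounds lam_min/lam_max (assumed bounds on the spectrum of I minus the Jacobian of f at the fixed point), and the toy linear map are assumptions for illustration only, not the authors' reference implementation.

```python
import numpy as np

def chebyshev_inertial_iteration(f, x0, lam_min, lam_max, T, n_rounds=1):
    """Sketch of a Chebyshev inertial (SOR / Krasnosel'skii-Mann style) iteration.

    Update: x <- x + omega_k * (f(x) - x), where omega_k is the inverse of the
    k-th root of the degree-T Chebyshev polynomial shifted to [lam_min, lam_max].
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_rounds):              # the T factors are reused cyclically
        for k in range(T):
            # k-th root of the Chebyshev polynomial rescaled to [lam_min, lam_max]
            root = (lam_max + lam_min) / 2.0 \
                 + (lam_max - lam_min) / 2.0 * np.cos((2 * k + 1) * np.pi / (2 * T))
            omega = 1.0 / root             # iteration-dependent inertial factor
            x = x + omega * (f(x) - x)     # inertial fixed-point update
    return x

# Toy usage with a linear fixed-point map f(x) = A x + b (hypothetical example).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = 0.9 * np.diag(rng.uniform(0.1, 1.0, 50))   # eigenvalues of I - A lie in [0.1, 0.91]
    b = rng.standard_normal(50)
    f = lambda x: A @ x + b
    x_star = np.linalg.solve(np.eye(50) - A, b)    # exact fixed point for comparison
    x_hat = chebyshev_inertial_iteration(f, np.zeros(50), 0.1, 0.91, T=8, n_rounds=5)
    print(np.linalg.norm(x_hat - x_star))
```

In this sketch the same T inertial factors are reapplied every T steps; over one such round the error is damped by a rescaled Chebyshev polynomial evaluated on the assumed eigenvalue interval, which is what yields the acceleration over a constant relaxation factor.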


Categories


Optimization and Control · Information Theory