On the convergence rate of the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm

29 Nov 2017  ·  S. Costa Lima, M. Marques Alves

Relying on fixed point techniques, Mahey, Oualibouch and Tao introduced the scaled proximal decomposition on the graph of a maximal monotone operator (SPDG) algorithm and analyzed its performance on inclusion problems with strongly monotone and Lipschitz continuous operators. The SPDG algorithm generalizes Spingarn's partial inverse method by allowing scaling factors, a key strategy for speeding up the convergence of numerical algorithms. In this note, we show that the SPDG algorithm can alternatively be analyzed within the original partial inverse framework introduced by Spingarn in 1983. We simply show that, under the assumptions considered by Mahey, Oualibouch and Tao, the Spingarn partial inverse of the underlying maximal monotone operator is strongly monotone, which allows one to employ recent results on the convergence and iteration complexity of proximal point type methods for strongly monotone operators. In doing so, we additionally obtain a potentially faster convergence rate for the SPDG algorithm and a tighter upper bound on the number of iterations needed to achieve prescribed tolerances, especially on ill-conditioned problems.
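
For reference, here is a minimal sketch of the objects behind this observation, written in standard notation that is assumed here rather than taken from the page. Let $T$ be a maximal monotone operator on a Hilbert space $H$, let $V \subseteq H$ be a closed subspace, and let $P_V$, $P_{V^\perp}$ denote the orthogonal projections onto $V$ and $V^\perp$. Spingarn's partial inverse $T_V$ is the maximal monotone operator whose graph is obtained by swapping the $V^\perp$-components of points in the graph of $T$:

$$v \in T_V(z) \iff P_V v + P_{V^\perp} z \in T\bigl(P_V z + P_{V^\perp} v\bigr).$$

The claim underlying the note is that if $T$ is $\mu$-strongly monotone and $L$-Lipschitz continuous, then $T_V$ is itself strongly monotone, i.e. there is a modulus $\hat\mu > 0$, depending on $\mu$ and $L$ (its precise value is derived in the paper), such that

$$\langle z - z',\, v - v' \rangle \ge \hat\mu\, \|z - z'\|^2 \quad \text{whenever } v \in T_V(z),\ v' \in T_V(z'),$$

so that linear convergence results for proximal point type methods applied to strongly monotone operators carry over directly to the SPDG iterates.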

Categories

Optimization and Control