Restarted Halpern PDHG for Linear Programming

23 Jul 2024 · Haihao Lu, Jinwen Yang

In this paper, we propose and analyze a new matrix-free primal-dual algorithm, called restarted Halpern primal-dual hybrid gradient (rHPDHG), for solving linear programming (LP). We show that rHPDHG achieves optimal accelerated linear convergence on feasible and bounded LP. Furthermore, we present a refined analysis that demonstrates an accelerated two-stage convergence of rHPDHG over vanilla PDHG, with an improved complexity for identification and an accelerated eventual linear convergence that does not depend on the conservative global Hoffman constant. For infeasible LP, we show that rHPDHG recovers infeasibility certificates at an accelerated linear rate, improving on previously known convergence rates. Furthermore, we discuss an extension of rHPDHG that adds a reflection operation (dubbed $\mathrm{r^2HPDHG}$) and show that it shares all the theoretical guarantees of rHPDHG, with an additional factor-of-2 speedup in the complexity bound. Lastly, we build a GPU-based LP solver using rHPDHG/$\mathrm{r^2HPDHG}$, and experiments on 383 MIPLIB instances showcase improved numerical performance compared to cuPDLP.jl.
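
The following is a minimal sketch of the restarted Halpern PDHG idea for a standard-form LP $\min\, c^\top x$ s.t. $Ax = b$, $x \ge 0$. The function names, step-size choice, and restart rule below are illustrative assumptions for exposition, not the paper's exact implementation or the cuPDLP.jl code.

```python
import numpy as np

def pdhg_step(x, y, A, b, c, tau, sigma):
    """One PDHG step T(x, y) for the LP  min c^T x  s.t.  Ax = b, x >= 0."""
    x_new = np.maximum(x - tau * (c - A.T @ y), 0.0)   # projected primal step
    y_new = y + sigma * (b - A @ (2 * x_new - x))      # dual step with extrapolation
    return x_new, y_new

def restarted_halpern_pdhg(A, b, c, iters=5000, beta=0.5):
    """Illustrative restarted Halpern PDHG loop (restart rule is an assumption)."""
    m, n = A.shape
    tau = sigma = 0.9 / np.linalg.norm(A, 2)           # ensures tau*sigma*||A||^2 < 1
    x, y = np.zeros(n), np.zeros(m)
    x0, y0 = x.copy(), y.copy()                        # Halpern anchor
    res0, k = None, 0                                  # residual at last restart, inner counter
    for _ in range(iters):
        Tx, Ty = pdhg_step(x, y, A, b, c, tau, sigma)
        res = np.hypot(np.linalg.norm(Tx - x), np.linalg.norm(Ty - y))
        if res0 is None:
            res0 = res
        w = (k + 1) / (k + 2)                          # Halpern averaging toward the anchor
        x = w * Tx + (1 - w) * x0
        y = w * Ty + (1 - w) * y0
        k += 1
        # simplified restart rule: reset the anchor once the fixed-point
        # residual has shrunk by a factor beta since the last restart
        if res <= beta * res0:
            x0, y0, res0, k = x.copy(), y.copy(), res, 0
    return x, y

# tiny example: min x1 + 2*x2  s.t.  x1 + x2 = 1,  x >= 0   (optimum x = (1, 0))
A = np.array([[1.0, 1.0]]); b = np.array([1.0]); c = np.array([1.0, 2.0])
x, y = restarted_halpern_pdhg(A, b, c)
print(np.round(x, 3))
```

The key difference from vanilla PDHG is the Halpern averaging step, which pulls each iterate back toward the anchor with weight $1/(k+2)$; restarting resets that anchor to the current iterate, which is what yields the accelerated linear rates discussed in the abstract.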


Categories

Optimization and Control