Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient

We study the linear convergence of the primal-dual hybrid gradient method. After a review of current analyses, we show that they do not properly explain the algorithm's behavior, even on the simplest problems. We thus introduce the quadratic error bound of the smoothed gap, a new regularity...

Bibliographic Details
Main Author: Fercoq, Olivier
Format: Article
Language: English
Published: Université de Montpellier, 2023-08-01
Series: Open Journal of Mathematical Optimization
Subjects: linear convergence; primal-dual algorithm; error bound; restart
Online Access: https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.26/
_version_ 1825204974844379136
author Fercoq, Olivier
author_facet Fercoq, Olivier
author_sort Fercoq, Olivier
collection DOAJ
description We study the linear convergence of the primal-dual hybrid gradient method. After a review of current analyses, we show that they do not properly explain the algorithm's behavior, even on the simplest problems. We thus introduce the quadratic error bound of the smoothed gap, a new regularity assumption that holds for a wide class of optimization problems. Equipped with this tool, we prove tighter convergence rates. Then, we show that averaging and restarting the primal-dual hybrid gradient allows us to better leverage the regularity constant. Numerical experiments on linear and quadratic programs, ridge regression, and image denoising illustrate the findings of the paper.
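To make the described technique concrete, here is a minimal sketch (not the paper's exact scheme; step sizes and the restart period are illustrative choices) of the primal-dual hybrid gradient with iterate averaging and periodic restarts, applied to ridge regression, one of the problems mentioned in the abstract:

```python
import numpy as np

# Hedged sketch: PDHG (Chambolle-Pock form) with running averages and
# periodic restarts from the averaged point, applied to ridge regression
#   min_x 0.5*||A x - b||^2 + 0.5*lam*||x||^2,
# written as the saddle-point problem
#   min_x max_y <A x - b, y> - 0.5*||y||^2 + 0.5*lam*||x||^2.

def pdhg_restarted_avg(A, b, lam, n_iters=2000, restart_every=100):
    m, n = A.shape
    L = np.linalg.norm(A, 2)            # operator norm of A
    tau = sigma = 0.9 / L               # ensures tau * sigma * ||A||^2 < 1
    x = np.zeros(n); y = np.zeros(m); x_bar = x.copy()
    x_avg = np.zeros(n); y_avg = np.zeros(m); count = 0
    for _ in range(n_iters):
        # dual step: prox of f*(y) = 0.5*||y||^2 + <b, y>
        y = (y + sigma * (A @ x_bar - b)) / (1.0 + sigma)
        # primal step: prox of g(x) = 0.5*lam*||x||^2
        x_new = (x - tau * (A.T @ y)) / (1.0 + tau * lam)
        x_bar = 2.0 * x_new - x         # extrapolation
        x = x_new
        # accumulate running averages of the iterates
        x_avg += x; y_avg += y; count += 1
        if count == restart_every:      # restart from the averaged point
            x = x_bar = x_avg / count
            y = y_avg / count
            x_avg[:] = 0.0; y_avg[:] = 0.0; count = 0
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 10)); b = rng.standard_normal(30); lam = 0.5
x_pdhg = pdhg_restarted_avg(A, b, lam)
# closed-form ridge solution for comparison
x_star = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
```

Because both the primal and dual objectives here are strongly convex, this instance converges linearly, and restarting from the averaged iterate is the mechanism the paper analyzes for leveraging the regularity constant.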
format Article
id doaj-art-62f117ff8f404b5c86528e7f01f799fa
institution Kabale University
issn 2777-5860
language English
publishDate 2023-08-01
publisher Université de Montpellier
record_format Article
series Open Journal of Mathematical Optimization
spelling doaj-art-62f117ff8f404b5c86528e7f01f799fa 2025-02-07T14:02:56Z eng Université de Montpellier Open Journal of Mathematical Optimization 2777-5860 2023-08-01 4 1 34 10.5802/ojmo.26 10.5802/ojmo.26 Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient Fercoq, Olivier 0 LTCI, Télécom Paris, Institut Polytechnique de Paris, France We study the linear convergence of the primal-dual hybrid gradient method. After a review of current analyses, we show that they do not properly explain the algorithm's behavior, even on the simplest problems. We thus introduce the quadratic error bound of the smoothed gap, a new regularity assumption that holds for a wide class of optimization problems. Equipped with this tool, we prove tighter convergence rates. Then, we show that averaging and restarting the primal-dual hybrid gradient allows us to better leverage the regularity constant. Numerical experiments on linear and quadratic programs, ridge regression, and image denoising illustrate the findings of the paper. https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.26/ linear convergence; primal-dual algorithm; error bound; restart
spellingShingle Fercoq, Olivier
Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
Open Journal of Mathematical Optimization
linear convergence
primal-dual algorithm
error bound
restart
title Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
title_full Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
title_fullStr Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
title_full_unstemmed Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
title_short Quadratic error bound of the smoothed gap and the restarted averaged primal-dual hybrid gradient
title_sort quadratic error bound of the smoothed gap and the restarted averaged primal dual hybrid gradient
topic linear convergence
primal-dual algorithm
error bound
restart
url https://ojmo.centre-mersenne.org/articles/10.5802/ojmo.26/
work_keys_str_mv AT fercoqolivier quadraticerrorboundofthesmoothedgapandtherestartedaveragedprimaldualhybridgradient