Abstract
The proximal gradient method and its variants are among the most attractive first-order algorithms for minimizing the sum of two convex functions, one of which is nonsmooth. However, the method requires the differentiable part of the objective to have a Lipschitz continuous gradient, which precludes its use in many applications. In this paper we introduce a framework that circumvents the intricate question of Lipschitz continuity of gradients by means of an elegant and easy-to-check convexity condition capturing the geometry of the constraints. This condition translates into a new descent lemma, which in turn leads to a natural derivation of the proximal-gradient scheme with Bregman distances. We then identify a new notion of asymmetry measure for Bregman distances, which is central in determining the relevant step size. These novelties allow us to prove a global sublinear rate of convergence and, as a by-product, global pointwise convergence. This opens a new path to a broad spectrum of problems arising in key applications which were, until now, considered out of reach for proximal gradient methods. We illustrate this potential by showing how our results can be applied to build new and simple schemes for Poisson inverse problems.
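To make the abstract concrete, the following is a minimal sketch, in our own notation, of the main objects it refers to: the Bregman distance, the convexity condition that replaces Lipschitz gradient continuity, the resulting descent lemma, the Bregman proximal-gradient step, and the symmetry coefficient. The step-size discussion is only indicative, and the Poisson instance assumes the paper's choice of Burg's entropy as reference function together with a Kullback-Leibler data-fidelity term.

% Bregman distance generated by a differentiable convex reference function h:
D_h(x, y) = h(x) - h(y) - \langle \nabla h(y), x - y \rangle .

% Convexity condition replacing Lipschitz gradient continuity
% ("L-smoothness of f relative to h"):
L h - f \ \text{is convex on } \operatorname{int} \operatorname{dom} h .

% Equivalent descent lemma, with D_h in place of the usual squared Euclidean distance:
f(x) \le f(y) + \langle \nabla f(y), x - y \rangle + L \, D_h(x, y) .

% Bregman proximal-gradient step for minimizing f + g, with step size \lambda:
x^{k+1} \in \operatorname*{argmin}_{x} \Big\{ g(x)
    + \langle \nabla f(x^k), x - x^k \rangle
    + \tfrac{1}{\lambda} D_h(x, x^k) \Big\} .

% Symmetry coefficient of D_h; since D_h is asymmetric in general,
% this quantity governs the admissible range of \lambda:
\alpha(h) = \inf \left\{ \frac{D_h(x, y)}{D_h(y, x)} : x \neq y \right\} \in [0, 1] .

% Poisson inverse problems: f is the Kullback--Leibler data-fidelity term and
% h(x) = -\sum_j \log x_j (Burg's entropy); the convexity condition above then
% holds with L of the order of \|b\|_1, where b is the observed count vector.

In the convex setting, taking the step size λ of the order of 1/L recovers the global sublinear O(1/k) rate announced in the abstract.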
Keywords
first-order methods; composite nonsmooth convex minimization; descent lemma; proximal-gradient algorithms; complexity; Bregman distance; multiplicative Poisson linear inverse problems
Reference
H.H. Bauschke, Jérôme Bolte, and Marc Teboulle, "A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications", Mathematics of Operations Research, vol. 42, no. 2, May 2017, pp. 330–348.
Published in
Mathematics of Operations Research, vol. 42, no. 2, May 2017, pp. 330–348