Article

One-step differentiation of iterative algorithms

Jérôme Bolte, Edouard Pauwels, and Samuel Vaiter

Abstract

In appropriate frameworks, automatic differentiation is transparent to the user, at the cost of a significant computational burden when the number of operations is large. For iterative algorithms, implicit differentiation alleviates this issue but requires a custom implementation of Jacobian evaluation. In this paper, we study one-step differentiation, also known as Jacobian-free backpropagation, a method that is as easy to use as automatic differentiation and as performant as implicit differentiation for fast algorithms (e.g., superlinearly convergent optimization methods). We provide a complete theoretical approximation analysis with specific examples (Newton's method, gradient descent), along with its consequences in bilevel optimization. Several numerical examples illustrate the well-foundedness of the one-step estimator.
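The idea behind one-step differentiation can be illustrated on a toy problem: run an iterative solver to convergence without tracking derivatives, then differentiate only the last iteration, treating the converged iterate as a constant. The sketch below (not from the paper; the function names, the cube-root test problem, and the finite-difference surrogate for the derivative of the last step are illustrative choices) applies this to Newton's method for solving x³ = θ, where the exact sensitivity dx*/dθ = (1/3)θ^(-2/3) is known in closed form:

```python
def newton_step(x, theta):
    """One Newton iteration for g(x) = x**3 - theta = 0."""
    return x - (x**3 - theta) / (3 * x**2)

def solve(theta, x0=1.0, iters=20):
    """Run the solver to convergence; no derivative tracking needed here."""
    x = x0
    for _ in range(iters):
        x = newton_step(x, theta)
    return x

def one_step_derivative(theta, eps=1e-6):
    """One-step (Jacobian-free) estimator: differentiate only the final
    iteration with respect to theta, holding the iterate x fixed.
    (Central finite differences stand in for automatic differentiation.)"""
    x_final = solve(theta)
    return (newton_step(x_final, theta + eps)
            - newton_step(x_final, theta - eps)) / (2 * eps)

theta = 2.0
est = one_step_derivative(theta)
exact = theta ** (-2 / 3) / 3  # d/dtheta of theta**(1/3)
```

Because Newton's method converges superlinearly, the one-step estimate matches the implicit derivative to high accuracy; for a slow contraction such as plain gradient descent, the paper's analysis quantifies the resulting approximation error instead.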

Reference

Jérôme Bolte, Edouard Pauwels, and Samuel Vaiter, One-step differentiation of iterative algorithms, in Advances in Neural Information Processing Systems 36, A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, and S. Levine (eds.), 2023, pp. 77089–77103.

Published in

Advances in Neural Information Processing Systems 36, A. Oh, T. Naumann, A. Globerson, K. Saenko, M. Hardt, and S. Levine (eds.), 2023, pp. 77089–77103