Abstract
Automatic differentiation, as implemented today, lacks a simple mathematical model adapted to the needs of modern machine learning. In this work we articulate the relationships between differentiation of programs, as implemented in practice, and differentiation of nonsmooth functions. To this end we provide a simple class of functions and a nonsmooth calculus, and show how they apply to stochastic approximation methods. We also highlight the issue of artificial critical points created by algorithmic differentiation, and show how the usual methods avoid these points with probability one.
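As a concrete illustration of the phenomenon the abstract refers to, the sketch below (an illustrative example, not taken from the paper's text; it assumes JAX, whose relu derivative at 0 is conventionally set to 0, as in most frameworks) builds a program that computes the identity function, yet for which automatic differentiation returns derivative 0 at the origin, i.e., an artificial critical point.

```python
import jax
import jax.numpy as jnp

def identity_via_relu(x):
    # Mathematically the identity: relu(x) - relu(-x) = x for all real x,
    # so the true derivative is 1 everywhere.
    return jax.nn.relu(x) - jax.nn.relu(-x)

grad = jax.grad(identity_via_relu)
print(grad(1.0))   # 1.0 -- correct away from the kink
print(grad(-2.0))  # 1.0 -- correct away from the kink
print(grad(0.0))   # 0.0 -- artificial critical point: autodiff composes the
                   # convention relu'(0) = 0, although the function is x
```

Since such spurious values only arise on a negligible set of inputs here (the single point x = 0), this toy example is consistent with the abstract's claim that the usual stochastic methods avoid artificial critical points with probability one.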
Replaced by
Jérôme Bolte and Edouard Pauwels, “A mathematical model for automatic differentiation in machine learning”, in Advances in Neural Information Processing Systems, H. Larochelle, M. Ranzato, R. Hadsell, M.F. Balcan, and H. Lin (eds.), vol. 33, 2020.
Reference
Jérôme Bolte and Edouard Pauwels, “A mathematical model for automatic differentiation in machine learning”, TSE Working Paper, n. 21-1184, January 2021.