Working paper

FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures

Sébastien Gadat, Yohann De Castro, and Clément Marteau

Abstract

This paper presents a novel algorithm that leverages Stochastic Gradient Descent strategies in conjunction with Random Features to augment the scalability of Conic Particle Gradient Descent (CPGD) specifically tailored for solving sparse optimisation problems on measures. By formulating the CPGD steps within a variational framework, we provide rigorous mathematical proofs demonstrating the following key findings: (i) the total variation norms of the solution measures along the descent trajectory remain bounded, ensuring stability and preventing undesirable divergence; (ii) we establish a global convergence guarantee with a convergence rate of O(log(K)/√K) over K iterations, showcasing the efficiency and effectiveness of our algorithm; (iii) additionally, we analyze and establish local control over the first-order condition discrepancy, contributing to a deeper understanding of the algorithm's behavior and reliability in practical applications.
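
To make the setup concrete, below is a minimal sketch of one stochastic conic particle gradient descent step with random Fourier features; it is illustrative only, not the authors' implementation. The quadratic objective, the cosine feature map, and all step sizes are assumptions introduced for the example. The conic structure appears as a multiplicative (mirror) update on particle masses and an additive update on particle positions, with the stochasticity coming from sampling a mini-batch of feature coordinates at each step.

import numpy as np

# Minimal sketch of stochastic CPGD with random Fourier features.
# Illustrative assumptions: least-squares objective, cosine feature map,
# and hand-picked step sizes; not the paper's exact construction.

rng = np.random.default_rng(0)
d, n_particles, n_features = 2, 50, 200

# Over-parameterized measure: mu = sum_i r_i^2 * delta_{theta_i}
r = np.ones(n_particles)                          # conic (mass) coordinates
theta = rng.uniform(-1.0, 1.0, (n_particles, d))  # particle positions

W = rng.normal(size=(n_features, d))              # random feature frequencies
b = rng.uniform(0.0, 2 * np.pi, n_features)       # random feature phases
y = rng.normal(size=n_features)                   # synthetic target

def phi(x):
    # Random Fourier feature map, phi(x) in R^{n_features}
    return np.sqrt(2.0 / n_features) * np.cos(x @ W.T + b)

def sgd_cpgd_step(r, theta, eta_r=0.05, eta_theta=0.05, batch=20):
    # Sample a mini-batch of feature coordinates (the stochastic part)
    idx = rng.choice(n_features, size=batch, replace=False)
    Phi = phi(theta)[:, idx]                       # (n_particles, batch)
    resid = (r ** 2) @ Phi - y[idx]                # residual of 0.5*||A mu - y||^2
    scale = n_features / batch                     # unbiased rescaling
    # First variation J'(mu)(theta_i) = sum_j resid_j * phi_j(theta_i)
    Jprime = scale * (Phi @ resid)                 # (n_particles,)
    # Its spatial gradient, via the analytic derivative of the cosine features
    dPhi = -np.sqrt(2.0 / n_features) * np.sin(theta @ W[idx].T + b[idx])
    grad_Jprime = scale * ((dPhi * resid) @ W[idx])  # (n_particles, d)
    # Conic updates: multiplicative (mirror) on masses, additive on positions
    r_new = r * np.exp(-eta_r * Jprime)
    theta_new = theta - eta_theta * grad_Jprime
    return r_new, theta_new

for k in range(500):
    r, theta = sgd_cpgd_step(r, theta)

One design point this sketch illustrates: because the mass update is multiplicative, every mass r_i^2 remains nonnegative throughout the descent, which mirrors the conic geometry on which control of the total variation norm in finding (i) relies.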

Reference

Sébastien Gadat, Yohann De Castro, and Clément Marteau, FastPart: Over-Parameterized Stochastic Gradient Descent for Sparse optimisation on Measures, TSE Working Paper, n. 23-1494, December 2023.

Published in

TSE Working Paper, n. 23-1494, December 2023