Subgradient sampling for nonsmooth nonconvex minimization - IRIT - Université Toulouse III Paul Sabatier
Preprint, Working Paper. Year: 2023

Subgradient sampling for nonsmooth nonconvex minimization

Jérôme Bolte
  • Role: Author
  • PersonId : 995617
Tam Le
  • Role: Author
  • PersonId : 752715
  • IdHAL : tam-le

Abstract

Risk minimization for nonsmooth nonconvex problems naturally leads to first-order sampling or, by an abuse of terminology, to stochastic subgradient descent. We establish the convergence of this method in the path-differentiable case, and describe more precise results under additional geometric assumptions. We recover and improve results of Ermoliev and Norkin by using a different approach: conservative calculus and the ODE method. In the definable case, we show that first-order subgradient sampling avoids artificial critical points with probability one and moreover applies to a large range of risk minimization problems in deep learning, based on the backpropagation oracle. As byproducts of our approach, we obtain several results on integration of independent interest, such as an interchange result for conservative derivatives and integrals, and the definability of set-valued parameterized integrals.
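The method described in the abstract draws a fresh sample at each iteration and steps along a subgradient of the sampled loss with vanishing step sizes. The following is a minimal illustrative sketch of this generic scheme, not the authors' algorithm: the loss |x - ξ| with ξ uniform on [0, 1] (risk minimized at the median 0.5) and the step size α_k = 1/k are assumptions chosen for the example.

```python
import random

def subgradient_of_sample(x, xi):
    # A subgradient in x of the sampled nonsmooth loss l(x, xi) = |x - xi|;
    # at x == xi any element of [-1, 1] is valid, we pick 0.
    if x > xi:
        return 1.0
    if x < xi:
        return -1.0
    return 0.0

def subgradient_sampling(x0, draw, steps=10000, seed=0):
    """First-order subgradient sampling: at each iteration draw a fresh
    sample xi and move against a subgradient of the sampled loss, with
    step sizes alpha_k = 1/k (vanishing, non-summable)."""
    rng = random.Random(seed)
    x = x0
    for k in range(1, steps + 1):
        xi = draw(rng)
        x -= (1.0 / k) * subgradient_of_sample(x, xi)
    return x

# Minimizing the risk E|x - xi| with xi ~ Uniform[0, 1]; the iterates
# approach the median 0.5.
x_star = subgradient_sampling(5.0, lambda rng: rng.random())
```

The vanishing step sizes are what connects the discrete iterates to the ODE method used in the paper: as α_k shrinks, the iterates track trajectories of a differential inclusion driven by the (conservative) subgradient field.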
Main file

subgradientSampling.pdf (453.91 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03579383 , version 1 (18-02-2022)
hal-03579383 , version 2 (25-02-2022)
hal-03579383 , version 3 (28-10-2022)
hal-03579383 , version 4 (26-01-2023)
hal-03579383 , version 5 (09-03-2023)

Identifiers

Cite

Jérôme Bolte, Tam Le, Edouard Pauwels. Subgradient sampling for nonsmooth nonconvex minimization. 2023. ⟨hal-03579383v5⟩
474 Views
298 Downloads

