Supplement to "An Iterative Smoothing Algorithm for Regression with Structured Sparsity"
Report (Research Report) Year: 2016


Abstract

High-dimensional prediction models are increasingly used to analyze biological data such as neuroimaging or genetic data sets. However, classical penalized algorithms yield dense solutions that are difficult to interpret without arbitrary thresholding. Alternatives based on sparsity-inducing penalties suffer from coefficient instability. Complex structured sparsity-inducing penalties are a promising approach to force the solution to adhere to domain-specific constraints and thus offer new perspectives in biomarker identification. We propose a generic optimization framework that can combine any smooth convex loss function with (i) penalties whose proximal operator is known and (ii) a large range of complex, non-smooth convex structured penalties such as total variation or overlapping group lasso. Although many papers have addressed a similar goal, few have tackled it in such a generic way and in the context of high-dimensional data. The proposed continuation algorithm, called \textit{CONESTA}, dynamically smooths the complex penalties to avoid the computation of proximal operators that are either unknown or expensive to compute. The decreasing sequence of smoothing parameters is dynamically adapted, using the duality gap, in order to maintain the optimal convergence rate towards any globally desired precision, with a duality-gap guarantee. First, we demonstrate, on both simulated data and experimental MRI data, that CONESTA outperforms the excessive gap method, ADMM, proximal gradient smoothing (without continuation) and inexact FISTA in terms of convergence speed and/or precision of the solution. Second, on the experimental MRI data set, we establish the superiority of structured sparsity-inducing penalties ($\ell_1$ and total variation) over non-structured methods in terms of the recovery of meaningful and stable groups of predictive variables.
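The continuation idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it solves a 1-D least-squares problem with a total-variation penalty by Nesterov smoothing plus FISTA, using a fixed geometric smoothing schedule (the actual CONESTA algorithm adapts the sequence of smoothing parameters via the duality gap). All function and parameter names are illustrative.

```python
import numpy as np

def conesta_sketch(X, y, lam, mu0=1.0, n_cont=5, n_inner=300):
    """Continuation over a decreasing smoothing sequence (geometric here;
    CONESTA itself adapts the sequence using the duality gap)."""
    n, p = X.shape
    # 1-D first-difference operator, so that TV(beta) = ||A beta||_1
    A = np.diff(np.eye(p), axis=0)
    beta = np.zeros(p)
    LX = np.linalg.norm(X, 2) ** 2          # Lipschitz part from the loss
    LA = np.linalg.norm(A, 2) ** 2          # spectral norm of the TV operator
    mu = mu0
    for _ in range(n_cont):
        L = LX + lam * LA / mu              # Lipschitz constant of the smoothed objective
        z, t = beta.copy(), 1.0
        for _ in range(n_inner):            # FISTA on the smoothed problem
            # Maximizer of the Nesterov-smoothed dual of the TV term
            alpha = np.clip(A @ z / mu, -1.0, 1.0)
            grad = X.T @ (X @ z - y) + lam * (A.T @ alpha)
            beta_new = z - grad / L
            t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
            z = beta_new + (t - 1) / t_new * (beta_new - beta)
            beta, t = beta_new, t_new
        mu *= 0.5                           # shrink the smoothing (simplified schedule)
    return beta
```

On a noiseless piecewise-constant signal, e.g. `beta_true = np.concatenate([np.zeros(10), np.ones(10)])` with a random Gaussian design, `conesta_sketch(X, X @ beta_true, lam=0.1)` recovers the step closely, since each stage warm-starts the next with a smaller smoothing parameter.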
Main file

ols_nestv_supp.pdf (397.63 KB)
Origin: Files produced by the author(s)

Dates and versions

cea-01324021 , version 1 (31-05-2016)
cea-01324021 , version 2 (05-10-2016)
cea-01324021 , version 3 (21-11-2016)
cea-01324021 , version 4 (22-04-2018)

Identifiers

  • HAL Id : cea-01324021 , version 1

Cite

Fouad Hadj-Selem, Tommy Löfstedt, Vincent Frouin, Vincent Guillemot, Edouard Duchesnay. Supplement to "An Iterative Smoothing Algorithm for Regression with Structured Sparsity". [Research Report] NeuroSpin, CEA, Paris-Saclay, Gif-sur-Yvette, France. 2016. ⟨cea-01324021v1⟩

Collections

LARA
882 Views
535 Downloads
