
Adaptive structured noise injection for shallow and deep neural networks

Abstract: Dropout is a regularisation technique for neural network training in which unit activations are independently set to zero with a given probability. In this work, we propose a generalisation of dropout and other multiplicative noise injection schemes for shallow and deep neural networks, where the random noise applied to different units is not independent but follows a joint distribution that is either fixed or estimated during training. We provide theoretical insights into why such adaptive structured noise injection (ASNI) may be relevant, and empirically confirm that it boosts the accuracy of simple feedforward and convolutional neural networks, disentangles the hidden-layer representations, and leads to sparser representations. Our proposed method is a straightforward modification of classical dropout and does not require additional computational overhead.
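To illustrate the contrast the abstract draws, the sketch below compares standard dropout (independent Bernoulli masks) with a structured multiplicative noise scheme whose coordinates are correlated, with the correlation estimated from the current hidden activations. The function names, the Gaussian noise family, and the choice of covariance (activation correlations scaled by a variance parameter) are illustrative assumptions, not the paper's exact ASNI construction.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, p=0.5):
    # Standard dropout: each unit is kept independently with
    # probability 1 - p, scaled by 1 / (1 - p) (inverted dropout).
    keep = rng.random(shape) > p
    return keep / (1.0 - p)

def structured_noise(n_samples, cov):
    # Structured multiplicative noise: mean-one Gaussian noise whose
    # coordinates are correlated according to `cov`. A sketch of the
    # general idea; the paper's noise distribution may differ.
    d = cov.shape[0]
    return rng.multivariate_normal(np.ones(d), cov, size=n_samples)

# Toy hidden activations from a forward pass (batch of 128, 8 units).
h = rng.standard_normal((128, 8))

# "Adaptive" step (assumed form): estimate the noise covariance from
# the current activations, so the noise structure tracks the learned
# representation rather than being fixed in advance.
sigma2 = 0.25
cov = sigma2 * np.corrcoef(h, rowvar=False)

h_dropout = h * dropout_mask(h.shape)           # independent noise
h_asni = h * structured_noise(h.shape[0], cov)  # correlated noise
```

Both schemes are mean-preserving multiplicative perturbations of the activations, which is why ASNI can slot into training as a drop-in replacement for dropout.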
Document type :
Preprints, Working Papers, ...

Cited literature: 35 references
Contributor: Jean-Philippe Vert
Submitted on : Wednesday, February 20, 2019 - 1:14:51 AM
Last modification on : Wednesday, November 17, 2021 - 12:31:32 PM
Long-term archiving on: Tuesday, May 21, 2019 - 12:22:53 PM


Files produced by the author(s)


  • HAL Id: hal-02025929, version 1


Beyrem Khalfaoui, Joseph Boyd, Jean-Philippe Vert. Adaptive structured noise injection for shallow and deep neural networks. 2019. ⟨hal-02025929⟩


