Dropout and Dropconnect on a shallow neural network
New publications

Asymptotic convergence rate of Dropout on shallow linear neural networks

We have submitted Asymptotic convergence rate of Dropout on shallow linear neural networks, and it is currently under review. This is joint work with Albert Senen-Cerda. A preprint is available on arXiv.

Abstract

We analyze the convergence rate of gradient flows on objective functions induced by Dropout and Dropconnect when applied to shallow linear neural networks (NNs) – a setting that can also be viewed as matrix factorization with a particular regularizer. Dropout algorithms such as these are regularization techniques that use {0,1}-valued random variables to filter weights during training, in order to avoid co-adaptation of features. By leveraging a recent result on nonconvex optimization and carefully analyzing the set of minimizers as well as the Hessian of the loss function, we obtain (i) a local convergence proof of the gradient flow and (ii) a bound on the convergence rate that depends on the data, the dropout probability, and the width of the NN. Finally, we compare this theoretical bound to numerical simulations, which are in qualitative agreement with the bound and match it when starting sufficiently close to a minimizer.
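To make the setting concrete, here is a minimal sketch (not the paper's actual experiments) of how Dropout and Dropconnect filter a shallow linear NN y = U V x with {0,1}-valued random masks. All dimensions, names, and the keep-probability p are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shallow linear NN: y = U @ V @ x, i.e. a factorized linear map
# (equivalently, matrix factorization of the product U @ V).
d_in, width, d_out = 4, 8, 3
V = rng.normal(size=(width, d_in))   # first-layer weights
U = rng.normal(size=(d_out, width))  # second-layer weights
x = rng.normal(size=d_in)
p = 0.5  # probability of *keeping* a unit / weight (assumed value)

# Dropout: one {0,1}-valued mask filters whole hidden units.
unit_mask = rng.binomial(1, p, size=width)
y_dropout = U @ (unit_mask * (V @ x)) / p  # rescale so the mean is unchanged

# Dropconnect: independent {0,1}-valued masks filter individual weights.
mask_U = rng.binomial(1, p, size=U.shape)
mask_V = rng.binomial(1, p, size=V.shape)
y_dropconnect = (mask_U * U) @ ((mask_V * V) @ x) / p**2

# In expectation over the masks, both recover the unregularized output U @ V @ x;
# averaging the squared error over the mask distribution is what induces the
# regularized objective whose gradient flow the paper studies.
```

Training then minimizes the expected loss over both the data and these random masks, which is the mask-induced objective analyzed in the paper.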


Curious for more?

Head on over to My Articles for more of my work, and check out My Research for a peek into upcoming themes. You can also find out who is on our team right here: Academic Supervision.

Jaron
Jaron Sanders received M.Sc. degrees in Mathematics and Physics from the Eindhoven University of Technology, The Netherlands, in 2012, and a PhD degree in Mathematics from the same university in 2016. After obtaining his PhD, he worked as a post-doctoral researcher at the KTH Royal Institute of Technology in Stockholm, Sweden. He then worked as an assistant professor at the Delft University of Technology, and now works as an assistant professor at the Eindhoven University of Technology. His research interests are applied probability, queueing theory, stochastic optimization, stochastic networks, wireless networks, and interacting (particle) systems.
https://www.jaronsanders.nl