Abstract
We present a new class of adaptive stochastic optimization algorithms, which overcomes many of the known shortcomings of popular adaptive optimizers that are currently used for the fine-tuning of artificial neural networks (ANNs). Its underpinning theory relies on advances in Euler's polygonal approximations for stochastic differential equations (SDEs) with monotone coefficients. As a result, it inherits the stability properties of tamed algorithms, while it addresses other known issues, e.g. vanishing gradients in ANNs. In particular, we provide a nonasymptotic analysis and full theoretical guarantees for the convergence properties of an algorithm of this novel class, which we named TH$\varepsilon$O POULA (or, simply, TheoPouLa). Finally, several experiments are presented with different types of ANNs, which show the superior performance of TheoPouLa over many popular adaptive optimization algorithms.
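To illustrate the kind of update the abstract describes, the sketch below shows a generic tamed unadjusted Langevin step in NumPy. It is an assumption-laden illustration, not the exact TheoPouLa coefficients from the paper: the gradient is tamed componentwise so that very large gradients are damped (the stability property of tamed algorithms) while very small ones are boosted away from zero (addressing vanishing gradients), and Gaussian noise of variance $2\lambda/\beta$ is added, as in Langevin-based schemes.

```python
import numpy as np

def tamed_langevin_step(theta, grad, lam=1e-2, beta=1e10, eps=1e-8, rng=None):
    """One illustrative step of a tamed (polygonal) Langevin-type update.

    NOTE: a hedged sketch, not the paper's exact TheoPouLa formula.
    - Dividing by (1 + sqrt(lam) * |g|) caps the effective step for
      very large gradient components (taming / stability).
    - Multiplying by (1 + sqrt(lam) / (eps + |g|)) boosts very small
      components, mitigating vanishing gradients; eps avoids division
      by zero.
    - The additive noise has variance 2*lam/beta per component.
    """
    rng = np.random.default_rng() if rng is None else rng
    g = np.asarray(grad, dtype=float)
    tamed = g * (1.0 + np.sqrt(lam) / (eps + np.abs(g))) \
              / (1.0 + np.sqrt(lam) * np.abs(g))
    noise = np.sqrt(2.0 * lam / beta) * rng.standard_normal(g.shape)
    return theta - lam * tamed + noise
```

For example, iterating this step on the quadratic loss $f(\theta)=\|\theta\|^2$ (gradient $2\theta$) drives the iterates toward the origin even from a large starting point, and the update stays finite for arbitrarily large gradient components, which is the taming behaviour the abstract refers to.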
| Original language | English |
|---|---|
| Pages (from-to) | 1-52 |
| Journal | Journal of Machine Learning Research |
| Volume | 25 |
| Issue number | 53 |
| Publication status | Published - 28 Feb 2024 |
Keywords
- cs.LG
- math.OC
- math.PR
- stat.ML
Title: Polygonal Unadjusted Langevin Algorithms: Creating stable and efficient adaptive algorithms for neural networks

Projects
- 1 Finished
TRAIN@Ed: Transnational Research and Innovation Network at Edinburgh
Gorjanc, G., Bell, C., Duncan, A., Farrington, S., Florian, L., Forde, M., Hickey, J., Lacka, E., Ma, T., Mcneill, G., Medina-Lopez, E., Rosser, S., Rossi, R., Sabanis, S., Szpruch, L., Tenesa, A., Wake, D., Williamson, B. & Yang, Y.
EU government bodies, Non-EU industry, commerce and public corporations
1/11/19 → 19/04/23
Project: Research