On the Relation Between the Sharpest Directions of DNN Loss and the SGD Step Length

Stanislaw Jastrzębski, Zachary Kenton, Nicolas Ballas, Asja Fischer, Yoshua Bengio, Amos Storkey

Research output: Contribution to conference › Paper › peer-review

Abstract

Stochastic Gradient Descent (SGD) based training of neural networks with a large learning rate or a small batch size typically ends in well-generalizing, flat regions of the weight space, as indicated by small eigenvalues of the Hessian of the training loss. However, the curvature along the SGD trajectory is poorly understood. An empirical investigation shows that initially SGD visits increasingly sharp regions, reaching a maximum sharpness determined by both the learning rate and the batch size of SGD. When studying the SGD dynamics in relation to the sharpest directions in this initial phase, we find that the SGD step is large compared to the curvature and commonly fails to minimize the loss along the sharpest directions. Furthermore, using a reduced learning rate along these directions can improve training speed while leading to both sharper and better-generalizing solutions compared to vanilla SGD. In summary, our analysis of the dynamics of SGD in the subspace of the sharpest directions shows that they influence the regions that SGD steers to (where a larger learning rate or a smaller batch size results in wider regions visited), the overall training speed, and the generalization ability of the final model.
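To make the key intervention concrete, below is a minimal sketch of SGD with a reduced learning rate along the sharpest direction, run on a toy quadratic loss. This is an illustrative reconstruction, not the paper's implementation: the quadratic loss, the power-iteration eigenvector estimate, and the hyperparameters (lr, gamma, the noise scale) are all assumptions chosen to expose the mechanism. On a real network the explicit Hessian would be replaced by Hessian-vector products.

```python
import numpy as np

# Toy quadratic loss L(w) = 0.5 * w^T H w with an ill-conditioned Hessian;
# the direction with eigenvalue 100 plays the role of the "sharpest direction".
rng = np.random.default_rng(0)
H = np.diag([100.0, 1.0, 0.1])
w = rng.normal(size=3)

def stochastic_grad(w, noise=0.1):
    # Exact gradient H @ w plus Gaussian noise, mimicking mini-batch noise.
    return H @ w + noise * rng.normal(size=w.shape)

def top_eigenvector(H, n_iter=100):
    # Power iteration for the leading Hessian eigenvector. For a DNN this
    # would use Hessian-vector products rather than an explicit matrix.
    v = rng.normal(size=H.shape[0])
    for _ in range(n_iter):
        v = H @ v
        v /= np.linalg.norm(v)
    return v

lr = 0.019    # close to the stability limit 2 / lambda_max = 0.02 along the sharp direction
gamma = 0.1   # gamma < 1 shrinks the step along the sharpest direction only
v = top_eigenvector(H)

for step in range(201):
    g = stochastic_grad(w)
    g_sharp = (g @ v) * v                        # gradient component along the sharpest direction
    w -= lr * (gamma * g_sharp + (g - g_sharp))  # reduced step along v, full step elsewhere
    if step % 50 == 0:
        print(f"step {step:3d}  loss {0.5 * w @ H @ w:.4f}")
```

With gamma = 1 this reduces to vanilla SGD, which at this learning rate oscillates along the sharp coordinate (its per-step contraction factor is |1 - lr * 100| = 0.9); shrinking only that component damps it faster without slowing progress along the flatter directions, which mirrors the abstract's observation that the SGD step is large compared to the curvature in the sharpest directions.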
Original language: English
Number of pages: 19
Publication status: Published - 2019
Event: Seventh International Conference on Learning Representations - New Orleans, United States
Duration: 6 May 2019 – 9 May 2019
Internet address: https://iclr.cc/

Conference

Conference: Seventh International Conference on Learning Representations
Abbreviated title: ICLR 2019
Country/Territory: United States
City: New Orleans
Period: 6/05/19 – 9/05/19
Internet address: https://iclr.cc/
