Abstract
Gradient-based meta-learning and hyperparameter optimization have seen significant progress recently, enabling practical end-to-end training of neural networks together with many hyperparameters. Nevertheless, existing approaches are relatively expensive, as they need to compute second-order derivatives and store a longer computational graph. This cost prevents scaling them to larger network architectures. We present EvoGrad, a new approach to meta-learning that draws upon evolutionary techniques to compute hypergradients more efficiently. EvoGrad estimates hypergradients with respect to hyperparameters without calculating second-order gradients or storing a longer computational graph, leading to significant improvements in efficiency. We evaluate EvoGrad on two substantial recent meta-learning applications, namely cross-domain few-shot learning with feature-wise transformations and noisy label learning with MetaWeightNet. The results show that EvoGrad significantly improves efficiency and enables scaling meta-learning to bigger CNN architectures, for example from ResNet18 to ResNet34.
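To make the core idea in the abstract concrete, below is a minimal PyTorch sketch of a first-order, evolution-style hypergradient estimate: perturb the model weights, weight the perturbed copies by their hyperparameter-dependent training losses, and backpropagate the validation loss of the combined model to the hyperparameters only. The callables `train_loss_fn`/`val_loss_fn`, the softmax combination, and the perturbation scale `sigma` are illustrative assumptions based on the abstract, not the paper's exact procedure.

```python
import torch

def evograd_hypergradient_sketch(model_params, hparams, train_loss_fn, val_loss_fn,
                                 n_samples=2, sigma=0.01):
    """Sketch: estimate d(val loss)/d(hparams) using only first-order gradients.

    model_params: flat tensor of model weights (illustrative simplification).
    hparams: hyperparameter tensor with requires_grad=True.
    train_loss_fn(params, hparams) and val_loss_fn(params): assumed callables.
    """
    # Evolutionary step: sample a few random perturbations of the current weights.
    perturbed = [model_params + sigma * torch.randn_like(model_params)
                 for _ in range(n_samples)]

    # Evaluate the hyperparameter-dependent training loss on each perturbed copy.
    losses = torch.stack([train_loss_fn(p, hparams) for p in perturbed])

    # Softmax over negative losses: lower-loss copies receive larger weights.
    weights = torch.softmax(-losses, dim=0)

    # Combine the perturbed copies into a one-step updated model.
    # The update depends on hparams only through `weights`, so the backward
    # pass below never needs second-order derivatives of the model.
    updated = sum(w * p for w, p in zip(weights, perturbed))

    # Validation loss of the updated model; backpropagate to the hyperparameters.
    val_loss = val_loss_fn(updated)
    hypergrad, = torch.autograd.grad(val_loss, hparams)
    return hypergrad
```

In this sketch the computational graph only spans the perturbation, weighting, and combination steps rather than an unrolled inner-loop update, which is where the memory and compute savings over second-order hypergradient methods come from.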
Original language | English |
---|---|
Title of host publication | 35th Conference on Neural Information Processing Systems (NeurIPS 2021) |
Publisher | Neural Information Processing Systems |
Number of pages | 12 |
Publication status | Published - 6 Dec 2021 |
Event | 35th Conference on Neural Information Processing Systems - Virtual |
Duration | 6 Dec 2021 → 14 Dec 2021 |
Publication series
Name | Advances in Neural Information Processing Systems |
---|---|
ISSN (Print) | 1049-5258 |
Conference
Conference | 35th Conference on Neural Information Processing Systems |
---|---|
Abbreviated title | NeurIPS 2021 |
Period | 6/12/21 → 14/12/21 |
Internet address | https://nips.cc/ |