Online Hyperparameter Meta-Learning with Hypergradient Distillation

Hae Beom Lee, Hayeon Lee, JaeWoong Shin, Eunho Yang, Timothy Hospedales, Sung Ju Hwang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Many gradient-based meta-learning methods assume a set of parameters that do not participate in inner optimization, which can be considered hyperparameters. Although such hyperparameters can be optimized using existing gradient-based hyperparameter optimization (HO) methods, they suffer from the following issues: unrolled differentiation methods do not scale well to high-dimensional hyperparameters or long horizons, Implicit Function Theorem (IFT) based methods are restrictive for online optimization, and short-horizon approximations suffer from short-horizon bias. In this work, we propose a novel HO method that overcomes these limitations by approximating the second-order term with knowledge distillation. Specifically, we parameterize a single Jacobian-vector product (JVP) for each HO step and minimize its distance from the true second-order term. Our method allows online optimization and is also scalable in the hyperparameter dimension and the horizon length. We demonstrate the effectiveness of our method on three different meta-learning methods and two benchmark datasets.
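To make the idea in the abstract concrete, the sketch below is a minimal JAX illustration, not the authors' implementation: it assumes a hypothetical toy linear model, a scalar weight-decay hyperparameter `lam`, and a single inner SGD step. It computes the "true" second-order term (the JVP of the inner step with respect to `lam`, contracted with the validation gradient) and trains a cheap parameterized surrogate to match it with a squared-error distillation loss. All names, shapes, and the surrogate parameterization are assumptions for illustration only.

```python
# Minimal sketch of distilling the second-order hypergradient term (hypothetical setup).
import jax
import jax.numpy as jnp

def inner_loss(theta, lam, x, y):
    # Training loss with L2 regularization controlled by the hyperparameter lam.
    return jnp.mean((x @ theta - y) ** 2) + lam * jnp.sum(theta ** 2)

def val_loss(theta, x, y):
    # Validation loss; its gradient w.r.t. theta is the vector in the JVP.
    return jnp.mean((x @ theta - y) ** 2)

def inner_step(theta, lam, x, y, lr=0.1):
    # One step of inner optimization (SGD) on the training loss.
    return theta - lr * jax.grad(inner_loss)(theta, lam, x, y)

def true_second_order_term(theta, lam, x_tr, y_tr, x_val, y_val):
    # d(theta_next)/d(lam) via a forward-mode JVP through one inner step,
    # contracted with dL_val/d(theta_next): the expensive term to be distilled.
    theta_next, dtheta_dlam = jax.jvp(
        lambda l: inner_step(theta, l, x_tr, y_tr),
        (lam,), (jnp.ones_like(lam),))
    v = jax.grad(val_loss)(theta_next, x_val, y_val)
    return jnp.vdot(v, dtheta_dlam)

def surrogate(phi, theta, lam):
    # Hypothetical parameterization: a linear map of (theta, lam) to a scalar.
    feats = jnp.concatenate([theta, jnp.atleast_1d(lam)])
    return feats @ phi

def distill_loss(phi, theta, lam, x_tr, y_tr, x_val, y_val):
    # Knowledge-distillation objective: match the surrogate to the true term.
    target = true_second_order_term(theta, lam, x_tr, y_tr, x_val, y_val)
    return (surrogate(phi, theta, lam) - target) ** 2

# Example usage on random data (shapes are arbitrary).
k1, k2, k3, k4 = jax.random.split(jax.random.PRNGKey(0), 4)
x_tr, y_tr = jax.random.normal(k1, (32, 5)), jax.random.normal(k2, (32,))
x_val, y_val = jax.random.normal(k3, (16, 5)), jax.random.normal(k4, (16,))
theta, lam, phi = jnp.zeros(5), jnp.asarray(0.01), jnp.zeros(6)
phi_grad = jax.grad(distill_loss)(phi, theta, lam, x_tr, y_tr, x_val, y_val)
```

In an online HO loop, a surrogate of this kind would stand in for the expensive second-order term when forming the hypergradient at each step; the details of how the paper parameterizes and trains the JVP differ from this toy sketch.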
Original language: English
Title of host publication: International Conference on Learning Representations (ICLR 2022)
Number of pages: 16
Publication status: Published - 25 Apr 2022
Event: Tenth International Conference on Learning Representations 2022 - Virtual Conference
Duration: 25 Apr 2022 - 29 Apr 2022
Conference number: 10
https://iclr.cc/

Conference

Conference: Tenth International Conference on Learning Representations 2022
Abbreviated title: ICLR 2022
Period: 25/04/22 - 29/04/22
Internet address: https://iclr.cc/

Keywords

  • Hyperparameter Optimization
  • Meta-learning
