Neural Fine-Tuning Search for Few-Shot Learning

Panagiotis Eustratiadis, Łukasz Dudziak, Da Li, Timothy Hospedales

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

In few-shot recognition, a classifier that has been trained on one set of classes is required to rapidly adapt and generalize to a disjoint, novel set of classes. To that end, recent studies have shown the efficacy of fine-tuning with carefully crafted adaptation architectures. However, this raises the question: how can one design the optimal adaptation strategy? In this paper, we study this question through the lens of neural architecture search (NAS). Given a pre-trained neural network, our algorithm discovers the optimal arrangement of adapters, which layers to keep frozen, and which to fine-tune. We demonstrate the generality of our NAS method by applying it to both residual networks and vision transformers and report state-of-the-art performance on Meta-Dataset and Meta-Album.
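To illustrate the kind of per-layer search space the abstract describes, here is a minimal sketch in PyTorch: each backbone block is assigned one of three adaptation decisions (freeze, fine-tune, or attach an adapter), and a toy random search scores candidate policies. All names here (Adapter, apply_policy, the stand-in backbone, the placeholder score) are illustrative assumptions, not the authors' implementation, which would use a proper NAS controller and few-shot validation accuracy.

```python
import copy
import random

import torch
import torch.nn as nn

CHOICES = ("freeze", "finetune", "adapter")

class Adapter(nn.Module):
    """Lightweight bottleneck adapter applied residually after a block."""
    def __init__(self, dim: int, hidden: int = 16):
        super().__init__()
        self.down = nn.Linear(dim, hidden)
        self.up = nn.Linear(hidden, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))

def apply_policy(blocks: nn.ModuleList, policy, dim: int) -> nn.ModuleList:
    """Wrap each backbone block according to a sampled per-layer decision."""
    adapted = nn.ModuleList()
    for block, choice in zip(blocks, policy):
        if choice == "finetune":
            adapted.append(block)          # all parameters stay trainable
            continue
        for p in block.parameters():       # "freeze" and "adapter" both freeze the block
            p.requires_grad = False
        if choice == "adapter":
            adapted.append(nn.Sequential(block, Adapter(dim)))  # only the adapter trains
        else:
            adapted.append(block)
    return adapted

# Toy random search over the policy space. A NAS method would replace random
# sampling with a learned controller or a gradient-based relaxation, and the
# placeholder score with validation accuracy on held-out few-shot episodes.
backbone = nn.ModuleList(nn.Linear(64, 64) for _ in range(4))  # stand-in backbone
best_policy, best_score = None, float("-inf")
for _ in range(10):
    policy = [random.choice(CHOICES) for _ in backbone]
    candidate = apply_policy(copy.deepcopy(backbone), policy, dim=64)
    score = -sum(p.numel() for p in candidate.parameters() if p.requires_grad)
    if score > best_score:
        best_policy, best_score = policy, score
print("best policy:", best_policy)
```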
Original language: English
Title of host publication: The Twelfth International Conference on Learning Representations
Number of pages: 17
Publication status: Accepted/In press - 16 Jan 2024
Event: The Twelfth International Conference on Learning Representations - Vienna, Austria
Duration: 7 May 2024 – 11 May 2024
https://iclr.cc/

Conference

Conference: The Twelfth International Conference on Learning Representations
Country/Territory: Austria
City: Vienna
Period: 7/05/24 – 11/05/24
Internet address: https://iclr.cc/

