Accelerating Self-Supervised Learning via Efficient Training Strategies

Mustafa Taha Kocyigit*, Timothy M Hospedales, Hakan Bilen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Recently, the focus of the computer vision community has shifted from expensive supervised learning towards self-supervised learning of visual representations. While the performance gap between supervised and self-supervised learning has been narrowing, the training time of self-supervised deep networks remains an order of magnitude longer than that of their supervised counterparts, which hinders progress, imposes a carbon cost, and limits societal benefits to institutions with substantial resources. Motivated by these issues, this paper investigates reducing the training time of recent self-supervised methods via several model-agnostic strategies that have not previously been applied to this problem. In particular, we study three strategies: an extendable cyclic learning rate schedule, a matched progressive schedule for augmentation magnitude and image resolution, and a hard positive mining strategy based on augmentation difficulty. We show that the three methods combined yield up to a 2.7× speed-up in the training time of several self-supervised methods while retaining performance comparable to the standard self-supervised learning setting.
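To illustrate the first of the three strategies, the sketch below shows one common way to implement an extendable cyclic learning rate: cosine annealing with periodic restarts, where training can be lengthened by simply appending further cycles. This is a minimal, generic sketch with assumed hyperparameters (`base_lr`, `min_lr`, `cycle_length`), not the authors' exact formulation from the paper.

```python
import math

def cyclic_lr(step, cycle_length, base_lr=0.1, min_lr=0.001):
    """Cosine-annealed cyclic learning rate with restarts.

    The rate starts at base_lr at the beginning of each cycle and decays
    towards min_lr; because each cycle is self-contained, training can be
    extended by running additional cycles without re-planning the schedule.
    NOTE: hyperparameter values here are illustrative assumptions.
    """
    pos = (step % cycle_length) / cycle_length  # position within cycle, in [0, 1)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * pos))

# Three cycles of 100 steps each; a fourth cycle could be appended unchanged.
schedule = [cyclic_lr(s, cycle_length=100) for s in range(300)]
```

In a training loop this function would set the optimizer's learning rate once per step; the restart at each cycle boundary gives the network a fresh high-learning-rate phase, which is what makes the schedule extendable.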
Original language: English
Title of host publication: Proceedings of the IEEE Winter Conference on Applications of Computer Vision 2023 (WACV)
Number of pages: 11
ISBN (Electronic): 9781665493468
ISBN (Print): 9781665493475
Publication status: Published - 6 Feb 2023
Event: IEEE/CVF Winter Conference on Applications of Computer Vision, 2023 - Waikoloa, United States
Duration: 3 Jan 2023 - 7 Jan 2023

Publication series

Name: IEEE Workshop on Applications of Computer Vision (WACV)
ISSN (Print): 2472-6737
ISSN (Electronic): 2642-9381


Conference: IEEE/CVF Winter Conference on Applications of Computer Vision, 2023
Abbreviated title: WACV 2023
Country/Territory: United States

