Distilling Representations from GAN Generator via Squeeze and Span

Yu Yang*, Xiaotian Cheng, Chang Liu, Hakan Bilen, Xiangyang Ji

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

In recent years, generative adversarial networks (GANs) have been an actively studied topic and shown to successfully produce high-quality realistic images in various domains. The controllable synthesis ability of GAN generators suggests that they maintain informative, disentangled, and explainable image representations, but leveraging and transferring their representations to downstream tasks remains largely unexplored. In this paper, we propose to distill knowledge from GAN generators by squeezing and spanning their representations. We squeeze the generator features into representations that are invariant to semantic-preserving transformations through a network before they are distilled into the student network. We span the distilled representation from the synthetic domain to the real domain by also using real training data, which remedies the mode collapse of GANs and boosts the student network's performance in the real domain. Experiments demonstrate the efficacy of our method and its significance for self-supervised representation learning. Code will be made public.
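The abstract describes two loss components: a "squeeze" term that distills augmentation-invariant generator features into a student on synthetic images, and a "span" term that also trains the student on real images to cover modes the GAN misses. A minimal PyTorch sketch of that two-term objective follows; the linear stand-ins for the generator, squeeze network, and student, the noise-based augmentation, and the cosine-distance losses are all illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy dimensions (assumed for illustration only).
z_dim, feat_dim, img_dim, rep_dim = 8, 32, 48, 16

# Frozen stand-ins for a pretrained GAN: intermediate features h(z) and images G(z).
gen_feature = nn.Linear(z_dim, feat_dim)
gen_decode = nn.Linear(feat_dim, img_dim)

# Learnable modules: the squeeze network and the student encoder.
squeezer = nn.Linear(feat_dim, rep_dim)
student = nn.Linear(img_dim, rep_dim)

def augment(x):
    # Placeholder for a semantic-preserving transformation (additive noise here).
    return x + 0.05 * torch.randn_like(x)

z = torch.randn(4, z_dim)
with torch.no_grad():  # generator stays frozen
    feats = gen_feature(z)
    fake_imgs = gen_decode(feats)

# Squeeze: distill the squeezed generator features into the student's
# representation of an augmented synthetic image (cosine distance).
target = F.normalize(squeezer(feats), dim=1)
pred_fake = F.normalize(student(augment(fake_imgs)), dim=1)
loss_squeeze = (2 - 2 * (pred_fake * target).sum(dim=1)).mean()

# Span: enforce augmentation consistency on real images so the student's
# representation extends to the real domain, not only the GAN's modes.
real_imgs = torch.randn(4, img_dim)
v1 = F.normalize(student(augment(real_imgs)), dim=1)
v2 = F.normalize(student(augment(real_imgs)), dim=1)
loss_span = (2 - 2 * (v1 * v2).sum(dim=1)).mean()

loss = loss_squeeze + loss_span
loss.backward()
```

In practice the two terms would be weighted and optimized over minibatches; the sketch only shows how gradients from both synthetic and real data reach the student.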
Original language: English
Title of host publication: Neural Information Processing Systems 2022 (NeurIPS)
Number of pages: 14
Publication status: Accepted/In press - 14 Sep 2022
Event: The 36th Conference on Neural Information Processing Systems, 2022 - New Orleans, United States
Duration: 28 Nov 2022 - 9 Dec 2022
Conference number: 36
https://neurips.cc/Conferences/2022

Conference

Conference: The 36th Conference on Neural Information Processing Systems, 2022
Abbreviated title: NeurIPS 2022
Country/Territory: United States
City: New Orleans
Period: 28/11/22 - 9/12/22
Internet address: https://neurips.cc/Conferences/2022
