Ordering Dimensions with Nested Dropout Normalizing Flows

Artur Bekasov, Iain Murray

Research output: Contribution to conference › Paper › peer-review

Abstract / Description of output

The latent space of normalizing flows must be of the same dimensionality as their output space. This constraint presents a problem if we want to learn low-dimensional, semantically meaningful representations. Recent work has provided compact representations by fitting flows constrained to manifolds, but has not defined a density off those manifolds. In this work we consider flows with full support in data space, but with ordered latent variables. As in PCA, the leading latent dimensions define a sequence of manifolds that lie close to the data. We note a trade-off between the flow likelihood and the quality of the ordering, depending on the parameterization of the flow.
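
For intuition, here is a minimal sketch of how a nested-dropout-style ordering penalty might be combined with a flow's likelihood objective. Everything in it is an illustrative assumption rather than the paper's method: the toy `AffineFlow`, the choice to zero out trailing latent dimensions, the uniform truncation distribution, and the unweighted sum of the two loss terms are all hypothetical, and the paper's actual construction and parameterization may differ.

```python
import math
import torch
import torch.nn as nn

class AffineFlow(nn.Module):
    """Toy elementwise-affine flow: z = (x - t) * exp(-s), x = z * exp(s) + t.
    A stand-in for a real flow; used only to make the sketch runnable."""
    def __init__(self, dim):
        super().__init__()
        self.log_scale = nn.Parameter(torch.zeros(dim))
        self.shift = nn.Parameter(torch.zeros(dim))

    def forward(self, x):
        z = (x - self.shift) * torch.exp(-self.log_scale)
        log_det = -self.log_scale.sum()  # log|det dz/dx| of the elementwise map
        return z, log_det

    def inverse(self, z):
        return z * torch.exp(self.log_scale) + self.shift

def log_prob(flow, x):
    """Standard flow log-likelihood under a standard-normal base density."""
    z, log_det = flow(x)
    log_base = -0.5 * (z ** 2).sum(-1) - 0.5 * z.shape[-1] * math.log(2 * math.pi)
    return log_base + log_det

def nested_dropout_loss(flow, x, dim):
    """Sample a truncation index k and penalize reconstruction of x from
    only the leading k latent dimensions (trailing ones zeroed), so the
    leading dimensions are pushed to explain the data first, as in PCA."""
    k = torch.randint(1, dim + 1, ()).item()
    z, _ = flow(x)
    z_trunc = z.clone()
    z_trunc[..., k:] = 0.0                       # drop trailing latents
    x_rec = flow.inverse(z_trunc)
    recon = ((x - x_rec) ** 2).sum(-1).mean()
    nll = -log_prob(flow, x).mean()
    return nll + recon  # equal weighting is an assumption; the abstract
                        # notes a trade-off between likelihood and ordering

dim = 4
flow = AffineFlow(dim)
opt = torch.optim.Adam(flow.parameters(), lr=1e-2)
# Toy data with decreasing variance per dimension, so an ordering exists.
x = torch.randn(64, dim) * torch.tensor([3.0, 2.0, 1.0, 0.5])
for _ in range(100):
    opt.zero_grad()
    nested_dropout_loss(flow, x, dim).backward()
    opt.step()
```

Averaged over random truncation indices, the reconstruction term rewards latent orderings in which each prefix of dimensions defines a manifold close to the data, while the likelihood term keeps full support in data space.
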
Original language: English
Number of pages: 6
Publication status: Published - 18 Jul 2020
Event: ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020 - Virtual workshop
Duration: 13 Jul 2020 → 13 Jul 2020
https://invertibleworkshop.github.io/index.html

Workshop

Workshop: ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020
Abbreviated title: INNF+ 2020
City: Virtual workshop
Period: 13/07/20 → 13/07/20
Internet address: https://invertibleworkshop.github.io/index.html
