Block Neural Autoregressive Flow

Nicola De Cao, Wilker Aziz, Ivan Titov

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Normalising flows (NFs) map two density functions via a differentiable bijection whose Jacobian determinant can be computed efficiently. Recently, as an alternative to handcrafted bijections, Huang et al. (2018) proposed neural autoregressive flow (NAF) which is a universal approximator for density functions. Their flow is a neural network (NN) whose parameters are predicted by another NN. The latter grows quadratically with the size of the former and thus an efficient technique for parametrization is needed.
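For reference, the statement about the Jacobian determinant rests on the standard change-of-variables identity (general background, not specific to this paper): if $f$ is a differentiable bijection with $y = f(x)$, the two densities are related by

$$p_Y(y) = p_X(x)\,\left|\det \frac{\partial f(x)}{\partial x}\right|^{-1},$$

so evaluating the transformed density requires only the mapping and an efficiently computable Jacobian determinant.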

We propose block neural autoregressive flow (B-NAF), a much more compact universal approximator of density functions, where we model a bijection directly using a single feed-forward network. Invertibility is ensured by carefully designing each affine transformation with block matrices that make the flow autoregressive and (strictly) monotone. We compare B-NAF to NAF and other established flows on density estimation and approximate inference for latent variable models. Our proposed flow is competitive across datasets while using orders of magnitude fewer parameters.
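To make the block-matrix construction described in the abstract concrete, below is a minimal, illustrative PyTorch sketch of one masked block-affine layer of the kind the abstract describes; the class name MaskedBlockLinear and all sizes are placeholders chosen for illustration, not the authors' released code. Blocks above the block diagonal are masked to zero so the transformation stays autoregressive, and the block-diagonal blocks are made strictly positive (via exp) so that, combined with strictly increasing activations such as tanh, the flow is strictly monotone and hence invertible.

```python
import torch
import torch.nn as nn


class MaskedBlockLinear(nn.Module):
    """One affine layer of a block autoregressive flow (illustrative sketch).

    The weight matrix is organised into d x d blocks of shape (out_feat, in_feat).
    Blocks above the block diagonal are masked to zero (autoregressive structure);
    blocks on the block diagonal are constrained to be strictly positive via exp,
    making the map strictly monotone in each input dimension.
    """

    def __init__(self, d, in_feat, out_feat):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(d * out_feat, d * in_feat))
        self.bias = nn.Parameter(torch.zeros(d * out_feat))

        mask_diag = torch.zeros(d * out_feat, d * in_feat)
        mask_lower = torch.zeros(d * out_feat, d * in_feat)
        for i in range(d):
            for j in range(d):
                rows = slice(i * out_feat, (i + 1) * out_feat)
                cols = slice(j * in_feat, (j + 1) * in_feat)
                if i == j:
                    mask_diag[rows, cols] = 1.0   # block-diagonal blocks
                elif i > j:
                    mask_lower[rows, cols] = 1.0  # strictly lower blocks
        self.register_buffer("mask_diag", mask_diag)
        self.register_buffer("mask_lower", mask_lower)

    def forward(self, x):
        # Positive diagonal blocks, free lower blocks, zero upper blocks.
        w = torch.exp(self.weight) * self.mask_diag + self.weight * self.mask_lower
        return x @ w.t() + self.bias


if __name__ == "__main__":
    d, hidden = 3, 4
    layer_in = MaskedBlockLinear(d, in_feat=1, out_feat=hidden)   # R^d -> R^{d*hidden}
    layer_out = MaskedBlockLinear(d, in_feat=hidden, out_feat=1)  # R^{d*hidden} -> R^d
    x = torch.randn(8, d)
    y = layer_out(torch.tanh(layer_in(x)))
    print(y.shape)  # torch.Size([8, 3])
```

In the full model, several such layers are stacked with monotone activations between them, and the log-determinant of the Jacobian can be accumulated from the block-diagonal parts alone, since the Jacobian of the overall map is block lower triangular.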
Original language: English
Title of host publication: Proceedings of the Thirty-Fifth Conference on Uncertainty in Artificial Intelligence, UAI 2019
Subtitle of host publication: Tel Aviv, Israel, July 22-25, 2019
Place of publication: Tel Aviv, Israel
Number of pages: 13
Publication status: Published - 22 Jul 2019
Event: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 - Tel Aviv, Israel
Duration: 22 Jul 2019 - 25 Jul 2019
http://auai.org/uai2019/

Conference

Conference: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
Abbreviated title: UAI 2019
Country/Territory: Israel
City: Tel Aviv
Period: 22/07/19 - 25/07/19
Internet address: http://auai.org/uai2019/
