Density Deconvolution with Normalizing Flows

Tim Dockhorn, James A. Ritchie, Yaoliang Yu, Iain Murray

Research output: Contribution to conference › Paper › peer-review

Abstract / Description of output

Density deconvolution is the task of estimating a probability density function given only noise-corrupted samples. We can fit a Gaussian mixture model to the underlying density by maximum likelihood if the noise is normally distributed, but we would like to exploit the superior density estimation performance of normalizing flows and allow for arbitrary noise distributions. Since both adjustments lead to an intractable likelihood, we resort to amortized variational inference. We demonstrate some of the difficulties that arise with this approach; nevertheless, experiments on real data show that flows can already out-perform Gaussian mixtures for density deconvolution.
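The intractable likelihood mentioned above arises because the observed density is a convolution, p(y) = ∫ p(y|x) p(x) dx, so amortized variational inference instead maximises an evidence lower bound (ELBO), E_q[log p(y|x) + log p(x) − log q(x|y)] ≤ log p(y). The following is a minimal illustrative sketch of this bound on a 1-D toy problem, not the authors' implementation: a fixed Gaussian stands in for the flow prior (a flow would likewise supply a tractable log-density), and the noise is Gaussian with known scale, so the bound can be checked against the exact marginal likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_normal(z, mu, sigma):
    """Log-density of N(z; mu, sigma^2)."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - 0.5 * ((z - mu) / sigma) ** 2

# Toy deconvolution model: latent x ~ p(x), observed y = x + eps.
noise_sigma = 0.5                   # assumed known Gaussian noise scale
prior_mu, prior_sigma = 0.0, 1.0    # Gaussian stand-in for the flow prior p(x)

def elbo(y, q_mu, q_sigma, n_samples=1000):
    """Monte Carlo estimate of the deconvolution ELBO:
    E_{q(x|y)}[ log p(y|x) + log p(x) - log q(x|y) ] <= log p(y)."""
    x = q_mu + q_sigma * rng.standard_normal(n_samples)  # reparameterised samples
    log_lik = log_normal(y, x, noise_sigma)              # log p(y|x)
    log_prior = log_normal(x, prior_mu, prior_sigma)     # log p(x)
    log_q = log_normal(x, q_mu, q_sigma)                 # log q(x|y)
    return np.mean(log_lik + log_prior - log_q)

y = 0.3
# In this all-Gaussian toy case the exact posterior p(x|y) is Gaussian, so
# setting q(x|y) to it makes the bound tight at log p(y) = log N(y; 0, 1.25).
post_var = 1.0 / (1.0 / prior_sigma**2 + 1.0 / noise_sigma**2)
q_mu_opt = post_var * (y / noise_sigma**2)
est = elbo(y, q_mu_opt, np.sqrt(post_var), n_samples=10000)
exact = log_normal(y, 0.0, np.sqrt(prior_sigma**2 + noise_sigma**2))
print(est, exact)
```

With a flow prior or non-Gaussian noise no closed-form posterior exists, which is exactly why the abstract's method learns an amortized q(x|y) by gradient ascent on this bound instead.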
Original language: English
Number of pages: 8
Publication status: Published - 18 Jul 2020
Event: ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020 - Virtual workshop
Duration: 13 Jul 2020 - 13 Jul 2020
https://invertibleworkshop.github.io/index.html

Workshop

Workshop: ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020
Abbreviated title: INNF+ 2020
City: Virtual workshop
Period: 13/07/20 - 13/07/20
Internet address: https://invertibleworkshop.github.io/index.html
