Abstract / Description of output
Density deconvolution is the task of estimating a probability density function given only noise-corrupted samples. We can fit a Gaussian mixture model to the underlying density by maximum likelihood if the noise is normally distributed, but we would like to exploit the superior density estimation performance of normalizing flows and allow for arbitrary noise distributions. Since both adjustments lead to an intractable likelihood, we resort to amortized variational inference. We demonstrate some of the problems involved in this approach; nevertheless, experiments on real data show that flows can already outperform Gaussian mixtures for density deconvolution.
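To make the objective described in the abstract concrete, below is a minimal sketch (not the paper's code) of amortized variational inference for density deconvolution: observations y = x + ε are modelled with a learnable prior p(x) standing in for a normalizing flow, a Gaussian noise model p(y | x), and an amortized posterior q(x | y), and training maximizes a Monte Carlo estimate of the ELBO. All class, parameter, and variable names here are illustrative assumptions; in particular, the toy affine "flow" and the assumed known noise level would be replaced by a full flow (e.g. RealNVP) and the problem's actual noise model in practice.

```python
# Minimal sketch of amortized VI for density deconvolution (illustrative only).
import torch
import torch.nn as nn
from torch.distributions import Normal, Independent

NOISE_STD = 0.5  # assumed known Gaussian noise level (illustrative)

class ToyFlow(nn.Module):
    """Stand-in for a normalizing-flow prior p(x): a learnable affine
    transform of a standard normal. A real model would use e.g. RealNVP."""
    def __init__(self, dim):
        super().__init__()
        self.shift = nn.Parameter(torch.zeros(dim))
        self.log_scale = nn.Parameter(torch.zeros(dim))

    def log_prob(self, x):
        # Change of variables: x = shift + exp(log_scale) * z, z ~ N(0, I).
        z = (x - self.shift) * torch.exp(-self.log_scale)
        base = Independent(Normal(torch.zeros_like(z), torch.ones_like(z)), 1)
        return base.log_prob(z) - self.log_scale.sum()

class AmortizedPosterior(nn.Module):
    """q(x | y): a diagonal Gaussian whose parameters are predicted from y."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * dim))

    def forward(self, y):
        mean, log_std = self.net(y).chunk(2, dim=-1)
        return Independent(Normal(mean, log_std.exp()), 1)

def elbo(flow, posterior, y, n_samples=4):
    """Monte Carlo ELBO: E_q[log p(y|x) + log p(x) - log q(x|y)]."""
    q = posterior(y)
    x = q.rsample((n_samples,))                                  # reparameterized
    log_lik = Independent(Normal(x, NOISE_STD), 1).log_prob(y)   # log p(y | x)
    return (log_lik + flow.log_prob(x) - q.log_prob(x)).mean()

# Usage: maximize the ELBO on noisy observations y (placeholder data here).
dim = 2
flow, posterior = ToyFlow(dim), AmortizedPosterior(dim)
opt = torch.optim.Adam(list(flow.parameters()) + list(posterior.parameters()),
                       lr=1e-3)
y = torch.randn(128, dim)  # placeholder noisy samples
opt.zero_grad()
loss = -elbo(flow, posterior, y)
loss.backward()
opt.step()
```

Because both the flow prior and the arbitrary noise model make the marginal likelihood p(y) intractable, the ELBO is the quantity being optimized here; the diagonal-Gaussian posterior is the simplest amortized choice and is one source of the difficulties the abstract alludes to.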
Original language | English |
---|---|
Number of pages | 8 |
Publication status | Published - 18 Jul 2020 |
Event | ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020 - Virtual workshop, 13 Jul 2020 → 13 Jul 2020 (https://invertibleworkshop.github.io/index.html) |
Workshop
Workshop | ICML Workshop on Invertible Neural Networks, Normalizing Flows, and Explicit Likelihood Models 2020 |
---|---|
Abbreviated title | INNF+ 2020 |
City | Virtual workshop |
Period | 13/07/20 → 13/07/20 |
Internet address | https://invertibleworkshop.github.io/index.html |