Statistical applications of contrastive learning

Michael U Gutmann, Steven Kleinegesse, Benjamin Rhodes

Research output: Contribution to journal › Article › peer-review

Abstract

The likelihood function plays a crucial role in statistical inference and experimental design. However, it is computationally intractable for several important classes of statistical models, including energy-based models and simulator-based models. Contrastive learning is an intuitive and computationally feasible alternative to likelihood-based learning. Here we first provide an introduction to contrastive learning and then show how it can be used to derive methods for diverse statistical problems, namely parameter estimation for energy-based models, Bayesian inference for simulator-based models, and experimental design.
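
To illustrate the contrastive-learning idea for energy-based models, the sketch below estimates the parameters of a simple unnormalised Gaussian model by logistic regression between data and noise samples (the noise-contrastive estimation idea). It is a minimal, self-contained example under assumed settings (1-D Gaussian data, a wide Gaussian noise distribution, and the variable names used here), not the formulation or code from the paper.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # Synthetic "data" from an unknown 1-D Gaussian. The energy-based model is
    # log p(x; theta) = -0.5 * exp(log_prec) * (x - mu)**2 + c, where c is an
    # explicitly learned log-normaliser, so the likelihood is treated as intractable.
    x_data = rng.normal(loc=2.0, scale=1.5, size=2000)

    # Noise ("contrast") distribution with known density: a wide zero-mean Gaussian.
    noise_std = 4.0
    x_noise = rng.normal(loc=0.0, scale=noise_std, size=2000)

    def log_noise(x):
        return -0.5 * (x / noise_std) ** 2 - np.log(noise_std * np.sqrt(2.0 * np.pi))

    def log_model(x, theta):
        mu, log_prec, c = theta
        return -0.5 * np.exp(log_prec) * (x - mu) ** 2 + c

    def nce_loss(theta):
        # Logistic-regression objective: classify data (label 1) against noise
        # (label 0) using the log-ratio log p_model - log p_noise as the logit.
        h_data = log_model(x_data, theta) - log_noise(x_data)
        h_noise = log_model(x_noise, theta) - log_noise(x_noise)
        # -log sigmoid(h) = logaddexp(0, -h); average over data and noise samples.
        return np.mean(np.logaddexp(0.0, -h_data)) + np.mean(np.logaddexp(0.0, h_noise))

    res = minimize(nce_loss, x0=np.zeros(3), method="Nelder-Mead")
    mu_hat, log_prec_hat, c_hat = res.x
    print("estimated mean:", mu_hat)                       # close to 2.0
    print("estimated std:", np.exp(-0.5 * log_prec_hat))   # close to 1.5

Minimising this classification loss recovers both the model parameters and the normalising constant without ever evaluating the intractable likelihood, which is the sense in which contrastive learning sidesteps likelihood-based estimation.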
Original language: English
Pages (from-to): 277-301
Number of pages: 25
Journal: Behaviormetrika
Volume: 49
Issue number: 2
Early online date: 3 Jun 2022
DOIs
Publication status: Published - 1 Jul 2022

Keywords

  • Contrastive learning
  • energy-based models
  • simulator-based models
  • parameter estimation
  • Bayesian inference
  • Bayesian experimental design
