Markov Chain Truncation for Doubly-Intractable Inference

Colin Wei, Iain Murray

Research output: Contribution to journal › Article › peer-review


Computing partition functions, the normalizing constants of probability distributions, is often hard. Variants of importance sampling give unbiased estimates of a normalizer Z; however, unbiased estimates of the reciprocal 1/Z are harder to obtain. Unbiased estimates of 1/Z allow Markov chain Monte Carlo sampling of "doubly-intractable" distributions, such as the parameter posterior for Markov Random Fields or Exponential Random Graphs. We demonstrate how to construct unbiased estimates for 1/Z given access to black-box importance sampling estimators for Z. We adapt recent work on random series truncation and Markov chain coupling, producing estimators with lower variance and a higher percentage of positive estimates than before. Our debiasing algorithms are simple to implement, and have some theoretical and empirical advantages over existing methods.
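The abstract mentions random series truncation for debiasing. As a rough illustration only (not the authors' algorithm), the classic single-term Russian-roulette estimator debiases the geometric series 1/Z = (1/c) Σₖ (1 − Z/c)ᵏ, which converges when 0 < Z < c: truncate the series at a random index K and reweight the surviving term by 1/P(K = k). A minimal sketch, assuming access to an unbiased black-box estimator `z_hat()` of Z and a known upper bound `c` on the estimates:

```python
import random

def reciprocal_estimate(z_hat, c, p=0.5, rng=random):
    """Single-term Russian-roulette estimate of 1/Z (illustrative sketch).

    Uses the geometric series 1/Z = (1/c) * sum_k (1 - Z/c)^k, valid
    for 0 < Z < c. Each term (1 - Z/c)^k is estimated without bias by
    a product of k independent draws of (1 - z_hat()/c); the series is
    truncated at K ~ Geometric(p) and the surviving term is reweighted
    by 1 / P(K = k), which keeps the overall estimate unbiased.
    """
    # Sample the truncation index K on {0, 1, 2, ...}.
    k = 0
    while rng.random() > p:
        k += 1
    # Unbiased estimate of the k-th series term via independent draws.
    term = 1.0
    for _ in range(k):
        term *= 1.0 - z_hat() / c
    # Reweight by P(K = k) = (1 - p)^k * p.
    weight = (1.0 - p) ** k * p
    return term / (c * weight)
```

Averaging many such draws gives an unbiased estimate of 1/Z, though the variance (and the chance of negative estimates) depends strongly on how tight the bound c is and on the noise in z_hat, which is exactly the kind of issue the paper's estimators address.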
Original language: English
Pages (from-to): 776-784
Number of pages: 9
Journal: Journal of Machine Learning Research: Workshop and Conference Proceedings
Publication status: Published - 15 Apr 2017


