Compiling discrete probabilistic programs for vectorized exact inference

Jingwen Pan, Amir Shaikhha

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract / Description of output

Probabilistic programming languages (PPLs) are essential for reasoning under uncertainty. Even though many real-world probabilistic programs involve discrete distributions, the state-of-the-art PPLs are suboptimal for a large class of tasks dealing with such distributions. In this paper, we propose BayesTensor, a tensor-based probabilistic programming framework. By generating tensor algebra code from probabilistic programs, BayesTensor takes advantage of the highly-tuned vectorized implementations of tensor processing frameworks. Our experiments show that BayesTensor outperforms the state-of-the-art frameworks in a variety of discrete probabilistic programs, inference over Bayesian Networks, and real-world probabilistic programs employed in data processing systems.
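To illustrate the general idea behind the abstract (not BayesTensor's actual implementation), exact inference over discrete distributions can be expressed as tensor contractions instead of enumerating program traces. The sketch below, with hypothetical distributions, marginalizes a tiny two-variable Bayesian network A → B as a matrix-vector contraction:

```python
# Minimal sketch (illustrative only, not the paper's framework): exact
# inference over a discrete Bayesian network A -> B via tensor algebra.
# The prior P(A) is a vector and the conditional P(B | A) is a matrix;
# the marginal P(B) is their contraction over A's axis.

def marginal(prior, cond):
    """P(B=b) = sum_a P(A=a) * P(B=b | A=a): a matrix-vector contraction."""
    n_b = len(cond[0])
    return [sum(prior[a] * cond[a][b] for a in range(len(prior)))
            for b in range(n_b)]

# Hypothetical example: P(A) = P(rain), P(B | A) = P(wet grass | rain).
p_a = [0.3, 0.7]                  # [P(rain), P(no rain)]
p_b_given_a = [[0.9, 0.1],        # P(wet | rain),   P(dry | rain)
               [0.2, 0.8]]        # P(wet | no rain), P(dry | no rain)

p_b = marginal(p_a, p_b_given_a)
# p_b = [0.3*0.9 + 0.7*0.2, 0.3*0.1 + 0.7*0.8] = [0.41, 0.59]
```

A tensor processing framework would run the same contraction (e.g. as a batched einsum) over all states at once, which is the source of the vectorization speedups the paper targets.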
Original language: English
Title of host publication: CC 2023 - Proceedings of the 32nd ACM SIGPLAN International Conference on Compiler Construction
Editors: Clark Verbrugge, Ondrej Lhotak, Xipeng Shen
Publisher: ACM
Pages: 13-24
Number of pages: 12
ISBN (Electronic): 9798400700880
DOIs
Publication status: Published - 17 Feb 2023
Event: 32nd ACM SIGPLAN International Conference on Compiler Construction - Montreal, Canada
Duration: 25 Feb 2023 - 26 Feb 2023

Conference

Conference: 32nd ACM SIGPLAN International Conference on Compiler Construction
Abbreviated title: CC 2023
Country/Territory: Canada
City: Montreal
Period: 25/02/23 - 26/02/23

Keywords / Materials (for Non-textual outputs)

  • cardinality estimation
  • discrete distribution
  • probabilistic programming
  • tensor algebra
