Abstract / Description of output
Weighted model counting (WMC) has emerged as the unifying inference mechanism across many (probabilistic) domains. Encoding an inference problem as an instance of WMC typically necessitates adding extra literals and clauses. This is partly because the predominant definition of WMC assigns weights to models based on weights on literals, which severely restricts the probability distributions that can be represented. We develop a measure-theoretic perspective on WMC and propose a way to encode conditional weights on literals analogously to conditional probabilities. This representation can be as succinct as standard WMC with weights on literals but can also expand as needed to represent probability distributions with less structure. To demonstrate the performance benefits of conditional weights over the addition of extra literals, we develop a new WMC encoding for Bayesian networks and adapt the state-of-the-art WMC algorithm ADDMC to the new format. Our experiments show that the new encoding significantly improves the performance of the algorithm on most benchmark instances.
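To make the standard literal-weight definition of WMC referenced in the abstract concrete, here is a minimal brute-force sketch (the example formula, weights, and function name are illustrative and not taken from the paper; real WMC solvers such as ADDMC avoid this exponential enumeration):

```python
from itertools import product

def wmc(clauses, weights, n_vars):
    """Brute-force weighted model count.

    clauses: CNF formula as a list of clauses; each clause is a list of
             non-zero ints (DIMACS-style: v means variable v is true,
             -v means variable v is false).
    weights: dict mapping each literal (v and -v) to its weight.
    n_vars:  number of propositional variables, numbered 1..n_vars.
    """
    total = 0.0
    for bits in product([False, True], repeat=n_vars):
        assign = {v: bits[v - 1] for v in range(1, n_vars + 1)}
        # A model must satisfy every clause (some literal true in each).
        if all(any(assign[abs(l)] == (l > 0) for l in c) for c in clauses):
            # The weight of a model is the product of its literal weights.
            w = 1.0
            for v in range(1, n_vars + 1):
                w *= weights[v] if assign[v] else weights[-v]
            total += w
    return total

# (x1 or x2) with Pr(x1) = 0.3 and Pr(x2) = 0.6 independent:
# WMC = 1 - 0.7 * 0.4, i.e. ~0.72 up to float rounding.
clauses = [[1, 2]]
weights = {1: 0.3, -1: 0.7, 2: 0.6, -2: 0.4}
print(wmc(clauses, weights, 2))
```

When the weights of each variable's two literals sum to one, as above, the count coincides with the probability that the formula holds under independent variables; the paper's conditional weights relax exactly this independence restriction.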
Original language | English |
---|---|
Title of host publication | Proceedings of the 37th Conference on Uncertainty in Artificial Intelligence (UAI 2021) |
Editors | Cassio de Campos, Marloes H. Maathuis |
Place of Publication | New York, United States |
Publisher | PMLR |
Pages | 386-396 |
Number of pages | 11 |
Volume | 161 |
Publication status | Published - 27 Jul 2021 |
Event | 37th Conference on Uncertainty in Artificial Intelligence - Online |
Duration | 27 Jul 2021 → 30 Jul 2021 |
Internet address | https://www.auai.org/uai2021/ |
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Publisher | PMLR |
Volume | 161 |
ISSN (Electronic) | 2640-3498 |
Conference
Conference | 37th Conference on Uncertainty in Artificial Intelligence |
---|---|
Abbreviated title | UAI 2021 |
Period | 27/07/21 → 30/07/21 |
Internet address | https://www.auai.org/uai2021/ |