Closed-Form Results for Prior Constraints in Sum-Product Networks

Giannis Papantonis, Vaishak Belle

Research output: Contribution to journal › Article › peer-review

Abstract / Description of output

Incorporating constraints is a major concern in probabilistic machine learning. A wide variety of problems require predictions to be integrated with reasoning about constraints, from modeling routes on maps to predicting loan approvals. In the former, we may require the prediction model to respect the presence of physical paths between the nodes on the map, and in the latter, we may require that the prediction model respect fairness constraints that ensure that outcomes are not subject to bias. Broadly speaking, constraints may be probabilistic, logical, or causal, but the overarching challenge is to determine whether and how a model can be learnt that handles a declared constraint. To the best of our knowledge, treating this in a general way is largely an open problem. In this paper, we investigate how sum-product networks, a newly introduced and increasingly popular class of tractable probabilistic models, can be learnt under declared constraints. We obtain correctness results about the training of these models by establishing a relationship between probabilistic constraints and the model's parameters.
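
The paper's closed-form construction is not reproduced here, but the following minimal sketch conveys the flavour of relating a prior probabilistic constraint to a sum-node weight in closed form. The toy SPN structure, the leaf parameters p1, p2, q1, q2, and the target marginal are illustrative assumptions, not the authors' method: the point is only that, with leaf parameters fixed, a marginal constraint such as P(X = 1) = 0.7 is linear in the root sum weight and can be solved for directly.

```python
# A minimal, self-contained sketch (not the paper's construction) of how a prior
# probabilistic constraint can pin down a sum-node weight in closed form.
# Toy SPN over two binary variables X, Y: a root sum node mixing two product
# nodes of Bernoulli leaves, with leaf parameters fixed in advance (assumed values).

from dataclasses import dataclass


@dataclass
class Bernoulli:
    var: str   # variable name, e.g. "X"
    p: float   # probability that the variable equals 1

    def prob(self, assignment):
        return self.p if assignment[self.var] == 1 else 1.0 - self.p


@dataclass
class Product:
    children: list

    def prob(self, assignment):
        out = 1.0
        for c in self.children:
            out *= c.prob(assignment)
        return out


@dataclass
class Sum:
    weights: list
    children: list

    def prob(self, assignment):
        return sum(w * c.prob(assignment) for w, c in zip(self.weights, self.children))


# Fixed (assumed) leaf parameters for the two mixture components.
p1, p2 = 0.9, 0.2   # P(X = 1) under each product node
q1, q2 = 0.4, 0.6   # P(Y = 1) under each product node

# Prior constraint the model should respect: P(X = 1) = 0.7.
target = 0.7

# The SPN marginal is P(X = 1) = w1 * p1 + (1 - w1) * p2, linear in w1,
# so the constraint yields the sum-node weight in closed form.
w1 = (target - p2) / (p1 - p2)

spn = Sum(weights=[w1, 1.0 - w1],
          children=[Product([Bernoulli("X", p1), Bernoulli("Y", q1)]),
                    Product([Bernoulli("X", p2), Bernoulli("Y", q2)])])

# Check the marginal by summing out Y.
marginal_x1 = sum(spn.prob({"X": 1, "Y": y}) for y in (0, 1))
print(f"w1 = {w1:.3f}, P(X=1) = {marginal_x1:.3f}")   # expect P(X=1) = 0.700
```

This only illustrates the simplest case of a single marginal constraint at the root; the paper treats the general question of which declared constraints can be handled during SPN learning and with what correctness guarantees.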
Original language: English
Article number: 644062
Number of pages: 11
Journal: Frontiers in Artificial Intelligence
Volume: 4
DOIs
Publication status: Published - 8 Apr 2021

Keywords

  • sum-product network
  • constraints
  • tractable models
  • optimization
  • machine learning
