Learning Bayesian Random Cutset Forests

Nicola Di Mauro, Antonio Vergari, Teresa M. A. Basile

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

Abstract / Description of output

In the Probabilistic Graphical Model (PGM) community there is interest in tractable models, i.e., those that can guarantee exact inference, even at the price of expressiveness. Structure learning algorithms are appealing tools for automatically inferring both these architectures and their parameters from data. Even when the resulting models are efficient at inference time, learning them can be very slow in practice. Here we focus on Cutset Networks (CNets), a recently introduced tractable PGM representing weighted probabilistic model trees with tree-structured models as leaves. CNets have been shown to be easy to learn, and yet fairly accurate. We propose a learning algorithm that aims to improve their average test log-likelihood while preserving efficiency during learning by adopting a random-forest approach. We combine multiple CNets, learned in a generative Bayesian framework, into a generative mixture model. A thorough empirical comparison on real-world datasets, against the original learning algorithms extended to our ensembling approach, demonstrates the validity of our approach.
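The ensembling idea in the abstract can be sketched in a few lines: each learned CNet is a tractable density, and the forest is a weighted mixture of them, so the test log-likelihood of the mixture is a log-sum-exp over component log-likelihoods. The sketch below is illustrative only (not the authors' code); the component models are stand-ins that simply report per-sample log-likelihoods, and the uniform weights mimic a bagging-style ensemble.

```python
import math

def mixture_log_likelihood(component_loglikes, weights):
    """Average per-sample log-likelihood of a weighted mixture.

    component_loglikes: list of K lists, each holding the per-sample
        log-likelihoods assigned by one component model (e.g. one CNet).
    weights: K mixture weights summing to 1 (uniform for a plain forest).
    """
    n = len(component_loglikes[0])
    total = 0.0
    for i in range(n):
        # log p(x_i) = log sum_k w_k * p_k(x_i), computed with the
        # log-sum-exp trick for numerical stability.
        terms = [math.log(w) + ll[i]
                 for w, ll in zip(weights, component_loglikes)]
        m = max(terms)
        total += m + math.log(sum(math.exp(t - m) for t in terms))
    return total / n

# Toy example: three hypothetical components scored on five test samples.
lls = [[-2.0, -1.5, -3.0, -2.2, -1.8],
       [-2.1, -1.4, -2.9, -2.0, -1.9],
       [-1.9, -1.6, -3.1, -2.3, -1.7]]
w = [1 / 3, 1 / 3, 1 / 3]
print(mixture_log_likelihood(lls, w))
```

By Jensen's inequality the mixture's per-sample log-likelihood is never below the weighted average of the component log-likelihoods, which is one reason such generative ensembles tend to improve average test log-likelihood over a single model.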
Original language: English
Title of host publication: Foundations of Intelligent Systems
Subtitle of host publication: 22nd International Symposium, ISMIS 2015, Lyon, France, October 21-23, 2015, Proceedings
Editors: Floriana Esposito, Olivier Pivert, Mohand-Said Hacid, Zbigniew W. Rás, Stefano Ferilli
Place of Publication: Cham
Publisher: Springer International Publishing Switzerland
Number of pages: 11
ISBN (Electronic): 978-3-319-25252-0
ISBN (Print): 978-3-319-25251-3
Publication status: Published - 30 Dec 2015
Event: 22nd International Symposium on Methodologies for Intelligent Systems - Lyon, France
Duration: 21 Oct 2015 - 23 Oct 2015
Conference number: 22

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer, Cham
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Symposium: 22nd International Symposium on Methodologies for Intelligent Systems
Abbreviated title: ISMIS 2015

Keywords / Materials (for Non-textual outputs)

  • random forest
  • expectation maximization
  • frequent itemset
  • leaf tree
  • real world dataset


