Direct Learning of Sparse Changes in Markov Networks by Density Ratio Estimation

Song Liu, John A. Quinn, Michael U. Gutmann, Taiji Suzuki, Masashi Sugiyama

Research output: Contribution to journal › Article › peer-review


We propose a new method for detecting changes in Markov network structure between two sets of samples. Instead of naively fitting two Markov network models separately to the two data sets and computing their difference, we directly learn the network structure change by estimating the ratio of the two Markov network models. This density-ratio formulation naturally allows us to introduce sparsity in the network structure change, which greatly enhances interpretability. Furthermore, computation of the normalization term, a critical bottleneck of the naive approach, is substantially mitigated. We also give the dual formulation of the optimization problem, which further reduces the computational cost for large-scale Markov networks. Through experiments, we demonstrate the usefulness of our method.
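The idea sketched in the abstract can be illustrated on a toy problem. Below is a minimal, hypothetical NumPy sketch (not the authors' implementation): two Gaussian Markov networks differ in a single edge, the log-density-ratio is modelled with pairwise features, and a KLIEP-style objective with an L1 penalty (via proximal gradient / soft-thresholding) is minimized. The normalization of the ratio model is handled empirically through the `log-mean-exp` term, so no partition function of either Markov network is ever computed.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Two Gaussian Markov networks that differ in the single edge (0, 1).
prec_p = np.eye(d)
prec_q = np.eye(d)
prec_q[0, 1] = prec_q[1, 0] = 0.4

n = 2000
xp = rng.multivariate_normal(np.zeros(d), np.linalg.inv(prec_p), size=n)
xq = rng.multivariate_normal(np.zeros(d), np.linalg.inv(prec_q), size=n)

# Pairwise features psi_ij(x) = x_i * x_j; the ratio model is
# r(x; theta) ∝ exp(sum_ij theta_ij * x_i * x_j).
pairs = [(i, j) for i in range(d) for j in range(i, d)]

def features(x):
    return np.stack([x[:, i] * x[:, j] for i, j in pairs], axis=1)

fp, fq = features(xp), features(xq)

# Proximal gradient descent on the KLIEP-style loss
#   J(theta) = -mean_p[f(x; theta)] + log mean_q[exp f(x; theta)],
# with an L1 penalty inducing sparsity in the estimated change.
lam, step = 0.05, 0.1
theta = np.zeros(len(pairs))
for _ in range(500):
    s = fq @ theta
    m = s.max()                       # shift for numerical stability
    w = np.exp(s - m)
    w /= w.sum()                      # softmax weights over denominator samples
    grad = -fp.mean(axis=0) + fq.T @ w
    theta -= step * grad
    theta = np.sign(theta) * np.maximum(np.abs(theta) - step * lam, 0.0)

# The coefficient for the changed edge (0, 1) should dominate;
# unchanged interactions are shrunk toward (often exactly) zero.
coef = {pairs[k]: theta[k] for k in range(len(pairs))}
print(coef[(0, 1)])
```

The true log-ratio here is 0.4·x₀x₁ (up to a constant), so under these toy assumptions the estimate for the (0, 1) feature should be clearly nonzero while the other coefficients are driven to zero by the soft-thresholding step.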
Original language: English
Pages (from-to): 1169-1197
Number of pages: 29
Journal: Neural Computation
Issue number: 6
Publication status: Published - 1 Mar 2014


