Simplified Probability Models for Generative Tasks: a Rate-Distortion Approach

Gustav Eje Henter, W. Bastiaan Kleijn

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We consider using sparse simplifications to denoise probabilistic sequence models for generative tasks such as speech synthesis. Our proposal is to find the least random model that remains close to the original one according to a KL-divergence constraint, a technique we call minimum entropy rate simplification (MERS). This produces a representation-independent framework for trading off simplicity and divergence, similar to rate-distortion theory. Importantly, MERS uses the cleaned model rather than the original one for the underlying probabilities in the KL-divergence, effectively reversing the conventional argument order. This promotes rather than penalizes sparsity, suppressing uncommon outcomes likely to be errors. We write down the MERS equations for Markov chains, and present an iterative solution procedure based on the Blahut-Arimoto algorithm and a bigram matrix Markov chain representation. We apply the procedure to a music-based Markov grammar, and compare the results to a simplistic thresholding scheme.
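As a minimal illustration (not the paper's algorithm) of why the reversed KL argument order promotes sparsity, consider a toy categorical distribution with one rare outcome presumed to be an error. The sparse simplification `q` below is a hypothetical example, not taken from the paper:

```python
import math

def kl(q, p):
    """KL divergence D(q || p); terms with q_i = 0 contribute zero."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p) if qi > 0)

def entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Noisy model: two common outcomes plus a rare one, presumed an error.
p = [0.55, 0.40, 0.05]

# Sparse simplification: drop the rare outcome and renormalize.
q = [0.55 / 0.95, 0.40 / 0.95, 0.0]

# With the cleaned model as the first KL argument, D(q || p) stays finite
# even though q has a zero entry, so removing rare outcomes is cheap...
cost = kl(q, p)

# ...and the simplified model is indeed less random (lower entropy).
assert entropy(q) < entropy(p)

# The conventional order D(p || q) would be infinite here, since p puts
# probability mass where q does not -- that order penalizes sparsity.
print(f"D(q||p) = {cost:.4f}, H(p) = {entropy(p):.4f}, H(q) = {entropy(q):.4f}")
```

The same trade-off between entropy and divergence is what MERS optimizes over full Markov-chain transition matrices rather than a single categorical distribution.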
Original language: English
Title of host publication: Proc. EUSIPCO
Place of publication: Aalborg, Denmark
Number of pages: 5
Publication status: Published - 1 Aug 2010
