Generative flow networks for discrete probabilistic modeling

Dinghuai Zhang*, Nikolay Malkin, Zhen Liu, Alexandra Volokhova, Aaron Courville, Yoshua Bengio

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present energy-based generative flow networks (EB-GFN), a novel probabilistic modeling algorithm for high-dimensional discrete data. Building upon the theory of generative flow networks (GFlowNets; Bengio et al., 2021b), we model the generation process by a stochastic data construction policy and thus amortize expensive MCMC exploration into a fixed number of actions sampled from a GFlowNet. We show how GFlowNets can approximately perform large-block Gibbs sampling to mix between modes. We propose a framework to jointly train a GFlowNet with an energy function, so that the GFlowNet learns to sample from the energy distribution, while the energy function is trained with an approximate MLE objective using negative samples from the GFlowNet. We demonstrate EB-GFN's effectiveness on various probabilistic modeling tasks. Code is publicly available at github.com/zdhNarsil/EB_GFN.
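The joint training scheme described in the abstract alternates two updates: the GFlowNet policy constructs discrete samples step by step and is trained to match the energy-defined distribution, while the energy model takes an approximate maximum-likelihood step using GFlowNet samples as negatives. The following is a minimal toy sketch of that structure, not the authors' implementation: the linear energy, the per-position Bernoulli policy table, and all hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 4  # dimension of the binary data (illustrative)

# Hypothetical linear energy model: E_theta(x) = -theta . x
theta = np.zeros(D)

def energy(x):
    return -x @ theta

# Toy GFlowNet forward policy: builds x one bit at a time, left to right.
# A free logit per position stands in for a neural policy network.
policy_logits = np.zeros(D)

def sample_trajectory():
    """Sample x bit by bit; return x and the log-probability of the path."""
    x = np.zeros(D)
    logp = 0.0
    for i in range(D):
        p1 = 1.0 / (1.0 + np.exp(-policy_logits[i]))
        bit = rng.random() < p1
        x[i] = float(bit)
        logp += np.log(p1 if bit else 1.0 - p1)
    return x, logp

def trajectory_balance_loss(logZ, logp_forward, x):
    # One common GFlowNet objective: (log Z + log P_F(tau) - log R(x))^2,
    # with reward R(x) = exp(-E(x)); the backward policy is deterministic
    # for this fixed left-to-right construction order.
    return (logZ + logp_forward + energy(x)) ** 2

# One alternating EB-GFN-style step for the energy (sketch):
data = rng.integers(0, 2, size=(16, D)).astype(float)   # placeholder dataset
negatives = np.stack([sample_trajectory()[0] for _ in range(16)])

# Approximate MLE gradient: grad log p(x) = -grad E(x) + E_model[grad E].
# With E = -theta.x this reduces to (data mean) - (negative-sample mean),
# the model expectation being estimated with GFlowNet samples.
lr = 0.1
theta += lr * (data.mean(axis=0) - negatives.mean(axis=0))
```

In the full method the GFlowNet policy would also be updated (e.g. by descending the trajectory-balance loss above) between energy steps, so that its samples track the current energy landscape.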
Original language: English
Title of host publication: Proceedings of the 39th International Conference on Machine Learning
Publisher: PMLR
Pages: 26412-26428
Number of pages: 17
Volume: 162
Publication status: Published - 23 Jul 2022
Event: 39th International Conference on Machine Learning - Baltimore, United States
Duration: 17 Jul 2022 - 23 Jul 2022
Conference number: 39
https://icml.cc/Conferences/2022

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
ISSN (Print): 2640-3498

Conference

Conference: 39th International Conference on Machine Learning
Abbreviated title: ICML 2022
Country/Territory: United States
City: Baltimore
Period: 17/07/22 - 23/07/22
Internet address: https://icml.cc/Conferences/2022
