BatchGFN: Generative flow networks for batch active learning

Shreshth A. Malik*, Salem Lahlou, Andrew Jesson, Moksh Jain, Nikolay Malkin, Tristan Deleu, Yoshua Bengio, Yarin Gal

*Corresponding author for this work

Research output: Contribution to conference › Poster › peer-review

Abstract

We introduce BatchGFN, a novel approach for pool-based active learning that uses generative flow networks to sample sets of data points proportional to a batch reward. With an appropriate reward function that quantifies the utility of acquiring a batch, such as the joint mutual information between the batch and the model parameters, BatchGFN constructs highly informative batches for active learning in a principled way. On toy regression problems, we show that our approach samples near-optimal-utility batches at inference time with a single forward pass per point in the batch. This alleviates the computational cost of batch-aware algorithms and removes the need for greedy approximations to find maximizers of the batch reward. We also present early results on amortizing training across acquisition steps, which will enable scaling to real-world tasks.
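As a concrete illustration, the sketch below shows one way the batch reward described in the abstract, the joint mutual information between a candidate batch's labels and the model parameters, could be estimated from Monte Carlo posterior samples (a BatchBALD-style estimator), together with a toy sequential sampler that acquires one point per policy forward pass. This is a hypothetical sketch, not the authors' implementation: it assumes a small discrete-label setting for tractability (the paper's experiments use regression, where predictive covariances would play the analogous role), and `policy_net`, `pool_features`, and all function names are illustrative.

```python
# Hypothetical sketch (not the authors' implementation) of a batch reward and a
# sequential batch sampler in the spirit of BatchGFN. Discrete labels are
# assumed so the joint mutual information is tractable.
import math
import torch


def joint_mutual_information(logits: torch.Tensor) -> torch.Tensor:
    """Monte Carlo estimate of I(y_1..y_B ; theta) for a candidate batch.

    logits: [K, B, C] -- K posterior samples, B batch points, C classes.
    Tractable only for small B, since the joint support has C**B entries.
    """
    K, B, C = logits.shape
    log_probs = logits.log_softmax(dim=-1)                 # [K, B, C]

    # E_theta[H(y_1..y_B | theta)]: labels are independent given parameters,
    # so the conditional joint entropy is the sum of per-point entropies.
    cond_entropy = -(log_probs.exp() * log_probs).sum(-1)  # [K, B]
    cond_entropy = cond_entropy.sum(1).mean()              # scalar

    # H(y_1..y_B): build joint log-probs over all C**B label configurations,
    # then marginalize over the K posterior samples.
    joint = log_probs[:, 0, :]                             # [K, C]
    for b in range(1, B):
        joint = (joint.unsqueeze(2)
                 + log_probs[:, b, :].unsqueeze(1)).reshape(K, -1)
    log_p = joint.logsumexp(0) - math.log(K)               # [C**B]
    joint_entropy = -(log_p.exp() * log_p).sum()

    return joint_entropy - cond_entropy


def sample_batch(policy_net, pool_features: torch.Tensor, batch_size: int):
    """Builds a batch sequentially, one policy forward pass per acquired point,
    mirroring the 'single forward pass per point in the batch' claim.

    The policy maps (pool features, indicator of points chosen so far) to
    logits over the pool. `policy_net` is an illustrative stand-in, not the
    paper's architecture.
    """
    n = pool_features.shape[0]
    state = torch.zeros(n)          # indicator vector of already-chosen points
    chosen = []
    for _ in range(batch_size):
        logits = policy_net(torch.cat([pool_features.flatten(), state]))
        logits = logits.masked_fill(state.bool(), float("-inf"))  # no repeats
        idx = torch.distributions.Categorical(logits=logits).sample()
        chosen.append(int(idx))
        state[idx] = 1.0
    return chosen
```

A GFlowNet trained so that terminal states (complete batches) are sampled with probability proportional to the batch reward, e.g. this mutual information or a monotone transform of it, would then amortize batch selection: after training, drawing an informative batch costs only the B policy passes above, rather than a greedy search over the pool.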
Original language: English
Pages: 1-10
Number of pages: 10
Publication status: Published - 28 Jul 2023
Event: Structured Probabilistic Inference and Generative Modeling Workshop (SPIGM@ICML 2023)
Duration: 28 Jul 2023

