Abstract
Recently, Zhang and Chen [25] proposed the Diffusion Exponential Integrator Sampler (DEIS) for fast generation of samples from diffusion models. It leverages the semi-linear structure of the probability flow ordinary differential equation (ODE) to greatly reduce integration error and improve generation quality at low numbers of function evaluations (NFEs). Key to this approach is the score function reparameterisation, which reduces the integration error incurred by using a fixed score function estimate over each integration step. The original authors use the default parameterisation of models trained for noise prediction: the score multiplied by the standard deviation of the conditional forward noising distribution. We find that although the mean absolute value of this score parameterisation is close to constant for a large portion of the reverse sampling process, it changes rapidly near the end of sampling. As a simple fix, we propose instead to reparameterise the score (at inference) by dividing it by the average absolute value of previous score estimates at that time step, collected from offline high-NFE generations. We find that our score normalisation (DEIS-SN) consistently improves FID compared to vanilla DEIS, improving FID at 10 NFEs from 6.44 to 5.57 on CIFAR-10 and from 5.9 to 4.95 on LSUN-Church (64×64). Our code is available at https://github.com/mtkresearch/Diffusion-DEIS-SN.
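For concreteness, here is a minimal sketch of the two-phase normalisation idea described in the abstract; it is not the authors' released code (see the repository above for that). It assumes a PyTorch score model, and the names `score_fn`, `trajectories`, and `stats` are hypothetical placeholders:

```python
import torch


@torch.no_grad()
def collect_score_stats(score_fn, trajectories, timesteps):
    """Offline pass: average |score| per timestep over stored high-NFE runs.

    `trajectories` is assumed to be a list of dicts mapping each timestep t
    to the state x_t visited during one high-NFE generation.
    """
    stats = {}
    for t in timesteps:
        vals = [score_fn(traj[t], t).abs().mean() for traj in trajectories]
        stats[t] = torch.stack(vals).mean().item()
    return stats


def normalised_score(score_fn, x_t, t, stats):
    """Inference-time reparameterisation: divide the raw score estimate by
    the average absolute score observed offline at this timestep, keeping
    the integrated quantity at a near-constant scale across timesteps."""
    return score_fn(x_t, t) / stats[t]
```

Under these assumptions, the precomputed `stats` replace the fixed noise-prediction scaling (standard deviation of the conditional forward noising distribution), which is what drifts rapidly at the end of sampling.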
| Original language | English |
| --- | --- |
| Pages | 1-9 |
| Number of pages | 9 |
| Publication status | Published - 15 Dec 2023 |
| Event | NeurIPS 2023 Workshop on Diffusion Models, New Orleans, United States. Duration: 15 Dec 2023 → 15 Dec 2023. https://diffusionworkshop.github.io/ |
Workshop
| Workshop | NeurIPS 2023 Workshop on Diffusion Models |
| --- | --- |
| Country/Territory | United States |
| City | New Orleans |
| Period | 15/12/23 → 15/12/23 |
| Internet address | https://diffusionworkshop.github.io/ |