How Accurate Are Diffusion-Based Generative Models in Log-Concave Scenarios?

Original title: On diffusion-based generative models and their error bounds: The log-concave case with full convergence estimates

Authors: Stefano Bruno, Ying Zhang, Dong-Young Lim, Ömer Deniz Akyildiz, Sotirios Sabanis

The paper studies diffusion-based generative models and their reliability when the data distribution is strongly log-concave. The authors provide full theoretical guarantees for the convergence of these models under two key assumptions: the data distribution is strongly log-concave, and the functions used for score estimation are Lipschitz continuous. They illustrate their framework with the motivating example of sampling from a Gaussian distribution with unknown mean. A distinctive feature of the analysis is that it produces explicit estimates for the associated optimization problem (score approximation) and combines them with the corresponding sampling estimates. This combination yields what they report as the best known upper bounds, in terms of both dimension dependence and rate of convergence, on the Wasserstein-2 distance between the data distribution and the law of the samples produced by their algorithm. Their methodology accommodates a wide range of stochastic optimizers and relies on an $L^2$-accurate score estimation assumption, exploiting the known structure of the problem to obtain improved convergence rates. A toy sketch of the Gaussian example appears below.
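To make the Gaussian example concrete, here is a minimal NumPy sketch, not the authors' code: it assumes the standard Ornstein-Uhlenbeck forward process $dX_t = -X_t\,dt + \sqrt{2}\,dW_t$ common in score-based models, under which the marginal of $N(\mu, I)$ at time $t$ is $N(e^{-t}\mu, I)$ and the exact score is $-(x - e^{-t}\mu)$. The dimension, horizon, step sizes, and the sample-mean estimator `mu_hat` are all illustrative choices, with estimating $\mu$ standing in for the score-approximation step.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem instance: data distribution N(mu, I) with unknown mean mu.
d = 10
mu_true = rng.normal(size=d)
data = mu_true + rng.normal(size=(10_000, d))

# Score approximation as an explicit optimization problem: for the OU forward
# process, the exact score of the time-t marginal is -(x - exp(-t) * mu), so
# estimating mu (here by the sample mean, the least-squares solution) gives an
# L^2-accurate, Lipschitz-in-x score estimate.
mu_hat = data.mean(axis=0)

def score_estimate(t, x):
    """Estimated score of the OU marginal at forward time t."""
    return -(x - np.exp(-t) * mu_hat)

# Reverse-time SDE, dY_s = [Y_s + 2 * score(T - s, Y_s)] ds + sqrt(2) dB_s,
# discretized with Euler-Maruyama and initialized at N(0, I), which is close
# to the forward marginal p_T for large T.
T, n_steps, n_samples = 5.0, 500, 5_000
h = T / n_steps
y = rng.normal(size=(n_samples, d))
for k in range(n_steps):
    t = T - k * h  # forward time corresponding to reverse step k
    y = (y + h * (y + 2.0 * score_estimate(t, y))
         + np.sqrt(2.0 * h) * rng.normal(size=(n_samples, d)))

# Both laws are (approximately) Gaussian with identity covariance, so the
# Wasserstein-2 distance reduces to the Euclidean distance between the means.
print("mean error:", np.linalg.norm(y.mean(axis=0) - mu_true))
```

Replacing `mu_hat` with `mu_true` isolates the discretization and initialization errors; the gap between the two is the score-estimation error, and the paper's contribution is a bound that controls all three sources jointly in $W_2$.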

Original article: https://arxiv.org/abs/2311.13584