Can you use Langevin dynamics to sample while doing predictive-coding inference?

Original title: Sample as You Infer: Predictive Coding With Langevin Dynamics

Authors: Umais Zahid, Qinghai Guo, Zafeirios Fountas

In this article, the authors propose a new algorithm for parameter learning in deep generative models. Drawing inspiration from computational neuroscience, they modify the predictive coding framework to improve on the standard variational auto-encoder (VAE) training method. By injecting Gaussian noise into the inference process, they turn inference into overdamped Langevin sampling of the latent posterior, which allows optimization with respect to a tight evidence lower bound (ELBO). To further improve the approach, they use an encoder network to provide a warm start for the Langevin sampler. They also introduce a lightweight, cheaply computable form of preconditioning to increase robustness and reduce sensitivity to curvature. Comparing their method to VAEs on various metrics, the authors find that it matches or outperforms the VAE baseline while requiring fewer training iterations.
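The core noise-injection idea can be sketched on a toy linear-Gaussian model. This is an illustrative stand-in, not the paper's deep architecture: the model, step size `eta`, iteration counts, and variable names below are all assumptions. Gradient ascent on the log-joint driven by precision-weighted prediction errors, plus injected Gaussian noise, yields an overdamped Langevin chain whose iterates sample the latent posterior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear-Gaussian generative model: x = W z + noise, standard normal
# prior on z. (Stand-in for the paper's deep generative model.)
d_z, d_x = 2, 5
W = rng.normal(size=(d_x, d_z))
sigma_x = 0.1                        # observation noise std (assumed)
z_true = rng.normal(size=d_z)
x = W @ z_true + sigma_x * rng.normal(size=d_x)

def grad_log_joint(z):
    """Gradient of log p(x, z): prediction-error term plus prior term."""
    err = (x - W @ z) / sigma_x**2   # precision-weighted prediction error
    return W.T @ err - z             # likelihood gradient + N(0, I) prior gradient

# Overdamped (Euler-Maruyama) Langevin inference: without the noise term
# this is plain predictive-coding gradient inference; with it, the
# iterates sample from the posterior p(z | x).
eta = 5e-4                           # step size (assumed)
z = np.zeros(d_z)                    # an encoder warm start would replace this
samples = []
for t in range(6000):
    z = z + 0.5 * eta * grad_log_joint(z) + np.sqrt(eta) * rng.normal(size=d_z)
    if t >= 1000:                    # discard burn-in
        samples.append(z.copy())

z_mean = np.mean(samples, axis=0)    # posterior-mean estimate of the latents
```

With a tight likelihood (small `sigma_x`), the posterior concentrates near the true latents, so the sample mean recovers `z_true` closely; the paper's preconditioning would rescale the gradient and noise per-coordinate to handle curvature, which this sketch omits.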

Original article: https://arxiv.org/abs/2311.13664