Original title: Guided Flows for Generative Modeling and Decision Making
Authors: Qinqing Zheng, Matt Le, Neta Shaul, Yaron Lipman, Aditya Grover, Ricky T. Q. Chen
The article examines classifier-free guidance, a key technique for improving sample quality in conditional generative models. While this approach has substantially boosted the performance of diffusion models, its application to Flow Matching models, specifically Continuous Normalizing Flows (CNFs), remains relatively unexplored. The study introduces "Guided Flows" and evaluates them across several tasks: conditional image generation, zero-shot text-to-speech synthesis, and offline reinforcement learning, the last being a first for flow models. The authors show that Guided Flows notably improve sample quality in image generation and zero-shot text-to-speech synthesis. In reinforcement learning, they are also efficient, requiring minimal computation while maintaining the agent's overall performance. The results position Guided Flows as a versatile and effective tool for improving conditional generative models across diverse applications.
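To make the guidance idea concrete, here is a minimal sketch of how classifier-free guidance can be applied at sampling time to a flow model's velocity field. This is an illustrative interpolation-based parameterization, not the paper's exact formulation; the `v_fn` interface standing in for a trained Flow Matching velocity network is hypothetical, and exact weighting conventions vary between papers.

```python
import numpy as np

def guided_velocity(v_cond, v_uncond, w):
    """Combine conditional and unconditional velocity predictions.

    Uses the common extrapolation form v_uncond + w * (v_cond - v_uncond):
    w = 0 recovers the unconditional field, w = 1 the conditional field,
    and w > 1 extrapolates past the conditional prediction to strengthen
    guidance. (Exact parameterizations differ across the literature.)
    """
    return v_uncond + w * (v_cond - v_uncond)

def euler_sample(v_fn, x0, cond, w, n_steps=100):
    """Integrate the guided ODE dx/dt = v(x, t) from t = 0 to t = 1
    with a simple Euler scheme.

    `v_fn(x, t, cond)` is a stand-in for a trained velocity network;
    passing cond=None denotes the unconditional (null-condition) branch.
    """
    x = np.array(x0, dtype=float)
    dt = 1.0 / n_steps
    for i in range(n_steps):
        t = i * dt
        v_c = v_fn(x, t, cond)   # conditional prediction
        v_u = v_fn(x, t, None)   # unconditional prediction
        x = x + dt * guided_velocity(v_c, v_u, w)
    return x
```

In practice, both branches are evaluated with the same network (the condition is randomly dropped during training), so each sampling step costs two forward passes, the same trade-off as classifier-free guidance in diffusion models.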
Original article: https://arxiv.org/abs/2311.13443