Original title: Favour: FAst Variance Operator for Uncertainty Rating
Authors: Thomas D. Ahle, Sahar Karimi, Peter Tak Peter Tang
Bayesian Neural Networks (BNNs) make ML predictions more trustworthy by estimating uncertainty through sampling from the posterior distribution. But sampling is slow, which has limited the widespread adoption of BNNs. Past attempts propagated moments through the network instead, yet were either slower still or sacrificed accuracy. This study introduces "Favour," a method based on spiked covariance matrices that balances quality and speed: a principled variance propagation framework that efficiently updates the covariance approximation as it moves through the network, giving faster inference without compromising accuracy. Favour matches the quality of 10-100 inference samples while costing only as much as 2-3 samples. Evaluated on uncertainty-themed tasks such as calibration and out-of-distribution testing, Favour shows promise for time-critical applications where traditional BNN inference was too slow. Ultimately, this enables broader use of BNNs in scenarios demanding both speed and accuracy, leveraging uncertainty for better decision-making in machine learning.
Original article: https://arxiv.org/abs/2311.13036
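
To make the moment-propagation idea concrete, here is a minimal, illustrative sketch of pushing a mean vector and a "spiked" covariance approximation (diagonal plus low-rank, Sigma ≈ diag(d) + U Uᵀ) through a single deterministic linear layer. This is an assumption-laden toy, not the paper's actual Favour update rules: the function name, the re-diagonalization step, and the rank are all hypothetical choices made for illustration.

```python
# Illustrative sketch only: propagate N(mean, diag(d) + U U^T) through y = W x + b.
# Not the Favour implementation; the hedged simplification is that the dense term
# W diag(d) W^T is re-approximated by its diagonal so the output stays in spiked form.
import numpy as np

def propagate_linear(mean, diag, U, W, b):
    """Return the mean, diagonal, and low-rank factor of the output distribution."""
    new_mean = W @ mean + b
    new_U = W @ U                 # low-rank "spike" maps exactly under a linear layer
    new_diag = (W * W) @ diag     # diagonal of W diag(d) W^T
    return new_mean, new_diag, new_U

# Toy usage: a 6-dim input with a rank-2 spike pushed through a 4x6 layer.
rng = np.random.default_rng(0)
mean = rng.normal(size=6)
diag = rng.uniform(0.1, 1.0, size=6)
U = rng.normal(size=(6, 2))
W, b = rng.normal(size=(4, 6)), rng.normal(size=4)

m_out, d_out, U_out = propagate_linear(mean, diag, U, W, b)
print(m_out.shape, d_out.shape, U_out.shape)  # (4,) (4,) (4, 2)
```

The design point this sketch illustrates is why the spiked form is cheap: the low-rank factor transforms exactly with one matrix product, and only the diagonal part needs an approximate update, so no full covariance matrix is ever materialized. Handling nonlinearities and weight uncertainty is where the paper's actual contribution lies.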