Preprint, Working Paper. Year: 2022

Data Subsampling for Bayesian Neural Networks

Abstract

Markov Chain Monte Carlo (MCMC) algorithms do not scale well to large datasets, which makes posterior sampling for neural networks difficult. In this paper, we apply a generalization of the Metropolis-Hastings algorithm that allows us to restrict the evaluation of the likelihood to small mini-batches in a Bayesian inference context. Since it requires the computation of a so-called "noise penalty" determined by the variance of the training loss function over the mini-batches, we refer to this data subsampling strategy as Penalty Bayesian Neural Networks (PBNNs). Its implementation on top of MCMC is straightforward, as the variance of the loss function merely reduces the acceptance probability. Compared to other samplers, we empirically show that PBNN achieves good predictive performance for a given mini-batch size. Varying the size of the mini-batches enables a natural calibration of the predictive distribution and provides an inbuilt protection against overfitting. We expect PBNN to be particularly suited for cases where datasets are distributed across multiple decentralized devices, as is typical in federated learning.
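To make the acceptance rule concrete, below is a minimal sketch of one penalized Metropolis-Hastings step in Python/NumPy, under stated assumptions rather than as the paper's implementation. The names `loglik`, `log_prior`, `batch_size`, and `step_size` are illustrative; the sketch assumes a Gaussian random-walk proposal and estimates the noise penalty from per-example loss differences on a single mini-batch.

```python
# Minimal sketch of one penalized Metropolis-Hastings (PBNN-style) step.
# Assumptions (not from the paper's code): `loglik(theta, batch)` returns
# per-example log-likelihoods as a NumPy array, `log_prior(theta)` returns
# a scalar, and a Gaussian random-walk proposal is used.
import numpy as np

def pbnn_step(theta, data, loglik, log_prior, rng,
              batch_size=64, step_size=0.01):
    n = len(data)
    proposal = theta + step_size * rng.standard_normal(theta.shape)

    # Evaluate the likelihood on a small mini-batch only.
    batch = data[rng.choice(n, size=batch_size, replace=False)]
    d = loglik(proposal, batch) - loglik(theta, batch)

    # Unbiased mini-batch estimate of the full log-likelihood ratio,
    # plus the (exact) prior ratio.
    delta = n * d.mean() + log_prior(proposal) - log_prior(theta)

    # Variance of the mini-batch estimator: the "noise penalty".
    var = (n ** 2) * d.var(ddof=1) / batch_size

    # Subtracting var/2 from the log acceptance ratio means noisier
    # (smaller) mini-batches reduce the acceptance probability.
    log_alpha = delta - 0.5 * var
    if np.log(rng.uniform()) < log_alpha:
        return proposal
    return theta
```

A full sampler would simply iterate this step; the paper computes the penalty from the variance of the training loss over mini-batches, so a variant of this sketch could average the loss difference and its variance across several mini-batches instead of scaling up a single one.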
Main file
2210.09141.pdf (642.09 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04234512, version 1 (10-10-2023)

Identifiers

Cite

Eiji Kawasaki, Markus Holzmann. Data Subsampling for Bayesian Neural Networks. 2022. ⟨hal-04234512⟩