Exploration of the (non-)asymptotic bias and variance of stochastic gradient Langevin dynamics

Abstract

Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally infeasible. The recently proposed stochastic gradient Langevin dynamics (SGLD) method circumvents this problem in three ways: it generates proposed moves using …
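For illustration, below is a minimal sketch of the standard SGLD update the abstract refers to: each step adds half a decreasing step size times a minibatch estimate of the log-posterior gradient, plus Gaussian noise scaled to that step size, with no Metropolis-Hastings accept-reject step. The function name `sgld_step`, the toy Gaussian model, and the step-size schedule are illustrative assumptions, not the paper's own code or experiments.

```python
import numpy as np

def sgld_step(theta, data, grad_log_prior, grad_log_lik, step_size, batch_size, rng):
    """One SGLD update: minibatch gradient of the log-posterior plus injected
    Gaussian noise; there is no accept-reject correction."""
    n = len(data)
    idx = rng.choice(n, size=batch_size, replace=False)           # random minibatch
    grad = grad_log_prior(theta) + (n / batch_size) * np.sum(
        [grad_log_lik(theta, data[i]) for i in idx], axis=0)      # unbiased gradient estimate
    noise = rng.normal(scale=np.sqrt(step_size), size=np.shape(theta))
    return theta + 0.5 * step_size * grad + noise

# Toy usage (assumed example): posterior over the mean of a Gaussian with known
# unit variance, standard normal prior, and a polynomially decreasing step size.
rng = np.random.default_rng(0)
data = rng.normal(loc=1.0, scale=1.0, size=1000)
theta, samples = 0.0, []
for t in range(1, 5001):
    eps_t = 1e-2 * t ** (-1 / 3)                                  # decreasing step-size schedule
    theta = sgld_step(theta, data,
                      grad_log_prior=lambda th: -th,              # N(0, 1) prior
                      grad_log_lik=lambda th, x: x - th,          # N(th, 1) likelihood
                      step_size=eps_t, batch_size=32, rng=rng)
    samples.append(theta)
print("posterior mean estimate:", np.mean(samples[1000:]))
```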

Publication
J. Mach. Learn. Res.