"Markov chain Monte Carlo"

Distributed Bayesian learning with stochastic natural gradient expectation propagation and the posterior server

This paper makes two contributions to Bayesian machine learning algorithms. Firstly, we propose stochastic natural gradient expectation propagation (SNEP), a novel alternative to expectation propagation (EP), a popular variational inference …

Consistency and fluctuations for stochastic gradient Langevin dynamics

Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally expensive. Both the calculation of the acceptance probability and the creation of informed proposals usually require an iteration through the whole …

Exploration of the (non-)asymptotic bias and variance of stochastic gradient Langevin dynamics

Applying standard Markov chain Monte Carlo (MCMC) algorithms to large data sets is computationally infeasible. The recently proposed stochastic gradient Langevin dynamics (SGLD) method circumvents this problem in three ways: it generates proposed …
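
The SGLD update the abstract alludes to is simple enough to sketch. Below is a minimal, illustrative Python implementation of the standard SGLD recipe: an unbiased minibatch estimate of the log-posterior gradient (scaled by N/n), a half-step along that gradient plus Gaussian noise with variance equal to the step size, no Metropolis accept/reject step, and polynomially decreasing step sizes. The function names, step-size constants, and the toy Gaussian-mean example are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def sgld(grad_log_prior, grad_log_lik, data, theta0,
         n_iters=10_000, batch_size=100, a=1e-5, b=1.0, gamma=0.55,
         rng=None):
    """Stochastic gradient Langevin dynamics, illustrative sketch.

    Each iteration: draw a minibatch, form an unbiased estimate of the
    full-data gradient of the log-posterior, take a half-step along it,
    and add Gaussian noise with variance equal to the step size. There
    is no accept/reject step; step sizes decay as eps_t = a*(b + t)^(-gamma).
    """
    rng = np.random.default_rng() if rng is None else rng
    N = len(data)
    theta = np.asarray(theta0, dtype=float)
    samples = []
    for t in range(n_iters):
        eps = a * (b + t) ** (-gamma)                      # decreasing step size
        batch = data[rng.choice(N, size=batch_size, replace=False)]
        # minibatch gradient, rescaled so its expectation matches the full-data gradient
        grad = grad_log_prior(theta) + (N / batch_size) * grad_log_lik(theta, batch)
        theta = theta + 0.5 * eps * grad + rng.normal(scale=np.sqrt(eps), size=theta.shape)
        samples.append(theta.copy())
    return np.array(samples)

# Toy usage (hypothetical): posterior over the mean of a unit-variance
# Gaussian with a standard normal prior, so both gradients are analytic.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=10_000)
    grad_log_prior = lambda th: -th                  # d/dθ log N(θ | 0, 1)
    grad_log_lik = lambda th, x: np.sum(x - th)      # d/dθ Σ_i log N(x_i | θ, 1)
    draws = sgld(grad_log_prior, grad_log_lik, data, theta0=0.0)
    print("posterior mean estimate:", draws[len(draws) // 5:].mean())
```

The N/n rescaling is what makes the minibatch gradient an unbiased estimate of the full-data gradient, and it is this property, together with the decreasing step sizes, that allows the accept/reject step to be dropped.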