Abstract: In this talk we propose a novel and practical variance reduction approach for additive functionals of dependent sequences. Our approach combines the use of control variates with the minimisation of an empirical variance estimate. We analyse finite-sample properties of the proposed method and derive finite-time bounds on the excess asymptotic variance, showing that it converges to zero. We apply our methodology to Stochastic Gradient MCMC (SGMCMC) methods for Bayesian inference on large data sets and combine it with existing variance reduction methods for SGMCMC. We present empirical results on a number of benchmark examples showing that our variance reduction method achieves significant improvement over state-of-the-art methods at the expense of a moderate increase in computational overhead. The talk is based on joint work with my colleagues Denis Belomestny, Leonid Iosipoi, Eric Moulines and Sergey Samsonov.
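The core idea of control variates with empirical variance minimisation can be sketched in a toy i.i.d. setting (our own illustration, not the paper's dependent-sequence or SGMCMC setup): subtract a zero-mean function from the target and pick its coefficient to minimise the empirical variance of the resulting estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: estimate E[f(X)] for X ~ N(0, 1) by Monte Carlo,
# using a control variate g with known mean E[g(X)] = 0.
x = rng.normal(size=10_000)
f = x**2 + x        # target function; true mean is 1
g = x               # control variate, zero mean under N(0, 1)

# Minimise the empirical variance of f(X) - theta * g(X) over theta.
# In this one-dimensional case the minimiser is Cov(f, g) / Var(g);
# the paper's method generalises this to a family of control variates.
theta = np.cov(f, g)[0, 1] / np.var(g, ddof=1)

plain = f.mean()                  # vanilla Monte Carlo estimate
cv = (f - theta * g).mean()       # control-variate estimate (same mean)

var_plain = f.var(ddof=1)         # empirical variance without the CV
var_cv = (f - theta * g).var(ddof=1)  # empirical variance with the CV
```

Both estimators are unbiased for E[f(X)] since E[g(X)] = 0; the variance-minimising coefficient here yields roughly a one-third reduction in variance for this toy target.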
Variance reduction for dependent sequences with applications to Stochastic Gradient MCMC: Link to work
Variance reduction for Markov chains with application to MCMC: Link to work