Waste-free Sequential Monte Carlo

  • Prof. Nicolas Chopin, Professor of Statistics, ENSAE, Paris

KAUST


Overview

Abstract

A standard way to move particles in an SMC sampler is to apply several steps of an MCMC (Markov chain Monte Carlo) kernel. Unfortunately, it is not clear how many steps need to be performed for optimal performance. In addition, the outputs of the intermediate steps are discarded and thus wasted. We propose a new, waste-free SMC algorithm which uses the outputs of all these intermediate MCMC steps as particles. We establish that its output is consistent and asymptotically normal. We use the expression of the asymptotic variance to develop various insights on how to implement the algorithm in practice. In particular, we develop a method to estimate, from a single run of the algorithm, the asymptotic variance of any particle estimate. We show empirically, through a range of numerical examples, that waste-free SMC tends to outperform standard SMC samplers, especially in situations where the mixing of the considered MCMC kernels decreases across iterations (as in tempering or rare-event problems).
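The abstract describes the core idea at a high level: rather than discarding the intermediate states produced by the MCMC moves, keep every visited state as a particle. The sketch below is a minimal, illustrative NumPy rendering of one such step for a toy tempered Gaussian target; it is not the authors' reference implementation, and the choice of kernel (random-walk Metropolis), the tuning constants M, k and rw_scale, and all function names are assumptions made for the example.

```python
# Hedged sketch of one waste-free SMC iteration for a toy tempering problem,
# with target pi_lam(x) proportional to N(x; 0, 1)^lam. All names and
# parameter values here are illustrative assumptions, not from the talk.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x, lam):
    # log of the tempered target, up to an additive constant
    return -0.5 * lam * x**2

def rwm_step(x, lam, scale):
    # one random-walk Metropolis step leaving pi_lam invariant
    prop = x + scale * rng.standard_normal(x.shape)
    log_alpha = log_target(prop, lam) - log_target(x, lam)
    accept = np.log(rng.uniform(size=x.shape)) < log_alpha
    return np.where(accept, prop, x)

def waste_free_step(particles, lam_prev, lam_new, M=100, k=9, rw_scale=1.0):
    """One waste-free SMC iteration: reweight the N equally weighted particles
    (targeting pi_{lam_prev}) by the incremental weight pi_new / pi_prev,
    resample only M of them, run k MCMC steps invariant w.r.t. pi_{lam_new}
    from each, and keep *all* M*(k+1) visited states (ancestors plus every
    intermediate MCMC output) as the new, equally weighted particle system."""
    log_w = log_target(particles, lam_new) - log_target(particles, lam_prev)
    w = np.exp(log_w - log_w.max())
    idx = rng.choice(len(particles), size=M, p=w / w.sum())
    chains = [particles[idx]]
    for _ in range(k):
        chains.append(rwm_step(chains[-1], lam_new, rw_scale))
    return np.concatenate(chains)  # N = M * (k + 1) particles

# toy usage: one tempering step from lambda = 0.5 to lambda = 1.0
x = np.sqrt(2.0) * rng.standard_normal(1000)  # i.i.d. draws from pi_{0.5} = N(0, 2)
x = waste_free_step(x, lam_prev=0.5, lam_new=1.0)
print("estimate of E[X^2] under N(0, 1) ≈", np.mean(x**2))  # should be close to 1
```

A standard SMC sampler would instead resample all N particles and keep only the final state of each chain of k moves; the waste-free variant trades a smaller number of resampled ancestors (M) for retaining every intermediate MCMC state.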

Brief Biography

Nicolas Chopin (PhD, Université Pierre et Marie Curie, Paris, 2003) has been a Professor of Statistics at ENSAE, Paris, since 2006. He was previously a lecturer at the University of Bristol (UK). He is or has been an associate editor for the Annals of Statistics, Biometrika, the Journal of the Royal Statistical Society, Statistics and Computing, and Statistical Methods & Applications. He served as a member (2013-14) and secretary (2015-16) of the Research Section Committee of the Royal Statistical Society. He received a Savage Award in 2002 for his doctoral dissertation. His research intersects computational statistics, Bayesian inference, and machine learning.

Presenters

Prof. Nicolas Chopin, Professor of Statistics, ENSAE, Paris