Variational Inference for Cutting Feedback in Misspecified Models

Location
KAUST

Abstract

Bayesian analyses combine information represented by different terms in a joint Bayesian model. When one or more of the terms is misspecified, it can be helpful to restrict the use of information from suspect model components to modify posterior inference. This is called "cutting feedback", and both the specification and the computation of the posterior for such cut models are challenging. In this talk we consider the definition of cut posterior distributions as solutions to constrained optimization problems, which naturally leads to optimization-based variational computation methods. Using this perspective, we propose variational methods for computing cut posterior distributions that are an order of magnitude faster than existing Markov chain Monte Carlo (MCMC) approaches. It is also shown that variational methods allow the evaluation of computationally intensive conflict checks that can be used to decide whether or not feedback should be cut. The methods are illustrated in a number of simulated and real examples, including an application that uses recent methodological advances combining variational inference and MCMC within the variational optimization.
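As a rough sketch of the setting (a standard two-module formulation, not taken verbatim from the talk), suppose the model has parameters \theta_1 and \theta_2 and two data sources Y and Z, where the module linking Z to (\theta_1, \theta_2) is the one suspected of misspecification. The cut posterior prevents Z from informing \theta_1, and it can be characterized as the solution of a constrained Kullback-Leibler minimization, which is the kind of definition the abstract refers to:

    % Two-module joint model: Y depends on theta_1 only; Z depends on (theta_1, theta_2).
    p(\theta_1, \theta_2, Y, Z) = p(\theta_1)\, p(Y \mid \theta_1)\, p(\theta_2 \mid \theta_1)\, p(Z \mid \theta_1, \theta_2)

    % Cut posterior: theta_1 is informed by Y alone, and theta_2 is then updated
    % conditionally on theta_1, so misspecification in the Z-module cannot feed
    % back into inference about theta_1.
    p_{\mathrm{cut}}(\theta_1, \theta_2 \mid Y, Z) = p(\theta_1 \mid Y)\, p(\theta_2 \mid \theta_1, Z)

    % Constrained-optimization characterization: among joint densities q whose
    % theta_1 marginal equals p(theta_1 | Y), the cut posterior minimizes the
    % KL divergence to the full posterior.
    p_{\mathrm{cut}} = \operatorname*{arg\,min}_{q \,:\, q(\theta_1) = p(\theta_1 \mid Y)} \mathrm{KL}\big( q(\theta_1, \theta_2) \,\Vert\, p(\theta_1, \theta_2 \mid Y, Z) \big)

Restricting q further to a tractable parametric family (for example, a Gaussian variational family) turns this characterization into an optimization problem that can be attacked with standard variational inference machinery, which is the perspective motivating the computational methods described above.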

Brief Biography

David Nott is an Associate Professor in the Department of Statistics and Data Science at the National University of Singapore. He obtained his PhD from the University of Queensland and held visiting postdoctoral appointments at Lund University and the University of New South Wales before becoming a lecturer and then senior lecturer at the University of New South Wales. He moved to the National University of Singapore in 2007, and his research focuses on approximate Bayesian inference methods and model criticism for complex models.