Joint Posterior Inference for Latent Gaussian Models and Extended Strategies Using INLA

Overview

Abstract

Bayesian inference is particularly challenging for hierarchical statistical models, as computational complexity becomes a significant issue. Sampling-based methods like the popular Markov Chain Monte Carlo (MCMC) can provide accurate solutions, but they often come at a high computational cost. An attractive alternative is the Integrated Nested Laplace Approximations (INLA) approach, which delivers its greatest speed gains when applied to the broad class of Latent Gaussian Models (LGMs). The method computes fast and empirically accurate deterministic approximations of the posterior marginals of the model's unknown parameters. Several internal strategies are available to compute these univariate approximations, depending on the inferential target and the desired trade-off between accuracy and speed.
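To fix ideas, the following is a minimal sketch (not the thesis code) of the three-stage LGM hierarchy that INLA targets, using a toy Poisson count model; all names and values are illustrative assumptions.

```python
# Toy Latent Gaussian Model: hyperparameter -> latent Gaussian field -> data.
import numpy as np

rng = np.random.default_rng(0)

n = 50
tau = 2.0                                   # hyperparameter: latent precision
x = rng.normal(0.0, 1.0 / np.sqrt(tau), n)  # latent Gaussian field x | tau
eta = 1.0 + x                               # linear predictor
y = rng.poisson(np.exp(eta))                # observations y | x (Poisson)

# INLA approximates the posterior marginals p(x_i | y) and p(tau | y)
# deterministically, avoiding MCMC sampling over the joint posterior.
```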

In the first part of this thesis, we discuss how to extend the software's applicability to joint posterior inference by constructing a new class of joint posterior approximations that also add marginal corrections for location and skewness. Since these approximations combine a Gaussian copula with the accurate Gaussian approximations pre-computed internally by INLA, we name this class the Skew Gaussian Copula (SGC). By computing the moments and correlation structure of a mixture representation of these distributions, we obtain new, fast, and accurate deterministic approximations for additive linear combinations of a subset of the model's latent field. In all other cases, or when more accuracy is required, we can exploit the same mixture to approximate the full joint posterior density through exact Monte Carlo sampling over the hyperparameter set. These new approximations target the true joint posterior by adding precise marginal location and skewness adjustments. We construct highly skewed simulation schemes based on Poisson and Binomial hierarchical models and verify the marginal improvements of these new approximations by comparing them against INLA and MCMC. The new skewness corrections resulting from the Skew Gaussian Copula are more consistent with the MCMC outcomes than those provided by the default INLA strategies.
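As a rough illustration of the SGC construction, the sketch below draws from a Gaussian dependence structure and pushes each margin through a skew-normal quantile map, so the joint sample inherits marginal location and skewness corrections. The correlation matrix R and the skew-normal parameters are illustrative placeholders, not quantities computed by INLA here.

```python
# Gaussian copula with skew-normal marginal corrections (illustrative only).
import numpy as np
from scipy.stats import multivariate_normal, norm, skewnorm

rng = np.random.default_rng(1)

R = np.array([[1.0, 0.6],
              [0.6, 1.0]])     # assumed Gaussian copula correlation
a = np.array([4.0, -3.0])      # assumed marginal skewness parameters
loc = np.array([0.2, -0.1])    # assumed marginal location corrections
scale = np.array([1.0, 0.8])

z = multivariate_normal(mean=np.zeros(2), cov=R).rvs(size=10_000,
                                                     random_state=rng)
u = norm.cdf(z)                                   # copula step: map to uniforms
theta = skewnorm.ppf(u, a, loc=loc, scale=scale)  # skew-normal marginals

# 'theta' keeps the Gaussian dependence of z but has skew-normal marginals.
# An additive linear combination of latent components can then be
# summarized directly from these draws.
lin = theta @ np.array([1.0, 1.0])
```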

In the second and last part, we propose an extension of the parametric fit employed by the Simplified Laplace Approximation strategy in INLA for approximating posterior marginals. By default, the strategy matches the log-density derivatives from a third-order Taylor expansion of each Laplace approximation marginal with those of a Skew Normal distribution. Our idea is to consider a fourth-order Taylor expansion instead and adapt an Extended Skew Normal distribution, producing a parametric fit that better corrects for skewness when it is large. We run similarly skewed data simulations with Poisson and Binomial likelihoods and show that the posterior marginals obtained with the new extended strategy are more accurate and more consistent with the MCMC results than those of the original version.
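The flavor of this parametric matching can be conveyed by a simpler moment-matching variant: given a target mean, variance, and skewness (as would come from a low-order expansion of a marginal), recover the skew-normal parameters (xi, omega, alpha) whose first three moments agree. This is a minimal sketch under that assumption, not the INLA implementation; the fourth-order / Extended Skew Normal strategy adds one more matched term.

```python
# Moment-match a skew-normal to a target (mean, variance, skewness).
import numpy as np
from scipy.optimize import brentq

def skewnorm_params(mean, var, skew):
    """Return (xi, omega, alpha); 'skew' is assumed to lie inside the
    skew-normal's feasible range (|skew| < ~0.995)."""
    c = np.sqrt(2.0 / np.pi)

    def skew_of_delta(d):
        # Standardized skewness of a skew-normal as a function of delta.
        num = (4.0 - np.pi) / 2.0 * (c * d) ** 3
        return num / (1.0 - 2.0 * d * d / np.pi) ** 1.5

    delta = brentq(lambda d: skew_of_delta(d) - skew, -0.999, 0.999)
    alpha = delta / np.sqrt(1.0 - delta * delta)
    omega = np.sqrt(var / (1.0 - 2.0 * delta * delta / np.pi))
    xi = mean - omega * c * delta
    return xi, omega, alpha

xi, omega, alpha = skewnorm_params(mean=0.0, var=1.0, skew=0.5)
# Check: scipy.stats.skewnorm(alpha, loc=xi, scale=omega).stats('mvs')
# recovers (0.0, 1.0, 0.5) up to numerical error.
```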

Brief Biography

Cristian Chiuchiolo received his Bachelor's degree (B.Sc.) in Statistics from the University of Firenze in 2015 and his Master's degree (M.Sc.) in Statistics from the University of Bologna in 2017, both in Italy. He then enrolled as a Ph.D. student in Statistics at KAUST in August 2017, joining Haavard Rue's research group 'Bayesian Computational Statistics & Modelling (BAYESCOMP)'. His research focuses on computational statistics and Bayesian inference, developing new features and extensions for the Integrated Nested Laplace Approximations (INLA) approach applied to Latent Gaussian Models.

Presenters