This course gives an introduction to applied Bayesian statistics from a statistical modelling point of view. We will discuss Bayesian statistics in general and how to do simulation-based inference using Markov chain Monte Carlo (MCMC). We will then use MCMC to analyse various statistical models, and discuss techniques for model criticism and comparison, prior choice, sensitivity analysis, and the concept of hierarchical models. We will make extensive use of the programming language R and of JAGS, a program for MCMC analysis of Bayesian hierarchical models.
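The course itself uses R and JAGS; purely as an illustration of the simulation-based inference mentioned above (none of this code is course material, and all names are my own), here is a minimal random-walk Metropolis sampler, the simplest MCMC algorithm, targeting a standard normal density:

```python
import math
import random

def metropolis(log_post, init, proposal_sd, n_samples, seed=0):
    """Random-walk Metropolis sampler for a one-dimensional target.

    log_post: log of the target density, up to an additive constant.
    Returns the full chain of sampled values.
    """
    rng = random.Random(seed)
    x = init
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, proposal_sd)
        # Accept with probability min(1, target(prop) / target(x)),
        # computed on the log scale for numerical stability.
        if math.log(rng.random()) < log_post(prop) - log_post(x):
            x = prop
        samples.append(x)
    return samples

# Target: standard normal, log density -x^2/2 up to a constant.
draws = metropolis(lambda x: -0.5 * x * x,
                   init=0.0, proposal_sd=1.0, n_samples=20000)
mean = sum(draws) / len(draws)
```

With enough iterations the empirical mean and variance of `draws` approach the target's 0 and 1; JAGS automates this kind of sampling for full hierarchical models specified declaratively.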
This course discusses computational techniques for Bayesian and frequentist inference. Topics to be discussed are exact recursions for hidden Markov chains, change-point models, Gaussian Markov random fields (GMRFs) and their applications in latent Gaussian models, inference for latent Gaussian models using Markov chain Monte Carlo with block sampling and auxiliary variables, deterministic approximations in latent Gaussian models using integrated nested Laplace approximations (INLA), GMRF models for splines, approximate GMRF models, and the EM algorithm.
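As a small taste of the first topic (a sketch in Python rather than the course's own tools; the model and all names are invented for illustration), the exact forward recursion computes filtered state probabilities for a hidden Markov chain in closed form, one observation at a time:

```python
def hmm_filter(init, trans, emit, obs):
    """Forward recursion: filtered probabilities P(state_t | obs_1..t).

    init:  initial state distribution
    trans: trans[i][j] = P(next state j | current state i)
    emit:  emit[i][y]  = P(observation y | state i)
    obs:   observed sequence (values index into emit's rows)
    """
    n = len(init)
    alpha = [init[i] * emit[i][obs[0]] for i in range(n)]
    norm = sum(alpha)
    alpha = [a / norm for a in alpha]
    history = [alpha]
    for y in obs[1:]:
        # Predict: propagate yesterday's filtered distribution one step.
        pred = [sum(alpha[i] * trans[i][j] for i in range(n)) for j in range(n)]
        # Update: weight by the likelihood of today's observation, renormalise.
        alpha = [pred[j] * emit[j][y] for j in range(n)]
        norm = sum(alpha)
        alpha = [a / norm for a in alpha]
        history.append(alpha)
    return history

# Two sticky states emitting noisy binary observations.
filt = hmm_filter(
    init=[0.5, 0.5],
    trans=[[0.9, 0.1], [0.1, 0.9]],
    emit=[[0.8, 0.2], [0.2, 0.8]],  # state 0 mostly emits 0, state 1 mostly 1
    obs=[0, 0, 1, 1, 1],
)
```

After a run of 1s the filter puts most of its mass on state 1, and no simulation is needed: for discrete hidden chains the recursion is exact.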
This course gives a comprehensive introduction to the role of probability theory in general scientific endeavour, and is relevant for anyone who has to draw inferences from incomplete information. We will discuss various threads of modern thinking about Bayesian probability and statistical inference, and compare the results of Bayesian techniques with those of other approaches.
The lectures are based on the book ''Probability Theory: The Logic of Science'' by E. T. Jaynes, Cambridge University Press.