Spring Semesters

KAUST CEMSE Professor Raul Tempone


COURSE OBJECTIVES: This course acquaints students with stochastic computational tools and their mathematical foundations, which are used to analyze systems with uncertainty arising in engineering, physics, chemistry, and economics. By the end of the course, the student should be able to: 

  • Analyze the convergence and computational work of sampling algorithms 
  • Implement sampling methods for different stochastic processes
  • Propose efficient sampling methods for different stochastic problems 


Probability Basics [1, 3, 6] 

  • Probability refresher
  • Conditional Expectation  
  • Bayesian Statistics 
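As a concrete instance of the Bayesian update covered in this unit, the following Python sketch works through the standard Beta-Binomial conjugate pair (the flat prior and coin-flip data are illustrative choices, not taken from the course material):

```python
# Conjugate Bayesian update: Beta prior, Binomial likelihood.
# With a Beta(a, b) prior on the success probability, observing
# k successes in n trials gives the posterior Beta(a + k, b + n - k).

def beta_binomial_posterior(a, b, k, n):
    """Return posterior parameters (a', b') for a Beta(a, b) prior."""
    return a + k, b + (n - k)

# Illustrative data: flat prior Beta(1, 1); observe 7 heads in 10 flips.
a_post, b_post = beta_binomial_posterior(1.0, 1.0, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # (1 + 7) / (2 + 10) = 2/3
```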

The Monte Carlo Method [1, 3, 7] 

  • Sampling of random variables
  • The Central Limit Theorem and related results 
  • Convergence rates and confidence intervals
  • Variance reduction techniques
  • Rare-event simulation
  • Kernel density estimators
  • Chernoff inequality and large deviations
  • Resampling techniques 
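The core of this unit — the crude Monte Carlo estimator, its CLT-based confidence interval, and a simple variance reduction by antithetic variates — can be sketched as follows (the integrand exp(x) on [0, 1], with known mean e - 1, is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Crude Monte Carlo estimate of E[f(U)] for f(x) = exp(x), U ~ Uniform(0, 1),
# whose exact value is e - 1. The CLT yields an asymptotic 95% confidence
# interval of half-width 1.96 * sigma_hat / sqrt(M).
M = 100_000
u = rng.random(M)
samples = np.exp(u)
estimate = samples.mean()
half_width = 1.96 * samples.std(ddof=1) / np.sqrt(M)

# Antithetic variates: pair U with 1 - U. Since exp is monotone, the two
# evaluations are negatively correlated, so their average has smaller
# variance than an independent pair.
anti = 0.5 * (np.exp(u) + np.exp(1.0 - u))
anti_estimate = anti.mean()
```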

Linear and Gaussian Models [3, 6]

Discrete Time Markov Chains [5]
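A minimal sketch of simulating a discrete-time Markov chain from a row-stochastic transition matrix (the three-state matrix below is an illustrative example, not from the course notes):

```python
import numpy as np

rng = np.random.default_rng(1)

# Row-stochastic transition matrix on the state space {0, 1, 2}:
# P[i, j] is the probability of moving from state i to state j.
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

def simulate_dtmc(P, x0, n_steps, rng):
    """Return the path (x_0, ..., x_{n_steps}) of the chain."""
    path = [x0]
    for _ in range(n_steps):
        # Draw the next state from the row of the current state.
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

path = simulate_dtmc(P, x0=0, n_steps=1000, rng=rng)
```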

Bayesian Filters, Kalman Filters, and State Estimation [4] 
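As an illustration of linear-Gaussian state estimation, here is a one-dimensional Kalman filter sketch (the model parameters a, q, r below are illustrative choices, not course-specified):

```python
import numpy as np

# One-dimensional Kalman filter for the linear-Gaussian state-space model
#   x_{k+1} = a * x_k + process noise (variance q)
#   y_k     = x_k + observation noise (variance r).

def kalman_step(m, p, y, a=1.0, q=0.1, r=0.5):
    """One predict/update step; (m, p) is the filtering mean and variance."""
    # Predict.
    m_pred = a * m
    p_pred = a * a * p + q
    # Update with the observation y.
    k_gain = p_pred / (p_pred + r)          # Kalman gain
    m_new = m_pred + k_gain * (y - m_pred)
    p_new = (1.0 - k_gain) * p_pred
    return m_new, p_new

# Run the filter on a simulated random-walk state with noisy observations.
rng = np.random.default_rng(2)
x, m, p = 0.0, 0.0, 1.0
for _ in range(50):
    x = x + np.sqrt(0.1) * rng.standard_normal()   # true state
    y = x + np.sqrt(0.5) * rng.standard_normal()   # noisy observation
    m, p = kalman_step(m, p, y)
```

Note that the filtering variance p follows a deterministic recursion and converges to a steady state regardless of the observed data.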

Continuous-Time Markov Chains [5] 

  • Pure jump processes 
  • The Stochastic Simulation Algorithm (SSA) and tau-leaping 
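Gillespie's SSA, listed above, can be sketched for the simplest pure-death reaction X -> 0 (the rate constant and initial molecule count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# SSA for the pure-death reaction X -> 0 with rate constant c: each of the
# x molecules decays independently, so the total propensity is c * x and the
# waiting time to the next reaction is Exponential with rate c * x.

def ssa_decay(x0, c, t_final, rng):
    """Simulate X -> 0 up to time t_final; return the jump times and states."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while x > 0:
        a = c * x                          # total propensity
        t += rng.exponential(1.0 / a)      # time to the next reaction
        if t > t_final:
            break
        x -= 1                             # fire the decay reaction
        times.append(t)
        states.append(x)
    return times, states

times, states = ssa_decay(x0=100, c=1.0, t_final=5.0, rng=rng)
```

Tau-leaping trades this exact event-by-event simulation for approximate Poisson-distributed reaction counts over fixed time steps, which is cheaper when propensities are large.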

Markov Chain Monte Carlo [2, 7] 
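A minimal random-walk Metropolis sampler, the basic MCMC building block (the standard-normal target and the step size are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(4)

def log_target(x):
    """Log-density of the target, here a standard normal up to a constant."""
    return -0.5 * x * x

def metropolis(n_samples, rng, step=1.0, x0=0.0):
    """Random-walk Metropolis: propose x + step * N(0, 1), then
    accept with probability min(1, pi(proposal) / pi(x))."""
    x = x0
    chain = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.standard_normal()
        if np.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        chain[i] = x
    return chain

chain = metropolis(50_000, rng)
```

Only ratios of the target density appear, so the normalizing constant is never needed; this is what makes the method attractive for Bayesian posteriors known only up to proportionality.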

Other topics that may be addressed if time allows: continuous-time and continuous-space Markov chains, stochastic optimization, stochastic optimal control, optimal experimental design, model selection and model validation, machine learning. 



[1]  S. Asmussen and P. W. Glynn. Stochastic Simulation: Algorithms and Analysis. Springer, 2007. A broad treatment of sampling-based computational methods with mathematical analysis of the convergence properties of the methods discussed. 

[2]  D. Gamerman and H. F. Lopes. Markov Chain Monte Carlo. Chapman & Hall/CRC, second edition, 2006. 

[3]  P. W. Glynn. Stochastic Methods in Engineering. Lecture notes for the corresponding course at Stanford University. 

[4]  J. Kaipio and E. Somersalo. Statistical and Computational Inverse Problems. Springer, 2005. 

[5]  J. R. Norris. Markov Chains. Cambridge University Press, 1997. Discrete-time and continuous-time Markov chains with applications to simulation in chemistry and physics, economics, optimal control, genetics, queues, and many other topics. 

[6]  D. S. Sivia and J. Skilling. Data Analysis, a Bayesian Tutorial. Oxford University Press, second edition, 2006. Basic principles of Bayesian probability theory, illustrated with examples ranging from elementary parameter estimation to image processing. Other topics covered include reliability analysis, multivariate optimization, least-squares and maximum likelihood, error propagation, hypothesis testing, maximum entropy, and experimental design. 

[7]  J. Voss. An Introduction to Statistical Computing. Wiley, 2014.


Related link:

Program Guide 2022-2023