Tuesday, April 05, 2022, 15:00
- 17:00
B5, L5, R5220
Contact Person
In this thesis, we develop new flexible sub-asymptotic extreme-value models for spatial and spatio-temporal extremes, combined with carefully designed gradient-based Markov chain Monte Carlo (MCMC) sampling schemes, that can be exploited to address important scientific questions related to risk assessment in a wide range of environmental applications. The methodological developments are centered around two distinct themes, namely (i) sub-asymptotic Bayesian models for extremes; and (ii) flexible marked point process models with sub-asymptotic marks. In the first part, we develop several types of new flexible models for light-tailed and heavy-tailed data, which extend a hierarchical representation of the classical generalized Pareto (GP) limit for threshold exceedances. Spatial dependence is modeled through latent processes. We study the theoretical properties of our new methodology and demonstrate it through simulations and applications to precipitation extremes in both Germany and Spain.
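The GP limit for threshold exceedances that the thesis builds on can be illustrated with a minimal sketch (my own toy example, not the thesis's models): pick a high threshold, fit a GP distribution to the excesses with `scipy.stats.genpareto`, and extrapolate a return level.

```python
# Hypothetical sketch: classical peaks-over-threshold fitting with a
# generalized Pareto (GP) distribution, using synthetic "precipitation" data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
data = rng.exponential(scale=2.0, size=10_000)  # synthetic data, not real rainfall

u = np.quantile(data, 0.95)           # high threshold
excess = data[data > u] - u           # exceedances above u

# Fit the GP to the excesses; loc is fixed at 0 since excesses start at 0.
shape, loc, scale = genpareto.fit(excess, floc=0)

# Extrapolated 1-in-100-observation return level from the fitted tail.
p_exceed = excess.size / data.size    # empirical P(X > u)
ret_level = u + genpareto.ppf(1 - 0.01 / p_exceed, shape, loc=0, scale=scale)
```

For exponential data the fitted shape should be near zero (a light tail); heavy-tailed data would give a positive shape, which is where the sub-asymptotic extensions matter.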
Monday, April 04, 2022, 17:00
- 19:00
B3, L5, R5209
The statistical modeling of extreme natural hazards is becoming increasingly important due to climate change, whose effects have been increasingly visible throughout the last decades. It is thus crucial to understand the dependence structure of rare, high-impact events over space and time for realistic risk assessment. For spatial extremes, max-stable processes have played a central role in modeling block maxima. However, the spatial tail dependence strength is persistent across quantile levels in those models, which is often not realistic in practice. This lack of flexibility implies that max-stable processes cannot capture weakening dependence at increasingly extreme levels, resulting in a drastic overestimation of joint tail risk. 
Thursday, March 31, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2325
We introduce ProxSkip, a surprisingly simple and provably efficient method for minimizing the sum of a smooth (f) and an expensive nonsmooth proximable (ψ) function.
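A hedged sketch of the idea, assuming the update rule from the ProxSkip paper as I recall it (gradient step corrected by a control variate h, with the expensive prox evaluated only with probability p); the toy problem and step sizes are my own choices:

```python
# Illustrative ProxSkip sketch: minimize f(x) + psi(x), calling prox_psi
# only with probability p per iteration (the "skipped" prox).
import numpy as np

def proxskip(grad_f, prox_psi, x0, gamma, p, n_iters, rng):
    x, h = x0.copy(), np.zeros_like(x0)       # h: control variate
    for _ in range(n_iters):
        x_hat = x - gamma * (grad_f(x) - h)   # corrected gradient step
        if rng.random() < p:                  # rare, expensive prox step
            x_new = prox_psi(x_hat - (gamma / p) * h, gamma / p)
        else:                                 # cheap step: skip the prox
            x_new = x_hat
        h = h + (p / gamma) * (x_new - x_hat) # control-variate update
        x = x_new
    return x

# Toy problem: f(x) = 0.5*||x - b||^2 (smooth), psi(x) = ||x||_1 (proximable).
b = np.array([3.0, -0.2, 1.5])
grad_f = lambda x: x - b
soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)  # prox of t*||.||_1

x = proxskip(grad_f, soft, np.zeros_like(b), gamma=0.5, p=0.2,
             n_iters=2000, rng=np.random.default_rng(1))
# Closed-form minimizer is the soft-threshold of b at 1: [2.0, 0.0, 0.5]
```

The point of the control variate h is that skipping the prox introduces no bias: the iterates still converge to the exact minimizer.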
Thursday, March 17, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2325
The Maxwell-Stefan system is a system of equations commonly used to describe diffusion processes in multi-component systems. In this talk, I will (i) describe the modeling of multi-component systems, which leads to extensions of the compressible Euler system with mass and thermal diffusion; (ii) describe how the Maxwell-Stefan system emerges in the high-friction limit of multi-component Euler flows; and (iii) discuss some mathematical questions that this model raises, as well as the construction of numerical schemes for the Maxwell-Stefan system associated with the minimization of frictional dissipation.
Thursday, March 10, 2022, 12:00
- 13:00
KAUST
In recent years, machine learning has proven to be efficient in solving various physical problems through data-driven approaches. For example, in wave physics, models based on analytical and numerical schemes rely on intensive trial-and-error tuning of material (and geometrical) parameters to obtain 'on-demand' wave properties, which requires a deep understanding of the physics and is computationally expensive. As a result, it is desirable to develop intelligent models that learn the bidirectional mapping between different physical quantities and automate technological device design. In this presentation, I will discuss novel generative models for forward and inverse predictions that outperform human performance. In particular, I will show how machine learning can be used to design broadband acoustic cloaks and unidirectional non-Hermitian structures, and to solve the inverse scattering problem of shape recognition.
Monday, March 07, 2022, 15:00
- 17:00
KAUST
This thesis focuses on the use of multilevel Monte Carlo methods to achieve optimal error-versus-cost performance for statistical computations in hidden Markov models, as well as for unbiased estimation, in four settings: nonlinear filtering, unbiased filtering, unbiased estimation of the Hessian, and continuous linear Gaussian filtering.
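As a generic illustration of the multilevel idea (not the thesis's estimators), the telescoping sum E[P_L] = E[P_0] + Σ_l E[P_l - P_{l-1}] can be sketched with coupled coarse/fine Euler discretizations of a toy SDE; all parameters below are illustrative:

```python
# Multilevel Monte Carlo sketch for E[X_T] of geometric Brownian motion,
# coupling each fine Euler path with a coarse path driven by the same noise.
import numpy as np

def euler_pair(rng, n_paths, n_fine, T=1.0, x0=1.0, mu=0.05, sigma=0.2):
    """Coupled fine (n_fine steps) and coarse (n_fine//2 steps) Euler paths."""
    dt = T / n_fine
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_fine))
    xf = np.full(n_paths, x0)
    xc = np.full(n_paths, x0)
    for k in range(n_fine):
        xf = xf + mu * xf * dt + sigma * xf * dW[:, k]
    if n_fine > 1:
        for k in range(0, n_fine, 2):        # coarse path sums two fine increments
            dWc = dW[:, k] + dW[:, k + 1]
            xc = xc + mu * xc * 2 * dt + sigma * xc * dWc
    return xf, xc

def mlmc(rng, L=5, n_paths=20_000, g=lambda x: x):
    est = 0.0
    for l in range(L + 1):
        xf, xc = euler_pair(rng, n_paths, 2 ** l)
        if l == 0:
            est += g(xf).mean()              # base level
        else:
            est += (g(xf) - g(xc)).mean()    # correction level, small variance
    return est

est = mlmc(np.random.default_rng(2))
print(est)   # for this toy model, E[X_T] = exp(0.05), roughly 1.051
```

The correction levels have small variance because coupled paths stay close, so most samples can be spent on the cheap coarse levels; this is the source of the error-versus-cost gains.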
Thursday, March 03, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2325
Dynamic programming is an efficient technique to solve optimization problems. It is based on decomposing the initial problem into simpler ones and solving these sub-problems beginning from the simplest ones. A conventional dynamic programming algorithm returns an optimal object from a given set of objects. We developed extensions of dynamic programming which allow us (i) to describe the set of objects under consideration, (ii) to perform a multi-stage optimization of objects relative to different criteria, (iii) to count the number of optimal objects, (iv) to find the set of Pareto optimal points for the bi-criteria optimization problem, and (v) to study the relationships between two criteria. The considered applications include optimization of decision trees and decision rule systems as algorithms for problem-solving, as ways for knowledge representation, and as classifiers, optimization of element partition trees for rectangular meshes which are used in finite element methods for solving PDEs, and multi-stage optimization for such classic combinatorial optimization problems as matrix chain multiplication, binary search trees, global sequence alignment, and shortest paths.
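One of the classic applications cited above, matrix chain multiplication, shows the conventional dynamic programming that the talk's extensions generalize: solve sub-chains from shortest to longest and combine them.

```python
# Conventional dynamic programming: minimal scalar multiplications
# needed to compute a matrix product A_0 A_1 ... A_{n-1}.
def matrix_chain_order(dims):
    """dims[i] x dims[i+1] are the dimensions of matrix i.
    Returns the minimal number of scalar multiplications."""
    n = len(dims) - 1                      # number of matrices
    cost = [[0] * n for _ in range(n)]     # cost[i][j]: best for A_i..A_j
    for length in range(2, n + 1):         # sub-chain length, shortest first
        for i in range(n - length + 1):
            j = i + length - 1
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)       # split position
            )
    return cost[0][n - 1]

print(matrix_chain_order([10, 30, 5, 60]))   # (AB)C beats A(BC): 4500
```

The extensions described in the talk operate on the same kind of sub-problem structure, but keep (and count, rank, or filter) sets of optimal objects rather than a single optimum.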
Thursday, February 24, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2325
Spatially misaligned data are becoming increasingly common due to advances in both data collection and management in a wide range of scientific disciplines including the epidemiological, ecological and environmental fields. Here, we present a Bayesian geostatistical model for fusion of data obtained at point and areal resolutions. The model assumes that underlying all observations there is a spatially continuous variable that can be modeled using a Gaussian random field process.
Sunday, February 20, 2022, 16:00
- 17:30
Auditorium between B4 and B5, L0
Rare, low-probability events often lead to the biggest impacts. The goal of the Extreme Statistics (extSTAT) research group at KAUST is to develop cutting-edge statistical approaches for modeling, predicting and quantifying risks associated with these extreme events in complex systems arising in various scientific fields, such as climate science and finance. In particular, the work that we develop and continue to refine has direct potential impact for climate scientists and related stakeholders, such as engineers and insurers, who have realized that under climate change, the greatest environmental, ecological, and infrastructural risks and damages are often caused by changes in the intensity, frequency, spatial extent, and persistence of extreme events, rather than changes in their average behavior. However, while datasets are often massive in modern-day applications, extreme events are always scarce by nature. This makes it very challenging to provide reliable risk assessment and prediction, especially when extrapolation to yet-unseen levels is required. To overcome these existing limitations, the extSTAT research group develops novel methods that transcend classical extreme-value theory to build new resilient statistical models, as well as computationally efficient inference methods, which improve the prediction of rare events in complex, high-dimensional, spatio-temporal, non-stationary settings. In this talk, I will provide an overview of my group's recent research activities and future directions, with a focus on core statistical methodology contributions.
The technical part of the talk will describe selected research highlights, which include (but are not limited to) the development of new flexible sub-asymptotic models applied to assessing contagion risk among leading cryptocurrencies, the development of a novel low-rank spatial modeling framework applied to estimating extreme hotspots in high-resolution Red Sea surface temperature data, and the development of specialized spatio-temporal point process models applied to predicting devastating rainfall-induced landslides in a region of Italy. I will conclude my talk with an outlook on my future research plans. Motivated by methodological obstacles that arise with “big models” for complex extremes data, as well as new substantive challenges in collaborative work at KAUST, we will embark on the development of fundamentally superior models that have an intrinsically sparse probabilistic structure, as well as new "hybrid" methods that combine the strength of (parametric) models from extreme-value theory with the pragmatism and predictive power of (nonparametric) machine learning algorithms, thus opening the door to interpretable and “extreme-ly” accurate predictive models for rare events in unprecedented dimensions.
Thursday, February 17, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2325
Biological systems are distinguished by their enormous complexity and variability. That is why mathematical modelling and computational simulation of these systems are very difficult, particularly for detailed models based on first principles. The difficulties start with geometric modelling, which needs to extract basic structures from highly complex and variable phenotypes while also taking the statistical variability into account.
Prof. Mats Julius Stensrud, Department of Mathematics, EPFL
Wednesday, February 16, 2022, 16:00
- 17:00
Building 9, Level 2, Room 2322
A competing event is any event that makes it impossible for the outcome of interest to occur. The presence of competing events requires us to be careful about the interpretation of classical causal estimands. In particular, the average treatment effect captures effects through the competing event, pathways that may not be of primary interest. As a solution, we suggest the separable effect, inspired by Robins and Richardson’s extended graphical approach. We will give criteria that allow different interpretations of the separable effects and present identification conditions that can be evaluated in causal graphs.
Thursday, February 10, 2022, 12:00
- 13:00
KAUST
Advances in imaging technology have given neuroscientists unprecedented access to examine various facets of how the brain “works”. Brain activity is complex. A full understanding of brain activity requires careful study of its multi-scale spatial-temporal organization (from neurons to regions of interest; and from transient events to long-term temporal dynamics). Motivated by these challenges, we will explore some characterizations of dependence between components of a multivariate time series and then apply these to the study of brain functional connectivity.  This is potentially interesting for brain scientists because functional brain networks are associated with cognitive function and mental and neurological diseases.
Benjamin L. Gerard, Postdoctoral Scholar, University of California, UC Observatories
Wednesday, February 02, 2022, 16:30
- 17:30
KAUST
Over 4000 exoplanets - planets beyond the Solar System - have been discovered since the first Nobel prize-winning exoplanet detection around a Sun-like star in 1995. The majority of these exoplanets have been detected by indirect methods, inferring the presence of the exoplanet by observing the star.
Konrad Grabiszewski, Instructional Professor, Applied Mathematics and Computational Sciences
Thursday, January 27, 2022, 12:00
- 13:00
KAUST
Backward induction, the cornerstone of dynamic game theory, is the classical algorithm applied to solve finite dynamic games with perfect and complete information. While theoretically sound and beautiful in its simplicity, backward induction does not perform so well when it comes to predicting human behavior. The objective of this seminar is twofold. First, we will understand what backward induction is and how to apply it on game-theoretic trees. Second, we will answer the question of whether backward induction is a good model of how people make choices in dynamic interactions.
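The algorithm the seminar starts from can be sketched on a toy two-player tree; the node encoding below is my own, purely illustrative choice (a leaf is a payoff pair, an internal node names the mover and lists its children):

```python
# Backward induction on a finite perfect-information game tree:
# solve subgames at the leaves first, then let each mover pick the
# child whose solved payoff is best for them.
def backward_induction(node):
    """Returns (payoffs, plan): the equilibrium payoff pair and the
    sequence of (player, chosen move) along the solved path."""
    if isinstance(node[0], str):              # internal node: (player, children)
        player, children = node
        idx = 0 if player == "P1" else 1      # which payoff the mover maximizes
        best_payoff, best_plan, best_move = None, None, None
        for move, child in enumerate(children):
            payoff, plan = backward_induction(child)
            if best_payoff is None or payoff[idx] > best_payoff[idx]:
                best_payoff, best_plan, best_move = payoff, plan, move
        return best_payoff, [(player, best_move)] + best_plan
    return node, []                           # leaf: payoff pair (p1, p2)

# A two-stage game: P1 moves first, then P2 responds.
tree = ("P1", [
    ("P2", [(3, 1), (0, 0)]),   # subgame after P1's move 0
    ("P2", [(1, 2), (2, 4)]),   # subgame after P1's move 1
])
payoff, plan = backward_induction(tree)
# P2 would answer move 1 with payoff (2, 4), so P1 plays move 0: outcome (3, 1)
```

Note that P1 settles for (3, 1) rather than risking the right branch, exactly the kind of forward-looking reasoning that, as the seminar discusses, real players often fail to apply.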
Professor Peter Bühlmann, Statistics and Mathematics, ETH Zürich
Tuesday, December 07, 2021, 15:30
- 16:30
Building 9, Level 2, Room 2325
Hidden confounding is a severe problem when interpreting regression or causal parameters, and it may also lead to poor generalization performance for prediction. Adjusting for unobserved confounding is important but challenging when based on observational data only.
Professor Peter Bühlmann, Statistics and Mathematics, ETH Zürich
Tuesday, December 07, 2021, 12:00
- 13:00
Building 9, Level 2, Room 2325
Reliable, robust and interpretable machine learning is a big emerging theme in data science and artificial intelligence, complementing the development of pure black box prediction algorithms. Looking through the lens of statistical causality and exploiting a probabilistic invariance property opens up new paths and opportunities for enhanced interpretation, robustness and external validity, with wide-ranging prospects for various applications.
Jinyuan Liu, PhD Student, Biostatistics, University of California, USA
Sunday, November 28, 2021, 08:30
- 09:30
KAUST
Breakthroughs such as high-throughput sequencing are generating a wealth of high-dimensional data that pose challenges for both statistical analysis and interpretation.
Thursday, November 25, 2021, 12:00
- 13:00
KAUST
When constructing high-order multi-dimensional finite volume schemes for hyperbolic conservation laws, the corresponding high-order reconstructions are commonly performed in characteristic spaces to eliminate spurious oscillations as much as possible.
Thursday, November 11, 2021, 12:00
- 13:00
KAUST
In the classical theory of the finite element approximation of elliptic partial differential equations, based on standard Galerkin schemes, the energy norm of the error decays with the same rate of convergence as the best finite element approximation, without any additional requirements on the involved spaces.
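The quasi-optimality property described above is classically expressed by Céa's lemma; in standard notation, for a continuous (constant $M$) and coercive (constant $\alpha$) bilinear form $a(\cdot,\cdot)$ on $V$ and the Galerkin solution $u_h$ in the finite element space $V_h \subset V$:

```latex
\| u - u_h \|_V \;\le\; \frac{M}{\alpha} \, \inf_{v_h \in V_h} \| u - v_h \|_V ,
```

so the error decays at the rate of the best approximation from $V_h$, with no extra assumptions on the spaces involved.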
Sunday, November 07, 2021, 15:00
- 16:00
B1, L4, R4102
Statistical analysis based on quantiles is more comprehensive and flexible than mean-based methods, and it is not sensitive to outliers. Studies of joint disease mapping have focused on mean regression; that is, they study the correlation or dependence between the means of the diseases using standard regression. However, sometimes one disease limits the occurrence of another disease. In this case, the dependence between the two diseases will lie not in the means but in different quantiles; thus, the analysis will consider a joint disease mapping of a high quantile for one disease with a low quantile of the other disease.