By generalizing a classical statistical model and adapting it to analyze rainfall extremes in large datasets, researchers, including KAUST’s Raphaël Huser, have devised a more efficient and flexible analytical tool that promises to improve the prediction of flood risk and other extreme weather phenomena.
Rare extreme weather events, such as floods, extreme winds, high temperatures and drought, can be devastating, but predicting the frequency and severity of such events remains one of the key challenges in statistical science. Even large, long-term datasets covering extensive areas may include very few extreme events, making it exceptionally difficult to predict future events with accuracy.
“There are classically two ways to model extreme events: the ‘block maximum’ approach, where we look at the largest events in blocks of time, and the ‘threshold exceedance’ approach, which selects the top few percent of events across the entire timeframe of the dataset,” explains Huser, who undertook the work in collaboration with U.S.-based colleagues Gregory Bopp and Benjamin Shaby. “Previous work has developed new tools to apply the threshold exceedance approach; in this study we generalized a classical block maximum model for application to extreme precipitation.”
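As a rough illustration of the two approaches (not the authors’ code), the Python sketch below extracts yearly block maxima and threshold exceedances from a simulated daily rainfall series; the gamma-distributed data and the 99th-percentile threshold are assumptions chosen purely for the example.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-in for a daily rainfall record: 50 "years" of
# 365 daily totals, drawn from a right-skewed distribution.
n_years, days_per_year = 50, 365
daily_rain = rng.gamma(shape=0.3, scale=8.0, size=n_years * days_per_year)

# Block maximum approach: keep only the largest value in each yearly block.
block_maxima = daily_rain.reshape(n_years, days_per_year).max(axis=1)

# Threshold exceedance approach: keep every value above a high quantile.
threshold = np.quantile(daily_rain, 0.99)  # top 1% of all days
exceedances = daily_rain[daily_rain > threshold]

print(f"{len(block_maxima)} block maxima, {len(exceedances)} exceedances")
```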
The block maximum approach has a long tradition in the statistics of extremes, but it has a high computational cost that limits its application to the large-scale datasets now routinely acquired in weather prediction. This approach is also unable to capture the observed weakening of the dependence between conditions at nearby locations as events become more extreme.
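For context, the classical block maximum analysis fits a generalized extreme-value (GEV) distribution to the block maxima. Below is a minimal sketch using scipy, continuing the simulated block_maxima from the previous example; it is an illustration of the textbook univariate method, not the spatial max-stable model developed in the study.

```python
from scipy.stats import genextreme

# Classical block maximum model: fit a GEV distribution to the yearly
# maxima (here, the simulated block_maxima from the previous sketch).
# Note: scipy's shape parameter c equals minus the usual EVT shape xi.
shape, loc, scale = genextreme.fit(block_maxima)

# A typical use: the 100-year return level, i.e. the daily rainfall
# amount exceeded on average once per 100 blocks (years).
return_level_100 = genextreme.ppf(1 - 1 / 100, shape, loc, scale)
print(f"estimated 100-year return level: {return_level_100:.1f}")
```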
The team’s approach addresses both of these shortcomings by adapting a relatively inflexible, but computationally efficient, max-stable model using Bayesian inference. Bayesian inference is a statistical estimation approach that provides a natural way of incorporating expert opinion and accounting for various sources of variability.
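To make the idea of Bayesian inference concrete, here is a minimal Metropolis sampler for the GEV parameters of the simulated block maxima above. It sketches the general estimation principle only; the priors, proposal scale and parameterization are assumptions for illustration, not the hierarchical max-stable model fitted in the study.

```python
import numpy as np
from scipy.stats import genextreme, norm

def log_posterior(params, data):
    """Log prior plus log likelihood for GEV parameters (illustrative)."""
    shape, loc, log_scale = params
    scale = np.exp(log_scale)
    # Weakly informative priors, an assumption for this sketch; in a real
    # analysis, this is where expert opinion would enter.
    log_prior = (norm.logpdf(shape, 0.0, 0.5)
                 + norm.logpdf(loc, 0.0, 100.0)
                 + norm.logpdf(log_scale, 0.0, 10.0))
    log_lik = genextreme.logpdf(data, shape, loc, scale).sum()
    return log_prior + log_lik

def metropolis(data, n_iter=5000, step=0.05, seed=0):
    """Random-walk Metropolis over (shape, location, log-scale)."""
    rng = np.random.default_rng(seed)
    current = np.array([0.0, data.mean(), np.log(data.std())])
    current_lp = log_posterior(current, data)
    samples = []
    for _ in range(n_iter):
        proposal = current + step * rng.standard_normal(3)
        proposal_lp = log_posterior(proposal, data)
        # Accept with the usual Metropolis probability.
        if np.log(rng.uniform()) < proposal_lp - current_lp:
            current, current_lp = proposal, proposal_lp
        samples.append(current.copy())
    return np.array(samples)

# posterior = metropolis(block_maxima)  # using the simulated maxima above
```

In this framing, the priors are where expert opinion enters the analysis, and the spread of the posterior samples quantifies the estimation uncertainty, the two features of Bayesian inference highlighted above.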