The capture of information about environments and processes is ubiquitous in modern society, from recording weather patterns to measuring traffic density and industrial process parameters. These measurements, when collated, provide data for predictive modelling and process optimization.
Determining exactly how many sensors to use and where best to place them, however, has significant implications both for the reliability and value of the information obtained and for the cost of the measurement system itself.
Quan Long, Marco Scavino and Raul Tempone from KAUST, in collaboration with Suojin Wang from Texas A&M University in the United States, have developed a computational method that can quickly derive an optimized ‘experimental design’ for complex, noisy systems with little available data [1].
“Experimental design is an important topic in engineering and science,” says Tempone. “It allows us to optimize the locations of sensors to achieve the best estimates and minimize uncertainties, especially for real, noisy measurements.”
Using an established statistical approach known as Bayesian inference, which combines measured data with prior contextual information in a mathematically rigorous framework, Tempone and his colleagues pooled their expertise in computational methods, statistics and the numerical analysis of partial differential equations. From this they developed a scheme that can compute an optimized design even for poorly understood or ‘low-rank’ systems, in which the data constrain only a few combinations of the unknown parameters.
“As physical systems become increasingly complex, the optimization of experimental design becomes more computationally intensive,” says Tempone. “Our fast method can be used in situations where a large number of parameters need to be estimated while data are sparse.”
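The idea of choosing sensor locations to maximize the information the measurements carry can be illustrated with a minimal sketch. The example below uses a simple linear-Gaussian model, for which the expected information gain of a design has a closed form; the toy forward model, candidate grid and greedy selection here are illustrative assumptions, not the authors' actual method.

```python
# Minimal sketch of Bayesian sensor placement in a linear-Gaussian model.
# Everything here (forward model, noise level, greedy search) is a toy
# assumption for illustration, not the fast method described in the article.
import numpy as np

def g(x):
    """Hypothetical sensitivity of a measurement at position x to the
    three unknown model parameters."""
    return np.array([1.0, x, x**2])

candidates = np.linspace(0.0, 1.0, 21)  # candidate sensor positions
sigma2 = 0.05**2                        # measurement noise variance
prior_cov = np.eye(3)                   # prior covariance of the parameters

def expected_information_gain(design):
    """Expected KL divergence from prior to posterior; exact for a
    linear forward model with Gaussian prior and noise."""
    G = np.array([g(x) for x in design])
    M = np.eye(len(design)) + G @ prior_cov @ G.T / sigma2
    return 0.5 * np.linalg.slogdet(M)[1]

# Greedy design: repeatedly add the sensor that most increases the gain.
design = []
for _ in range(3):
    best = max(candidates,
               key=lambda x: expected_information_gain(design + [x]))
    design.append(best)

print("selected sensor locations:", design)
```

Each added sensor can only increase the expected information gain, so the greedy loop trades measurement cost against information, which is the balance the article describes.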