The Convex Gaussian Min-Max Theorem: A Powerful Tool for the Analysis of Regularized Convex Optimization Problems



In modern large-scale inference problems, the dimension of the signal to be estimated is comparable to, or even larger than, the number of available observations. Yet the signal of interest typically possesses low-dimensional structure, due to sparsity, low-rankness, a finite alphabet, etc. Non-smooth regularized convex optimization problems are powerful tools for the recovery of such structured signals from noisy linear measurements. Research has recently shifted to the performance analysis of these optimization tools and the optimal tuning of their hyper-parameters in high-dimensional settings.
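As a concrete instance of the recovery problems described above, the following is a minimal sketch (our own illustrative setup, not from the talk) of recovering a sparse signal from noisy linear measurements via the LASSO, solved with plain proximal-gradient (ISTA) iterations; all dimensions and parameter values below are arbitrary choices for illustration.

```python
import numpy as np

# Illustrative sparse recovery: estimate a k-sparse x0 from m < n noisy
# linear measurements y = A x0 + noise, via the LASSO
#   min_x  0.5 * ||A x - y||^2 + lam * ||x||_1
rng = np.random.default_rng(0)
n, m, k = 100, 40, 5                     # signal dim, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
y = A @ x0 + 0.01 * rng.standard_normal(m)

lam = 0.05                               # regularization hyper-parameter
L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(2000):                    # ISTA: gradient step + soft threshold
    g = A.T @ (A @ x - y)
    x = x - g / L
    x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)

err = np.linalg.norm(x - x0) / np.linalg.norm(x0)  # relative recovery error
```

The CGMT framework discussed in the talk characterizes precisely how an error such as `err` behaves as `m` and `n` grow proportionally, as a function of the sampling ratio, the noise level, and the regularization parameter.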

One powerful performance analysis framework is the Convex Gaussian Min-Max Theorem (CGMT). The CGMT is based on Gaussian process methods and is a strong, tight version of the classical Gordon comparison inequality. In this talk, we review the CGMT and illustrate its application to the error analysis of some regularized convex optimization problems.
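For reference, the standard form of the theorem from the CGMT literature (Thrampoulidis, Oymak, and Hassibi) compares a primary optimization (PO), whose cost involves the full Gaussian matrix, with a simpler auxiliary optimization (AO) involving only two independent Gaussian vectors; the notation below is ours.

```latex
% (PO): G has i.i.d. standard normal entries.
% (AO): g, h are independent standard Gaussian vectors.
\begin{align}
\text{(PO)}\quad \Phi(\mathbf{G})
  &= \min_{\mathbf{w}\in\mathcal{S}_{\mathbf{w}}}\;
     \max_{\mathbf{u}\in\mathcal{S}_{\mathbf{u}}}\;
     \mathbf{u}^\top \mathbf{G}\,\mathbf{w} + \psi(\mathbf{w},\mathbf{u}),\\
\text{(AO)}\quad \phi(\mathbf{g},\mathbf{h})
  &= \min_{\mathbf{w}\in\mathcal{S}_{\mathbf{w}}}\;
     \max_{\mathbf{u}\in\mathcal{S}_{\mathbf{u}}}\;
     \|\mathbf{w}\|_2\,\mathbf{g}^\top\mathbf{u}
     + \|\mathbf{u}\|_2\,\mathbf{h}^\top\mathbf{w}
     + \psi(\mathbf{w},\mathbf{u}).
\end{align}
```

When the sets are convex and compact and $\psi$ is convex in $\mathbf{w}$ and concave in $\mathbf{u}$, the optimal costs of the two problems concentrate around the same value, so the far simpler (AO) can be analyzed in place of (PO).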

Brief Biography

Tareq Al-Naffouri received the B.S. degree from King Fahd University of Petroleum and Minerals, the M.S. degree from the Georgia Institute of Technology, and the Ph.D. degree from Stanford University, all in Electrical Engineering. He was a visiting scholar at the California Institute of Technology in 2005, 2006, and 2008, and a Fulbright Scholar at the University of Southern California in 2008. He is currently a Professor of Electrical & Computer Engineering at KAUST. His research interests lie in the areas of sparse, adaptive, and statistical signal processing and their applications to wireless communications and smart health/cities, as well as in network information theory and the applications of machine learning. He has over 300 publications in journals and conference proceedings and 25 issued/pending patents.