KAUST CEMSE AMCS/STAT Graduate Seminar

Derivative-Free Methods in Nonconvex Optimization With and Without Noise

This talk addresses nonconvex derivative-free optimization problems in which only values of smooth objective functions, or noisy approximations of them, are available. General derivative-free methods are proposed for minimizing differentiable (not necessarily convex) functions with globally Lipschitz continuous gradients, where the accuracy of the approximate gradients interacts with the stepsizes and the exact gradient values.

Overview

Analysis in the noiseless case guarantees convergence of the gradient sequence to the origin, as well as global convergence of the sequence of iterates with constructive convergence rates under additional assumptions. In the noisy case, without any information about the noise level, the designed algorithms reach near-stationary points while providing estimates of the required number of iterations and function evaluations. To address functions with locally Lipschitzian gradients, two algorithms are introduced for the noiseless and noisy cases, respectively. The noiseless version is based on standard backtracking linesearch and achieves fundamental convergence properties similar to those in the globally Lipschitzian case. The noisy version is based on a novel bidirectional linesearch and is shown to reach near-stationary points after a finite number of iterations when the Polyak-Łojasiewicz inequality is imposed. Numerical experiments on a diverse set of test problems demonstrate the robustness of the newly proposed algorithms in comparison with other finite-difference-based schemes and production-ready codes from the SciPy library. The talk is based on joint work with P. D. Khanh and D. B. Tran.
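The talk's specific algorithms are not reproduced here, but the basic ingredients the abstract mentions, finite-difference gradient approximation combined with backtracking (Armijo) linesearch, can be sketched as follows. This is a generic illustration in Python; all function names, parameter values, and the test problem are our own choices, not taken from the talk.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Forward-difference approximation of the gradient of f at x.
    Uses only function values, so no derivative information is needed."""
    g = np.zeros_like(x)
    fx = f(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def dfo_backtracking(f, x0, alpha0=1.0, beta=0.5, c=1e-4,
                     tol=1e-5, max_iter=500):
    """Derivative-free descent: finite-difference gradients with
    Armijo backtracking linesearch. A minimal sketch of the general
    idea only, not the algorithms presented in the talk."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = fd_gradient(f, x)
        if np.linalg.norm(g) < tol:
            break  # approximate gradient is near the origin
        alpha = alpha0
        # Backtrack until the Armijo sufficient-decrease condition holds
        while f(x - alpha * g) > f(x) - c * alpha * (g @ g):
            alpha *= beta
            if alpha < 1e-12:
                break
        x = x - alpha * g
    return x

# A simple nonconvex (double-well) test function with minimizers at (+/-1, 0)
double_well = lambda x: (x[0]**2 - 1)**2 + x[1]**2
x_star = dfo_backtracking(double_well, [0.5, 0.5])
```

Starting from (0.5, 0.5), the iterates descend into the basin of the minimizer at (1, 0). In the noisy setting discussed in the talk, the finite-difference step `h` could no longer be taken arbitrarily small, since the approximation error then interacts with the noise level, which is part of what the proposed methods are designed to handle.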

Presenters

Boris Mordukhovich, Distinguished University Professor, Wayne State University

Brief Biography

Boris Mordukhovich is an American mathematician who was born in the Soviet Union and emigrated to the U.S.A. in 1988. He is currently Distinguished University Professor and Lifetime Scholar of the Academy of Scholars at Wayne State University, of which he was Vice President (2009–2010) and President (2010–2011). Boris is recognized for his research in nonlinear analysis, optimization, and control theory, and is one of the founders of modern variational analysis. He developed constructions of generalized differentiation that now bear his name. His theory and its applications in many fields have been summarized in a two-volume monograph. He has published more than 500 journal papers and several books. Boris is an AMS Fellow of the Inaugural Class, a SIAM Fellow, and a recipient of many international awards and honors, including Doctor Honoris Causa degrees from the National Sun Yat-sen University in Taiwan, the University of Messina, the University of Alicante in Spain, Babeș-Bolyai University in Romania, and the Vietnam Academy of Science and Technology. He is the Founding Editor (2008) and was a co-Editor-in-Chief (2009–2014) of the journal Set-Valued and Variational Analysis. Since 2016 he has been a co-Editor-in-Chief of Applied Analysis and Optimization, and since 2021 an area editor of the Journal of Optimization Theory and Applications. He was Chair (2012–2015) of the International Working Group on Generalized Convexity and Monotonicity. In 2016, he was elected to the Accademia Peloritana dei Pericolanti in Italy, and in 2021 he became a Foreign Member of the National Academy of Sciences of Ukraine. He is on the list of Highly Cited Researchers in Mathematics and was named an Inaugural ScholarGPS Highly Ranked Scholar in Mathematical Optimization in 2024.