Monday, November 27, 2023, 11:30 - 12:30
Building 9, Level 2, Room 2325, Hall 2
We develop a derivative-free global minimization algorithm based on a gradient flow of a relaxed functional. We combine relaxation ideas, Monte Carlo methods, and resampling techniques with advanced error estimates. Compared with well-established algorithms, the proposed method achieves a high success rate on a broad class of functions, including convex, non-convex, and non-smooth objectives, while keeping the number of evaluations of the objective function small.
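The abstract does not spell out the scheme, but the combination of a gradient flow of a relaxed functional with Monte Carlo approximation and resampling is reminiscent of consensus-style particle methods. The following is a minimal illustrative sketch under that assumption, not the authors' algorithm: a particle ensemble drifts toward a Gibbs-weighted average (a Monte Carlo surrogate for the minimizer of a relaxed functional), using only evaluations of the objective and no gradients. All names and parameter choices here are hypothetical.

```python
import numpy as np

def minimize_consensus(f, dim, n_particles=100, n_steps=500, dt=0.05,
                       beta=30.0, sigma=0.5, seed=0):
    """Illustrative consensus-style derivative-free minimizer (sketch).

    f must accept an (n_particles, dim) array and return an
    (n_particles,) array of objective values; no gradients are used.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-3.0, 3.0, size=(n_particles, dim))
    for _ in range(n_steps):
        fx = f(x)                               # objective evaluations only
        w = np.exp(-beta * (fx - fx.min()))     # Gibbs weights, shifted for stability
        m = (w[:, None] * x).sum(axis=0) / w.sum()  # weighted consensus point
        noise = rng.standard_normal(x.shape)
        # drift toward the consensus point plus noise that vanishes at consensus
        x += -dt * (x - m) + sigma * np.sqrt(dt) * (x - m) * noise
    fx = f(x)
    best = fx.argmin()
    return x[best], fx[best]
```

Because the noise is scaled by the distance to the consensus point, the ensemble concentrates as it approaches a minimizer; the temperature-like parameter `beta` controls how sharply the weights favor the best particles.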