SCML Seminar on Training Algorithms, Spring 2024

As the demand for data-driven solutions and complex problem-solving continues to grow, the synergy between scientific computing and machine learning becomes increasingly important. The SCML seminar series serves several key objectives:

  • To keep up with cutting-edge training algorithms and theories in machine learning.
  • To establish connections between machine learning and scientific computing.
  • To share recent progress on research within the SCML group at KAUST.

Here is the list of speakers and titles:

Date and Time: TBA
Place: B1-L0-0118 and https://kaust.zoom.us/j/4406489644
Speaker: Boou Jiang (KAUST)
Title: TBA

Date and Time: April 22nd, 2024, 16:00-17:00
Place: https://kaust.zoom.us/j/4406489644
Speaker: Shuhao Cao (University of Missouri–Kansas City)
Title: Structure-conforming Operator Learning via Transformers

Date and Time: February 14th, 2024, 16:00-17:00
Place: https://kaust.zoom.us/j/4406489644
Speaker: Yahong Yang (Penn State University)
Title: Approximation and Generalization Errors in Deep Neural Networks for Sobolev Spaces Measured by Sobolev Norms

Date and Time: January 17th, 2024, 10:00-11:00
Place: B1-L0-0118 and https://kaust.zoom.us/j/4406489644
Speaker: Bang An (KAUST)
Title: Unveiling Insights from "Gradient Descent Converges Linearly for Logistic Regression on Separable Data"

Date and Time: January 10th, 2024, 10:00-11:00
Place: https://kaust.zoom.us/j/4406489644
Speaker: Jun Sur Richard Park (Korea Institute for Advanced Study)
Title: Physics-informed Neural Networks for Learning the Homogenized Coefficients of Multiscale Elliptic Equations

Date and Time: January 3rd, 2024, 16:00-17:00
Place: https://kaust.zoom.us/j/4406489644
Speaker: Mingchao Cai (Morgan State University)
Title: A Combination of Physics-informed Neural Networks with the Fixed-stress Splitting Iteration for Solving Biot's Model