Academic Biography

Jinchao Xu is Director of the KAUST Innovation Hub in Shenzhen, Professor of Applied Mathematics and Computational Sciences at King Abdullah University of Science and Technology (KAUST), Director of the KAUST-SRIBD Joint Lab for Scientific Computing and Machine Learning, and Verne M. Willaman Professor of Mathematics and Director of the Center for Computational Mathematics and Applications at Penn State University. He is a Fellow of the Society for Industrial and Applied Mathematics (SIAM), the American Mathematical Society (AMS), the American Association for the Advancement of Science (AAAS), and the European Academy of Sciences (EURASC), and a Member of the Academia Europaea. He was an invited speaker at the International Congress on Industrial and Applied Mathematics in 2007 and at the International Congress of Mathematicians in 2010.

Xu serves on the editorial boards of many major journals in computational mathematics and has co-edited numerous conference proceedings and research monographs. He also serves on various college and departmental committees and organizes colloquia and seminars. He has organized, or served on the scientific committee of, more than 65 international conferences, workshops, and summer schools.

Xu received his bachelor's degree from Xiangtan University in 1982, his master's degree from Peking University in 1984, and his doctorate from Cornell University in 1989. He joined the Pennsylvania State University (Penn State) in 1989 as an assistant professor of mathematics and was promoted to associate professor in 1991 and to professor in 1995. He was named Distinguished Professor of Mathematics in 2007, the Francis R. and Helen M. Pentz Professor of Science in 2010, and the Verne M. Willaman Professor of Mathematics in 2015.

Research Interests and Contributions

Xu advocates the idea that practical applications can go hand in hand with theoretical completeness and beauty. He studies numerical methods for partial differential equations and for big data, especially finite element methods, multigrid methods, and deep neural networks, encompassing their theoretical analysis, algorithmic development, and practical applications. He is well known for groundbreaking work on designing and analyzing fast methods for finite element discretization and for solving the resulting large-scale systems of equations, including several fundamental theories and algorithms that bear his name: the Bramble-Pasciak-Xu (BPX) preconditioner, the Hiptmair-Xu (HX) preconditioner, the Xu-Zikatanov (XZ) identity, and the Morley-Wang-Xu (MWX) element. The BPX preconditioner is one of the two fundamental multigrid algorithms for solving large-scale discretized partial differential equations. The HX preconditioner, featured by the U.S. Department of Energy in 2008 as one of the top 10 breakthroughs in computational science in recent years, is one of the most efficient solvers for the numerical simulation of electromagnetic problems. The XZ identity is a basic technical tool for the design and analysis of iterative methods such as the multigrid method and the method of alternating projections. The MWX element is the only known class of finite elements constructed uniformly for elliptic partial differential equations of any order in any spatial dimension. Xu has published nearly 200 research papers, including his widely cited SIAM Review paper “Iterative Methods by Space Decomposition and Subspace Correction” (1992) and, more recently, his Acta Numerica paper “Algebraic Multigrid Methods” (2017).
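The flavor of the space decomposition and subspace correction framework, and of the BPX preconditioner in particular, can be conveyed with a small sketch. The following is a minimal illustration, not taken from Xu's papers: the model problem (1D Poisson), the diagonal level-wise scaling, and all function names are choices made here for demonstration. It builds a BPX-type additive multilevel preconditioner and uses it inside preconditioned conjugate gradients.

```python
import numpy as np

def laplacian_1d(n):
    """P1 stiffness matrix for -u'' = f on (0,1) with n interior nodes, h = 1/(n+1)."""
    h = 1.0 / (n + 1)
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1))
    return A / h

def prolongation(nc):
    """Linear interpolation from nc interior nodes to the next finer grid (2*nc + 1 nodes)."""
    nf = 2 * nc + 1
    P = np.zeros((nf, nc))
    for j in range(nc):
        P[2 * j + 1, j] = 1.0   # coarse node coincides with a fine node
        P[2 * j, j] += 0.5      # in-between fine nodes take the average
        P[2 * j + 2, j] += 0.5
    return P

def bpx_apply(r, Ps, levels):
    """Additive multilevel action  B r = sum_l (h_l/2) P_l P_l^T r  (h_l/2 ~ diag(A_l)^{-1})."""
    z = np.zeros_like(r)
    for l, P in zip(levels, Ps):
        h_l = 1.0 / 2 ** l
        z += (h_l / 2.0) * (P @ (P.T @ r))
    return z

def pcg(A, b, apply_M, tol=1e-10, maxit=200):
    """Preconditioned conjugate gradient method."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = apply_M(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it + 1
        z = apply_M(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

# Nested grids: level l has n_l = 2**l - 1 interior nodes; P_l interpolates level l -> finest.
L = 7
levels = list(range(1, L + 1))
Ps = []
for l in levels:
    P = np.eye(2 ** l - 1)
    for k in range(l, L):
        P = prolongation(2 ** k - 1) @ P
    Ps.append(P)

n = 2 ** L - 1
A = laplacian_1d(n)
b = np.ones(n) / (n + 1)  # right-hand side for f = 1
x, iters = pcg(A, b, lambda r: bpx_apply(r, Ps, levels))
```

The key design point of this family of methods is that the preconditioner sums cheap, diagonally scaled corrections from every grid level simultaneously, which keeps the number of CG iterations essentially independent of the mesh size.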

In recent years, Xu has focused on deep learning models, training algorithms, and their applications. With his collaborators, he has obtained a number of results on both the mathematical and practical aspects of deep learning. He showed that the most popular deep neural networks (DNNs), those using the ReLU activation function, generate exactly the linear finite element functions that have been used in numerical PDEs for decades. He showed that the most popular DNN for image classification, the convolutional neural network (CNN), is directly related to the geometric multigrid method used in numerical PDEs; in particular, he demonstrated that a minor modification of the multigrid algorithm leads to a special CNN, known as MgNet, that can compete with and sometimes even outperform standard CNNs in various applications, including image classification, detection, segmentation, and forecasting. In a sequence of papers, he has solved several open problems and obtained the best results to date on the approximation properties of neural network functions, especially those using ReLU and its powers as activation functions. His recent theoretical results, supported by numerical experiments, demonstrate that stochastic gradient descent (SGD) and its variants are not appropriate training algorithms when deep learning is used for the numerical solution of PDEs. He has further studied, improved, and applied a class of greedy algorithms to the optimization problems that arise from using neural networks for numerical PDEs. Most recently, he has been leading a collaborative effort to develop an Arabic large language model (LLM), known as AceGPT, which has demonstrated highly competitive performance among open-source Arabic LLMs.
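The connection between ReLU networks and linear finite elements can be checked concretely in one dimension. The sketch below is an illustration constructed here (the function names are hypothetical, not from Xu's papers): it writes the classical P1 "hat" basis function exactly as a one-hidden-layer ReLU network with three neurons.

```python
import numpy as np

def relu(x):
    """The ReLU activation function."""
    return np.maximum(x, 0.0)

def hat_fe(x):
    """P1 nodal basis ("hat") function on [0, 1] with its peak at x = 1/2."""
    return np.maximum(0.0, 1.0 - np.abs(2.0 * x - 1.0))

def hat_relu(x):
    """The same hat function expressed as a 3-neuron, one-hidden-layer ReLU network:
       hat(x) = 2 ReLU(x) - 4 ReLU(x - 1/2) + 2 ReLU(x - 1)."""
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

# The two representations agree everywhere, including outside [0, 1].
xs = np.linspace(-0.5, 1.5, 1001)
max_diff = np.max(np.abs(hat_fe(xs) - hat_relu(xs)))
```

Since every continuous piecewise linear function is a linear combination of such hat functions, one-hidden-layer ReLU networks in 1D span exactly the linear finite element spaces; the general multidimensional statement requires deeper networks.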

Awards and Distinctions

  • Member of the Academia Europaea, 2023
  • Fellow of the European Academy of Sciences, 2022
  • Fellow of the American Association for the Advancement of Science (AAAS), 2019
  • Fellow of the inaugural class of the American Mathematical Society (AMS), 2012
  • Most Outstanding Chinese Doctoral Dissertation Supervisor Award, 2011
  • SIAM Fellow, 2011
  • 45-minute Invited Speaker, International Congress of Mathematicians, Hyderabad, India, 2010
  • Plenary speaker, 6th International Congress on Industrial and Applied Mathematics, Zurich, 2007
  • Research Award for National Outstanding Youth (Class B), China, 2006
  • Alexander von Humboldt Research Award for Senior U.S. Scientists, Germany, 2005
  • One of the most cited mathematicians in the world, as identified by Thomson Reuters (formerly ISI)
  • Joint work with Ralf Hiptmair cited as one of the top 10 breakthroughs in computational science in the 2008 DOE report “Recent Significant Advances in Computational Science”