Post-Doc Aku Jaakko Kammonen's participation at UNCECOMP 2023, the 5th International Conference on Uncertainty Quantification in Computational Science and Engineering

A postdoctoral fellow of our group, Dr. Aku Jaakko Kammonen, participated in the recently concluded UNCECOMP 2023, the 5th International Conference on Uncertainty Quantification in Computational Science and Engineering, where he presented a talk titled "Adaptive random Fourier features based on Metropolis sampling". The conference was held in Athens, Greece, on June 12-14, 2023.

Abstract:

The supervised learning problem of approximating a function f by a neural network with one hidden layer of width K is studied as a random Fourier features algorithm. In this setting the mean square loss problem is easy to solve, since for fixed, independently sampled frequencies it is convex in the amplitude parameters. It is also well known that the corresponding generalization error is bounded by a constant times 1/K. In my talk I will first show how the constant in this bound can be minimized by choosing a specific, optimal frequency density, and then how to approximately sample from this density using only the data and certain adaptive Metropolis steps. I will also show results for other activation functions.
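
To illustrate the idea described in the abstract, the following is a minimal Python/NumPy sketch of an adaptive random Fourier features loop: for fixed frequencies the amplitudes are obtained by a convex least-squares fit, and the frequencies are then updated by Metropolis steps whose acceptance ratio favors frequencies that carry large amplitudes. The toy target function, the per-frequency acceptance rule with exponent gamma, the proposal width delta, and all parameter values are illustrative assumptions for this sketch, not the exact algorithm or tuning of [1].

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: approximate a scalar function f on 1D inputs (illustrative choice).
def f(x):
    return np.sin(3 * x) * np.exp(-x**2)

N, K = 400, 64                 # number of data points, hidden-layer width
x = rng.uniform(-3, 3, N)
y = f(x)

def fit_amplitudes(omega, lam=1e-6):
    """Least-squares fit of the amplitudes for fixed frequencies.
    The mean square loss is convex in the amplitudes, so this step is exact."""
    S = np.exp(1j * np.outer(x, omega))          # N x K Fourier feature matrix
    A = S.conj().T @ S / N + lam * np.eye(K)     # small ridge term for stability
    b = S.conj().T @ y / N
    beta = np.linalg.solve(A, b)
    mse = np.mean(np.abs(S @ beta - y) ** 2)
    return beta, mse

# Start from i.i.d. standard normal frequencies.
omega = rng.standard_normal(K)
beta, mse = fit_amplitudes(omega)

delta, gamma = 0.5, 3.0        # proposal width and exponent (illustrative values)
for it in range(200):
    # Random-walk proposal for the frequencies.
    omega_prop = omega + delta * rng.standard_normal(K)
    beta_prop, _ = fit_amplitudes(omega_prop)
    # Metropolis accept/reject per frequency, favoring large amplitudes.
    accept = rng.uniform(size=K) < (np.abs(beta_prop) / np.abs(beta)) ** gamma
    omega = np.where(accept, omega_prop, omega)
    beta, mse = fit_amplitudes(omega)

print("final training MSE:", mse)
```

In this sketch the expensive step is the repeated least-squares solve; the Metropolis step itself only perturbs the frequencies and compares amplitude magnitudes, which is what makes the adaptive sampling usable directly from data.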

[1] A. Kammonen, J. Kiessling, P. Plechac, M. Sandberg, and A. Szepessy. Adaptive random Fourier features with Metropolis sampling. Foundations of Data Science, 2020.

[2] A. Kammonen, J. Kiessling, P. Plechac, M. Sandberg, A. Szepessy, and R. Tempone. Smaller generalization error derived for a deep residual neural network compared with shallow networks. IMA Journal of Numerical Analysis, 2022.