Prof. Patrick Farrell, University of Oxford
Monday, December 05, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Building on the work of Schöberl, Olshanskii, and Benzi, in this talk we present the first preconditioner for the Newton linearization of the stationary Navier--Stokes equations in three dimensions that achieves both optimal complexity in degree of freedom count and Reynolds-robustness. The exact details of the preconditioner vary with the discretization, but the general theme is to combine augmented Lagrangian stabilisation, a custom multigrid prolongation operator involving local solves on coarse cells, and an additive patchwise relaxation on each level that captures the kernel of the divergence operator.
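For readers unfamiliar with augmented Lagrangian stabilisation, here is a sketch of the underlying idea in the Benzi--Olshanskii line of work (not taken verbatim from the talk): the Newton linearization around the current iterate w is augmented with a grad-div term weighted by a parameter gamma > 0.

```latex
% Grad-div augmented Newton linearization (sketch); w is the current Newton
% iterate, \nu the viscosity, \gamma > 0 the augmentation parameter.
\begin{aligned}
-\nu \Delta u + (w \cdot \nabla) u + (u \cdot \nabla) w + \nabla p
  - \gamma \nabla (\nabla \cdot u) &= f, \\
\nabla \cdot u &= 0.
\end{aligned}
```

Since the velocity is divergence-free at the solution, the extra term does not change the solution, but it makes the Schur complement far easier to approximate at high Reynolds number; the price is a harder top-left block, which is what the custom multigrid cycle and patchwise relaxation are designed to handle.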
Dr. Syed Adnan Yusuf
Monday, November 28, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
This seminar focuses on providing the audience with the context and scope of our internship program. The program is for young and talented graduate students with an active interest in solving real-world problems. Some of the projects that will be presented in the seminar are actively developed at Elm and include domains such as computer vision, robotics and automation, healthcare, IoT, video analytics, and NLP. The seminar will serve as a launch pad for students to discuss their future interests and aspirations with the speaker. It will also enable them to develop a better awareness of the domains most relevant to their future research aspirations.
Monday, November 21, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
In this talk, I will first give a convergence analysis of the gradient descent (GD) method for training neural networks by relating it to the finite element method. I will then present some acceleration techniques for the GD method and also give some alternative training algorithms.
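As a point of reference for the acceleration techniques mentioned above, here is a minimal, illustrative comparison of plain GD with heavy-ball momentum on a least-squares problem (my own toy example, not the speaker's algorithms):

```python
import numpy as np

# Toy illustration (not the speaker's method): plain gradient descent vs.
# heavy-ball momentum on a least-squares problem min_w ||Aw - b||^2 / 2.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((100, 20)), rng.standard_normal(100)
grad = lambda w: A.T @ (A @ w - b)

def gd(lr=1e-3, steps=500):
    w = np.zeros(20)
    for _ in range(steps):
        w -= lr * grad(w)
    return w

def heavy_ball(lr=1e-3, beta=0.9, steps=500):
    w, v = np.zeros(20), np.zeros(20)
    for _ in range(steps):
        v = beta * v - lr * grad(w)   # momentum accumulates past gradients
        w += v
    return w

print(np.linalg.norm(A @ gd() - b), np.linalg.norm(A @ heavy_ball() - b))
```

The heavy-ball update keeps a running velocity, so past gradients continue to push the iterate in directions of consistent descent; this is only the simplest of the acceleration ideas.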
Francesco Orabona, Associate Professor of Electrical and Computer Engineering, Boston University
Monday, November 14, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322
Contact Person
Parameter-free online optimization is a class of algorithms that do not require tuning hyperparameters, yet achieve the theoretically optimal performance. Moreover, they often achieve state-of-the-art empirical performance too. An example would be gradient descent algorithms completely without learning rates. In this talk, I review my past and present contributions to this field. Building upon a fundamental idea connecting optimization, gambling, and information theory, I discuss selected applications of parameter-free algorithms to machine learning and statistics. Finally, I conclude with an overview of the future directions of this field.
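To make the gambling connection concrete, here is a minimal one-dimensional sketch in the spirit of the Krichevsky--Trofimov coin-betting scheme of Orabona and Pál (illustrative only; the talk's algorithms are more general):

```python
# A minimal 1D sketch of the coin-betting view of parameter-free optimization
# (in the spirit of Orabona & Pal's KT bettor); illustrative, not the talk's code.
def kt_coin_betting(subgrad, T=2000, eps=1.0):
    wealth, coin_sum, x = eps, 0.0, 0.0
    for t in range(1, T + 1):
        x = (coin_sum / t) * wealth        # bet a KT fraction of current wealth
        g = subgrad(x)                     # subgradient, assumed in [-1, 1]
        coin_sum += -g                     # coin outcome c_t = -g_t
        wealth += -g * x                   # wealth update: no learning rate anywhere
    return x

# Example: minimize f(x) = |x - 10| without tuning any step size.
print(kt_coin_betting(lambda x: 1.0 if x > 10 else -1.0))
```

Note that no learning rate appears anywhere: the effective step size is the betting fraction times the accumulated wealth, both of which adapt automatically to the observed gradients.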
Prof. Michal Mankowski, Assistant Professor of Operations Research, Erasmus University Rotterdam, Netherlands
Thursday, November 10, 2022, 10:00
- 11:30
Building 1, Level 3, Room 3119
The aim of this course is to familiarize students with the use of computer simulation tools for complex problems. The course will introduce the basic concepts of computation through modeling and simulation that are increasingly being used in industry and academia. The basic concepts of discrete event simulation will be introduced, along with reliable methods of random variate generation and variance reduction. Later in the course, simulation-based optimization and output analysis will be discussed. An example of simulation (and optimization) applied to the design of an optimal organ allocation policy in the US will also be discussed.
Prof. Michal Mankowski, Assistant Professor of Operations Research, Erasmus University Rotterdam, Netherlands
Wednesday, November 09, 2022, 10:00
- 11:30
Building 1, Level 3, Room 3119
The aim of this course is to familiarize students with the use of computer simulation tools for complex problems. The course will introduce the basic concepts of computation through modeling and simulation that are increasingly being used in industry and academia. The basic concepts of discrete event simulation will be introduced, along with reliable methods of random variate generation and variance reduction. Later in the course, simulation-based optimization and output analysis will be discussed. An example of simulation (and optimization) applied to the design of an optimal organ allocation policy in the US will also be discussed.
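For those new to discrete event simulation, the following minimal sketch of an M/M/1 queue illustrates the event-list mechanics the course builds on (my own toy example, not course material); the long-run average queue length can be checked against the analytic value rho^2 / (1 - rho):

```python
import heapq, random

# A minimal discrete-event simulation sketch of an M/M/1 queue (illustrative):
# events are (time, kind) pairs kept in a priority queue ordered by time.
def mm1(arrival_rate=0.9, service_rate=1.0, horizon=10_000, seed=0):
    rng = random.Random(seed)
    events = [(rng.expovariate(arrival_rate), "arrival")]
    queue, busy, t = 0, False, 0.0
    area = 0.0                       # time-integral of the queue length
    while events:
        t_new, kind = heapq.heappop(events)
        if t_new > horizon:
            break
        area += queue * (t_new - t)  # accumulate queue-length * elapsed time
        t = t_new
        if kind == "arrival":
            heapq.heappush(events, (t + rng.expovariate(arrival_rate), "arrival"))
            if busy:
                queue += 1
            else:
                busy = True
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
        else:  # departure
            if queue > 0:
                queue -= 1
                heapq.heappush(events, (t + rng.expovariate(service_rate), "departure"))
            else:
                busy = False
    return area / t                  # average number waiting in the queue

print(mm1())  # compare with the analytic M/M/1 value rho^2 / (1 - rho) = 8.1
```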
Tobias Isenberg, Senior Research Scientist, Inria
Monday, November 07, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
In this talk I will report on various research projects that I carried out with my students to better understand the interaction landscape, and I will share the lessons we learned. I will focus mostly on AR-based setups with application examples from physical flow visualization, molecular visualization, visualization of particle collisions, biomolecular dynamics in cells, and oceanography. I will show interaction techniques that rely on purely gestural interaction, phones or tablets as input and control devices, and hybrid setups that combine traditional workstations with AR views. I will discuss navigation, data selection, and visualization system control as different interaction tasks. With this overview I aim to provide an understanding of typical challenges in immersive visualization environments and how to address some of these challenges.
Monday, October 31, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
Drawing on my own experience, I will try to answer the doubts and dilemmas PhD students often face on their path towards a degree. Namely, I'll discuss how advisors, colleagues, peers, reviewers, and so forth fit into the universe of a PhD student, and I will end by sharing my own definition of 'excellence' as an objective to pursue.
Prof. Evgeny Burnaev, Applied AI Center, Skolkovo Institute of Science and Technology
Monday, October 24, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
The mission of the Skoltech Applied AI Center is to create AI models and frameworks for solving problems of sustainable development in industry and the economy. In my presentation, I will give an overview of the center's current activities, its applied and fundamental problem statements, and the corresponding recent results.
Monday, October 10, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
In the big data era, it is necessary to rely on distributed computing. For distributed optimization and learning tasks, in particular in the modern paradigm of federated learning, specific challenges arise, such as decentralized data storage. Communication between the parallel machines and the orchestrating distant server is necessary but slow. To address this main bottleneck, a natural strategy is to compress the communicated vectors. I will present EF-BV, a new algorithm which converges linearly to an exact solution, with a large class of deterministic or random, biased or unbiased compressors.
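To illustrate the kind of compression the talk refers to, here is a minimal sketch of a top-k compressor combined with classical error feedback (a simpler predecessor of EF-BV; this is not the EF-BV algorithm itself):

```python
import numpy as np

# Illustration of compressed communication with classical error feedback,
# not EF-BV itself: each worker sends only the top-k entries of its message
# and remembers the part it did not send.
def top_k(v, k):
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]   # indices of the k largest magnitudes
    out[idx] = v[idx]
    return out

def worker_step(grad, error, k):
    msg = top_k(grad + error, k)                # compress gradient plus residual
    return msg, grad + error - msg              # new residual = what was dropped

# Toy usage with one worker and the quadratic f(x) = ||x||^2 / 2.
x, err, lr = np.ones(100), np.zeros(100), 0.1
for _ in range(200):
    msg, err = worker_step(x, err, k=5)         # grad of f at x is x itself
    x -= lr * msg                               # server applies the compressed gradient
print(np.linalg.norm(x))
```

Error feedback stores whatever the compressor dropped and adds it back to the next message, which is what allows biased compressors such as top-k to be used without destroying convergence.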
Monday, October 03, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
Random fields are popular models in statistics and machine learning for spatially dependent data on Euclidean domains. However, in many applications, data is observed on non-Euclidean domains such as street networks. In this case, it is much more difficult to construct valid random field models. In this talk, we discuss some recent approaches to modeling data in this setting, and in particular define a new class of Gaussian processes on compact metric graphs.
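One way such fields can be defined, in the spirit of Whittle--Matérn constructions (a sketch; the exact construction in the talk may differ), is through a fractional-order SPDE posed directly on the graph:

```latex
% Whittle--Matern-type field on a compact metric graph \Gamma (sketch):
% \Delta_\Gamma is a Laplacian on \Gamma with Kirchhoff vertex conditions
% and \mathcal{W} is Gaussian white noise on \Gamma.
(\kappa^2 - \Delta_\Gamma)^{\alpha/2} (\tau u) = \mathcal{W} \quad \text{on } \Gamma,
\qquad \kappa, \tau > 0, \; \alpha > 1/2.
```

The resulting Gaussian processes generalize the Matérn family beyond Euclidean domains while remaining valid on networks such as street graphs.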
Prof. Dhabaleswar K. (DK) Panda, Professor, Computer Science and Engineering, The Ohio State University
Monday, September 26, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
This talk will focus on challenges and opportunities in designing middleware for HPC, AI (Deep/Machine Learning), and Data Science. We will start with the challenges in designing runtime environments for MPI+X programming models by considering support for multi-core systems, high-performance networks (InfiniBand and RoCE), GPUs, and emerging BlueField-2 DPUs. Features and sample performance numbers from using the MVAPICH2 libraries will be presented. For the Deep/Machine Learning domain, we will focus on MPI-driven solutions to extract performance and scalability for popular Deep Learning frameworks (TensorFlow and PyTorch), large out-of-core models, and BlueField-2 DPUs.
Fahad Khan, Associate Professor at MBZUAI and Linköping University
Monday, September 19, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322
Contact Person
Machine perception, the ability to understand the visual world based on input from sensors such as cameras, is one of the central problems in Artificial Intelligence. Recent years have witnessed tremendous progress in various instance-level recognition tasks with real-world applications in, e.g., robotics, autonomous driving, and surveillance. In this talk, I will first present our recent results towards understanding state-of-the-art deep learning-based visual recognition networks in terms of their robustness and generalizability. Next, I will present our results on learning visual recognition models with limited human supervision. Finally, I will discuss moving one step further from instance-level recognition to understanding visual relationships between object pairs.
Wednesday, September 14, 2022, 16:00
- 18:30
Building 5, Room 5220
Contact Person
In this thesis, we discuss a few fundamental and well-studied optimization problem classes: decentralized distributed optimization (Chapters 2 to 4), distributed optimization under similarity (Chapter 5), affinely constrained optimization (Chapter 6), minimax optimization (Chapter 7), and high-order optimization (Chapter 8). For each problem class, we develop the first provably optimal algorithm: the complexity of such an algorithm cannot be improved for the problem class given. The proposed algorithms show state-of-the-art performance in practical applications, which makes them highly attractive for potential generalizations and extensions in the future.
Monday, September 05, 2022, 12:00
- 13:00
Building 9, Level 2, Room 2322, Hall 1
Contact Person
In this talk, I will discuss communication compression and aggregation mechanisms for curvature information in order to reduce the associated communication costs while preserving theoretically superior local convergence guarantees.
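As a generic illustration of compressing curvature information (not the talk's specific method), a client might transmit only the largest entries of the change in its local Hessian estimate:

```python
import numpy as np

# Generic illustration: a client sends a compressed update of its local Hessian
# by keeping only the top-k entries of the change since the last round.
def compress_matrix_top_k(M, k):
    flat = M.ravel()
    out = np.zeros_like(flat)
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    out[idx] = flat[idx]
    return out.reshape(M.shape)

def send_hessian_update(H_local, H_server, k):
    delta = compress_matrix_top_k(H_local - H_server, k)  # only k numbers travel
    return H_server + delta                               # server's refined estimate

# Toy usage: the server's estimate drifts toward the true local Hessian.
rng = np.random.default_rng(1)
H_true = rng.standard_normal((20, 20)); H_true = H_true @ H_true.T
H_est = np.zeros((20, 20))
for _ in range(100):
    H_est = send_hessian_update(H_true, H_est, k=40)
print(np.linalg.norm(H_true - H_est) / np.linalg.norm(H_true))
```

Because only the change is compressed, the server's estimate keeps improving across rounds even though each individual message is small.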
Monday, June 27, 2022, 18:00
- 20:00
Building 5, Level 5, Room 5209
Contact Person
Federated learning (FL) is an emerging machine learning paradigm involving multiple clients, e.g., mobile phone devices, with an incentive to collaborate in solving a machine learning problem coordinated by a central server. FL was proposed in 2016 by Konecny et al. and McMahan et al. as a viable privacy-preserving alternative to traditional centralized machine learning since, by construction, the training data points are decentralized and never transferred by the clients to a central server. Therefore, to a certain degree, FL mitigates the privacy risks associated with centralized data collection. Unfortunately, optimization for FL faces several specific issues that centralized optimization usually does not need to handle. In this thesis, we identify several of these challenges and propose new methods and algorithms to address them, with the ultimate goal of enabling practical FL solutions supported with mathematically rigorous guarantees.
Monday, May 16, 2022, 12:00
- 13:00
Building 9, Room 2322, Hall 1
Contact Person
Datasets that capture the connection between vision, language, and affection are limited, causing a lack of understanding of the emotional aspect of human intelligence. As a step in this direction, the ArtEmis dataset was recently introduced as a large-scale dataset of emotional reactions to images along with language explanations of these chosen emotions.
Monday, May 09, 2022, 12:00
- 13:00
https://kaust.zoom.us/j/98631999457
Contact Person
Hydrogen is a carbon-free energy carrier that can be used to decarbonize various high-emitting sectors, such as transportation, power generation, and industry. Today, global hydrogen production is largely derived from fossil fuels such as natural gas and coal.
Monday, April 25, 2022, 12:00
- 13:00
Building 9, Room 2322, Hall 1
Contact Person
Differential Privacy (DP) allows for rich statistical and machine learning analysis and is now becoming a gold standard for private data analysis. Despite the noticeable success of this theory, existing DP tools are severely limited to regular datasets, e.g., datasets are required, or assumed, to be clean and normalized before DP algorithms are applied.
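As a reminder of what a basic DP mechanism looks like (a textbook Laplace-mechanism sketch, not the speaker's contribution), a counting query with sensitivity 1 can be privatized as follows:

```python
import numpy as np

# Textbook Laplace mechanism: release a count with epsilon-differential privacy.
# A counting query has sensitivity 1, so noise is drawn from Laplace(1 / epsilon).
def private_count(data, predicate, epsilon=1.0, rng=np.random.default_rng()):
    true_count = sum(predicate(x) for x in data)
    return true_count + rng.laplace(scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 62, 57]
print(private_count(ages, lambda a: a >= 40, epsilon=0.5))
```

Smaller values of epsilon give stronger privacy at the cost of noisier answers.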
Monday, April 18, 2022, 12:00
- 13:00
Building 9, Room 2322 Lecture Hall #1
Contact Person
The power system is facing unprecedented changes in operation and control as more and diverse sources and loads are being connected to this complex cyber-physical energy system.
Monday, April 11, 2022, 12:00
- 13:00
Building 9, Room 2322 Lecture Hall #1
Contact Person
Geospatial health data are essential to inform public health and policy. These data can be used to quantify disease burden, understand geographic and temporal patterns, identify risk factors and measure inequalities. In this talk, I will give an overview of statistical methods and computational tools for geospatial data analysis and health surveillance.
Monday, April 04, 2022, 12:00
- 13:00
Building 9, Room 2322, Hall 1
Contact Person
DNA Nanotechnology is a fascinating field that studies how to construct small biological structures entirely from DNA as a building material. The key insight is that DNA, if designed in a particular way, can construct complex 3D nanoscale structures entirely by means of self-assembly, governed by the base-pairing principle.
Monday, March 28, 2022, 12:00
- 13:00
Building 9, Room 2322, Lecture Hall #1
Contact Person
Traditional computing systems separate processors from memory, performing computation by constantly shuttling data back and forth between the two units. This bottleneck limits processing speed and incurs high power consumption in computing systems running deep learning models of ever-increasing complexity. Novel approaches and new principles are needed to revolutionize computing systems. Neuromorphic systems have been proposed as a new computing architecture based on spiking neural networks, analogous to existing nervous systems.
Monday, March 21, 2022, 12:00
- 13:00
Building 9, Room 2322 Lecture Hall #1
Contact Person
We study the MARINA method of Gorbunov et al. (ICML 2021), the current state-of-the-art distributed non-convex optimization method in terms of theoretical communication complexity. The theoretical superiority of this method can be largely attributed to two sources: the use of a carefully engineered biased stochastic gradient estimator, which leads to a reduction in the number of communication rounds, and the reliance on independent stochastic communication compression operators, which leads to a reduction in the number of transmitted bits within each communication round.
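For reference, the MARINA estimator described above can be written as follows (reconstructed from Gorbunov et al.; a sketch, so the notation may differ slightly from the talk):

```latex
% MARINA update (sketch): with probability p the full gradient is synchronized,
% otherwise each node i compresses only the change in its local gradient with
% its own independent compressor \mathcal{C}_i.
x^{k+1} = x^k - \gamma g^k, \qquad
g^{k+1} =
\begin{cases}
\nabla f(x^{k+1}) & \text{with probability } p,\\[4pt]
g^k + \dfrac{1}{n}\displaystyle\sum_{i=1}^n \mathcal{C}_i\!\left(\nabla f_i(x^{k+1}) - \nabla f_i(x^k)\right) & \text{with probability } 1-p.
\end{cases}
```

With probability p the nodes synchronize the exact gradient; otherwise each node sends only a compressed gradient difference through its own independent compressor, which is where the savings in transmitted bits come from.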