
Adapting Foundation Models: From Reinforcement Learning to Multivariate Time Series Forecasting
This talk explores adapting foundation models to reinforcement learning and multivariate time series forecasting. It introduces DICL, for handling multivariate dynamics in RL, and AdaPTS, for probabilistic multivariate forecasting; both show significant performance improvements over baseline methods.
Overview
Foundation models (FMs) have revolutionized various domains by leveraging pre-trained knowledge to tackle complex tasks. This talk explores the adaptation of FMs in two distinct yet interconnected areas: Reinforcement Learning (RL) and multivariate time series forecasting. First, we delve into the integration of Large Language Models (LLMs) as FMs in RL, focusing on their zero-shot capabilities in predicting the dynamics of continuous Markov decision processes. We address the challenges of handling multivariate data and incorporating control signals, and present **Disentangled In-Context Learning (DICL)**. Extending this concept, we then explore the adaptation of univariate deterministic forecasting FMs to probabilistic multivariate time series forecasting. We introduce **AdaPTS**, which uses probabilistic adapters to transform multivariate inputs into a suitable latent space, enabling the effective use of pre-trained univariate FMs. In both applications, we compare our methods against common baselines, including the direct application of the respective FMs, demonstrating significant improvements in key performance metrics for each use case.
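To make the adapter idea behind AdaPTS more concrete, here is a minimal, hypothetical sketch (not the authors' implementation): a learned adapter maps the multivariate series into a latent space, a frozen pre-trained univariate forecasting FM is applied to each latent channel independently, and the forecasts are mapped back to the observed space. All names here (`LinearAdapter`, `univariate_fm_forecast`) and the deterministic linear form of the adapter are illustrative assumptions; the actual method uses probabilistic adapters.

```python
# Illustrative sketch of an adapter-based multivariate forecaster (assumption,
# not the AdaPTS implementation). A frozen univariate FM is applied per latent
# channel produced by a learned adapter.
import torch
import torch.nn as nn


def univariate_fm_forecast(series: torch.Tensor, horizon: int) -> torch.Tensor:
    """Stand-in for a frozen, pre-trained univariate forecasting FM.
    Here it naively repeats the last observed value over the horizon."""
    return series[..., -1:].repeat(1, horizon)  # (batch, horizon)


class LinearAdapter(nn.Module):
    """Maps the observed feature space (dim d) to a latent space (dim k),
    where each channel is forecast independently, then maps back to d.
    A probabilistic adapter would instead parameterize a distribution
    over latent features (assumption)."""

    def __init__(self, d: int, k: int):
        super().__init__()
        self.encode = nn.Linear(d, k)
        self.decode = nn.Linear(k, d)

    def forecast(self, x: torch.Tensor, horizon: int) -> torch.Tensor:
        # x: (batch, time, d) multivariate history
        z = self.encode(x)                               # (batch, time, k)
        preds = [
            univariate_fm_forecast(z[..., i], horizon)   # per-channel forecast
            for i in range(z.shape[-1])
        ]
        z_hat = torch.stack(preds, dim=-1)               # (batch, horizon, k)
        return self.decode(z_hat)                        # (batch, horizon, d)


# Minimal usage example with random data.
x = torch.randn(8, 64, 5)                # 8 series, 64 time steps, 5 features
adapter = LinearAdapter(d=5, k=3)
y_hat = adapter.forecast(x, horizon=12)  # (8, 12, 5)
print(y_hat.shape)
```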
Presenters
Abdelhakim Benechehab, Ph.D. Student, EURECOM, France
Brief Biography
Abdelhakim Benechehab is a second-year Ph.D. student at EURECOM Sophia Antipolis, working with Huawei Noah’s Ark Lab in Paris. His research focuses on model-based reinforcement learning, particularly on improving dynamics models for long-horizon planning and incorporating uncertainty estimation. He is also interested in foundation models, exploring their applications in reinforcement learning and dynamics modeling. Abdelhakim earned his master’s degree from ENS Paris-Saclay in 2021 through the Mathematics, Vision, and Machine Learning (MVA) program and holds an engineering degree in mathematics and computer science from École des Mines.