
Latent Abstractions, Mutual Information and Generative Diffusion Models
This talk presents a mathematical framework linking stochastic differential equations and information theory to explain how diffusion models use latent abstractions, via implicit stochastic filtering, to synthesize high-dimensional data.
Overview
Diffusion-based generative models have achieved remarkable success in synthesizing high-dimensional data, yet a key question remains: how do they encode and leverage latent abstractions to guide the generative process? In this talk, we will provide an introduction to a mathematical framework that connects stochastic differential equations (SDEs) and information theory to address this question.
The core idea is that diffusion models implicitly perform a form of stochastic filtering, where an evolving latent state steers the dynamics of an observable process. Our discussion will highlight how this formalism sheds light on several fundamental mechanisms underlying generative models and offers a promising avenue for future research.
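To make the abstract's picture concrete, the following is a minimal illustrative sketch (not the speaker's formalism): it simulates the forward dynamics of a standard variance-preserving diffusion SDE, dx = -(β/2)·x dt + √β dW, with Euler-Maruyama. Starting from a bimodal "dataset", the observable process gradually forgets which mode (a latent abstraction, in the talk's terminology) each sample came from; the generative model's job is to recover and steer by that latent information when running the dynamics in reverse. All function names and parameters here are hypothetical choices for illustration.

```python
import numpy as np

def forward_diffusion(x0, beta=1.0, dt=1e-3, steps=5000, rng=None):
    """Euler-Maruyama simulation of the forward VP-SDE
    dx = -0.5*beta*x dt + sqrt(beta) dW (illustrative sketch)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        drift = -0.5 * beta * x * dt                       # mean-reverting drift
        noise = np.sqrt(beta * dt) * rng.standard_normal(x.shape)
        x = x + drift + noise                              # one SDE step
    return x

# A sharply bimodal "dataset": the mode (+2 or -2) plays the role of a
# latent abstraction behind each observation.
x0 = np.concatenate([np.full(5000, -2.0), np.full(5000, 2.0)])
xT = forward_diffusion(x0)
# After integrating to T = steps*dt = 5, the terminal marginal is close to a
# standard normal: the mode information has been almost entirely washed out.
```

Reversing this process is where, in the framework the talk describes, the model implicitly filters for the latent state (here, the mode) to steer generation.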
Presenters
Giulio Franzese, Assistant Professor, Data Science Department, EURECOM
Brief Biography
Giulio Franzese is an Assistant Professor in the Data Science Department of EURECOM, working on methodological and theoretical advances in Machine Learning. He received his Master's and PhD degrees in Electronic and Telecommunications Engineering from Politecnico di Torino in 2016 and 2021, respectively.
His current interests include generative models and information-theoretic approaches to machine learning. In the past, he worked on signal processing applied to Satellite Navigation Systems. He is a co-author of more than 20 scientific papers and industrial patents.