This talk is about building compilers for high-performance code generation. It has three parts. The first part is about Tiramisu, a polyhedral compiler designed to generate highly efficient code for multicores and GPUs. It is the first polyhedral compiler that matches the performance of highly hand-optimized industrial libraries such as Intel MKL and cuDNN. The second part is about applying Tiramisu to accelerate deep neural network (DNN) inference. Compared to other DNN compilers, Tiramisu has two unique features: (1) it supports sparse DNNs; and (2) it can express and optimize general recurrent neural networks (RNNs). The third part presents recent work on the problem of automatic code optimization; in particular, it focuses on using deep learning to build a cost model that guides the exploration of the search space of code optimizations.
Riyadh Baghdadi is a postdoctoral associate at MIT. He works at the intersection of compilers and applied machine learning: he develops compilers that take high-level code and optimize it automatically to generate highly efficient code, and he uses machine learning to automate optimizations in these compilers. Before joining MIT, Riyadh obtained his Ph.D. and master's degrees from INRIA, France (Sorbonne University, Paris VI).