Data compression and communication have enormous societal and environmental impact and stand to benefit from the machine learning revolution seen in other fields. This tutorial disseminates ideas from information theory and learned (neural) compression to a broader machine learning and AI audience.

Neural compression is the application of neural networks and other machine learning methods to data compression. As David MacKay wrote in the early 2000s, information theory and machine learning “are two sides of the same coin”. Indeed, the basic questions of information theory, such as channel coding and source coding (i.e., data compression), have deep connections to topics in statistical machine learning such as maximum-likelihood estimation, variational inference, and hypothesis testing. In the decades since, advances in statistical machine learning have opened up new possibilities for data compression, allowing compression algorithms to be learned end-to-end from data using powerful generative models such as normalizing flows, variational autoencoders, diffusion probabilistic models, and generative adversarial networks.
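One way to see the "two sides of the same coin" connection concretely: by Shannon's source coding theorem, a symbol x can be coded in roughly −log₂ q(x) bits under a model q, so the expected code length is the cross-entropy between the data distribution p and q, and minimizing code length coincides with maximum-likelihood estimation. A minimal toy sketch (the alphabet and distributions below are made up for illustration):

```python
import math

def code_length_bits(q, x):
    """Ideal code length of symbol x under model distribution q: -log2 q(x)."""
    return -math.log2(q[x])

def expected_bits(p, q):
    """Expected code length of a p-distributed source coded with model q
    (the cross-entropy H(p, q) in bits)."""
    return sum(px * code_length_bits(q, x) for x, px in p.items())

# Toy three-symbol source and two candidate models.
p      = {"a": 0.5, "b": 0.25, "c": 0.25}
good_q = {"a": 0.5, "b": 0.25, "c": 0.25}      # matches the data exactly
bad_q  = {"a": 1/3, "b": 1/3, "c": 1/3}        # uniform, ignores structure

print(expected_bits(p, good_q))  # 1.5 bits -- the entropy of p
print(expected_bits(p, bad_q))   # ~1.585 bits (log2 3), strictly worse
```

A better statistical model of the data thus directly yields a shorter code, which is the principle that learned lossless compressors exploit.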

Building on these powerful generative models, this tutorial reviews the core principles and techniques of lossless and lossy compression with machine learning, and highlights the intimate connections between compression and statistical modeling. Besides treating standard problems in data compression, such as image or video compression under a rate-distortion criterion, the tutorial introduces several emerging frontiers of neural compression, such as perceptual compression at extremely low bitrates, lossy compression for downstream tasks, and the compression of new digital media such as point clouds.
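The rate-distortion criterion mentioned above trades off bitrate R against distortion D, as in the Lagrangian loss R + λ·D used to train lossy neural codecs. The toy sketch below (not any specific codec) illustrates the tradeoff with uniform scalar quantization of a Gaussian source: a coarser quantization step lowers the rate, measured as the entropy of the quantized symbols, but raises the mean-squared-error distortion.

```python
import math
import random
from collections import Counter

random.seed(0)

# Toy Gaussian source.
xs = [random.gauss(0.0, 1.0) for _ in range(100_000)]

def rate_and_distortion(step):
    """Uniformly quantize the source with the given step size and return
    (rate in bits/sample as the empirical symbol entropy, MSE distortion)."""
    symbols = [round(x / step) for x in xs]
    n = len(xs)
    counts = Counter(symbols)
    rate = -sum(c / n * math.log2(c / n) for c in counts.values())
    mse = sum((x - s * step) ** 2 for x, s in zip(xs, symbols)) / n
    return rate, mse

for step in (0.25, 0.5, 1.0):
    r, d = rate_and_distortion(step)
    print(f"step={step}: rate ~{r:.2f} bits/sample, distortion ~{d:.4f}")
```

Sweeping the quantization step (or, in a learned codec, the weight λ) traces out a rate-distortion curve; neural codecs learn the transform and entropy model end-to-end rather than fixing them by hand.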


  • Recording from NeurIPS 2022 (event page); the slides are available here and the videos on YouTube.

  • An Introduction to Neural Data Compression, tutorial article by Yibo Yang, Stephan Mandt, and Lucas Theis.

Speakers / Organizers

  • Karen Ullrich (Meta AI)
  • Yibo Yang (University of California, Irvine)
  • Stephan Mandt (University of California, Irvine)
  • Virginia Smith (Carnegie Mellon University)
  • Michele Covell
  • Daniel Severo (University of Toronto)
  • Christopher Schroers (Disney Research | Studios)
