Data compression is a problem of great practical importance and a new frontier for machine learning research that combines empirical findings with fundamental theoretical insights. The goal of this workshop is to bring together researchers from deep learning, information theory, and probabilistic modeling in order to learn from each other and to encourage exchange on fundamentally novel issues related to neural data compression.

Topics

The workshop solicits work on neural data compression, ranging from information-theoretic foundations to practical applications.

Submissions are invited on topics including, but not limited to:

  • Image/Video/Audio Compression with Autoencoders, Normalizing Flows, Autoregressive Models, Generative Adversarial Networks, etc.
  • Neural Network Compression
  • Probabilistic Modeling and Variational Inference for Compression
  • Entropy Coding
  • Minimum Description Length Theory
  • Information Theory and Source Coding

Important Dates

  • Submission deadline: Feb 26, 2021
  • Notification date: March 26, 2021
  • Workshop date: May 8, 2021

Submission Instructions

We solicit short workshop paper submissions of up to 4 pages + unlimited references/appendices. Please format submissions in ICLR style.

Some accepted papers will be selected for contributed talks. All accepted papers will be given a slot in the poster presentation session.

Paper submissions should be made through OpenReview; further information will be available in the CFP. Please send inquiries by email to the organizers at neural.compression.workshop@gmail.com.

Speakers

Fabian Mentzer
ETH Zurich
Naftali Tishby
Hebrew University of Jerusalem
Rianne van den Berg
Google Brain
Oren Rippel
WaveOne
Jonathan Ho
Google
Johannes Ballé
Google

Panelists

Alex Alemi
Senior Research Scientist, Google
Ferenc Huszár
Cambridge University
Philipp Krähenbühl
University of Texas at Austin
Irina Higgins
DeepMind

Organizers

Please consider forwarding this workshop information to your colleagues or friends.