Data compression is a problem of great practical importance and a new frontier for machine learning research that combines empirical findings with fundamental theoretical insights. The goal of this workshop is to bring together researchers from deep learning, information theory, and probabilistic modeling in order to learn from each other and to encourage exchange on fundamentally novel issues in neural data compression.

Topics

The workshop solicits work on neural data compression, ranging from information-theoretic foundations to practical applications.

Submissions are invited on topics including, but not limited to:

  • Image/Video/Audio Compression with Autoencoders, Flows, Autoregressive Models, Generative Adversarial Networks, etc.
  • Neural Network Compression
  • Probabilistic Modeling and Variational Inference for Compression
  • Entropy Coding
  • Minimum Description Length Theory
  • Information Theory and Source Coding

Important Dates

  • Submission deadline: Feb 28, 2021 (extended from Feb 26, 2021; 11:59pm anywhere on Earth)
  • Notification date: March 26, 2021
  • Workshop date: May 7, 2021 (moved from May 8, 2021; please note that ICLR has been pushed up by one day)

Submission Instructions

We solicit short workshop paper submissions of up to 4 pages plus unlimited references/appendices. Please format submissions in ICLR style. Reviewing will be double-blind: reviewers will not see author names, and authors will not see reviewer names.

Some accepted papers will be selected for contributed talks. All accepted papers will be given a slot in the poster session and published on OpenReview after the workshop.

Papers can be submitted through OpenReview. Please direct any further inquiries to the organizers at neural.compression.workshop@gmail.com.

Please consider forwarding this workshop information to your colleagues or friends.