Neural Compression @ICLR 2021
Data compression is a problem of great practical importance and a new frontier for machine learning research, one that combines empirical findings with fundamental theoretical insights. The goal of this workshop is to bring together researchers from deep learning, information theory, and probabilistic modeling in order to learn from each other and to encourage exchange on fundamentally novel issues in neural data compression.
The workshop solicits work on neural data compression, ranging from information-theoretic aspects to practical applications.
Submissions are invited for topics on, but not limited to:
- Image/Video/Audio Compression with autoencoders, normalizing flows, autoregressive models, generative adversarial networks, etc.
- Neural Network Compression
- Probabilistic Modeling and Variational Inference for Compression
- Entropy Coding
- Minimum Description Length Theory
- Information Theory and Source Coding
- Submission deadline: Feb 26, 2021
- Notification date: March 26, 2021
- Workshop date: May 8, 2021
We solicit short workshop paper submissions of up to 4 pages, plus unlimited references/appendices. Please format submissions in ICLR style.
A selection of accepted papers will be presented as contributed talks. All accepted papers will be given a slot in the poster session.
- Hebrew University of Jerusalem
- Rianne van den Berg, Senior Research Scientist, Google
- University of Texas at Austin
- Stephan Mandt, University of California, Irvine
- Robert Bamler, University of California, Irvine
- Yingzhen Li, Microsoft Research Cambridge
- Christopher Schroers, Disney Research Studios
- Max Welling, University of Amsterdam; Qualcomm AI Research
- Yang Yang, Qualcomm AI Research
- Taco Cohen, Qualcomm AI Research
Please consider forwarding this workshop announcement to colleagues or friends who may be interested.