Compressing images by encoding their latent representations with relative entropy coding

Accepted version
Peer-reviewed

Type

Conference Object

Authors

Flamich, G 
Havasi, M 
Hernández-Lobato, JM 

Abstract

Variational Autoencoders (VAEs) have seen widespread use in learned image compression. They are used to learn expressive latent representations on which downstream compression methods can operate with high efficiency. Recently proposed 'bits-back' methods can indirectly encode the latent representation of images with codelength close to the relative entropy between the latent posterior and the prior. However, due to the underlying algorithm, these methods can only be used for lossless compression, and they only achieve their nominal efficiency when compressing multiple images simultaneously; they are inefficient for compressing single images. As an alternative, we propose a novel method, Relative Entropy Coding (REC), that can directly encode the latent representation with codelength close to the relative entropy for single images, supported by our empirical results obtained on the Cifar10, ImageNet32 and Kodak datasets. Moreover, unlike previous bits-back methods, REC is immediately applicable to lossy compression, where it is competitive with the state-of-the-art on the Kodak dataset.
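The abstract's core claim is that a single latent sample can be communicated in roughly as many bits as the relative entropy (KL divergence) between the posterior and the prior. One family of schemes with this property is importance-sampling-based "minimal random coding": both sides share a seeded sampler for the prior, the encoder draws about 2^KL candidates, picks one with probability proportional to the importance weights q(z)/p(z), and transmits only the chosen index. The sketch below is a hypothetical illustration of that general idea under Gaussian q and p, not the paper's actual REC algorithm; the function name, the extra-bit constant, and the diagonal-Gaussian assumption are all illustrative choices.

```python
import numpy as np

def relative_entropy_encode(mu_q, sigma_q, kl_bits, rng):
    """Hypothetical sketch of relative-entropy-style coding via
    importance sampling (minimal random coding).

    Draws roughly 2^(KL + t) candidate samples from the shared prior
    p = N(0, I), then selects one index with probability proportional
    to the importance weights q(z)/p(z). Transmitting that index costs
    about KL + const bits; a decoder with the same seed can regenerate
    the candidates and look the sample up by index.
    """
    t = 2                                  # a few extra bits (illustrative choice)
    K = int(2 ** (kl_bits + t))            # number of prior candidates
    z = rng.standard_normal((K, mu_q.size))  # candidates from p = N(0, I)
    # log importance weights: log q(z) - log p(z), per candidate
    log_w = (-0.5 * ((z - mu_q) / sigma_q) ** 2 - np.log(sigma_q)
             + 0.5 * z ** 2).sum(axis=1)
    w = np.exp(log_w - log_w.max())        # stabilised, unnormalised weights
    idx = rng.choice(K, p=w / w.sum())     # sample an index ∝ q(z)/p(z)
    return idx, z[idx]                     # transmit idx; z[idx] is the shared sample
```

Because the decoder re-seeds the same generator, regenerates the same K prior candidates, and simply indexes into them, only `idx` (about KL + t bits) ever crosses the channel; the latent itself is never encoded directly.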

Description

Keywords

Journal Title

Advances in Neural Information Processing Systems

Conference Name

NeurIPS 2020: Thirty-fourth Conference on Neural Information Processing Systems

Journal ISSN

1049-5258

Volume Title

December 2020

Publisher

Rights

All rights reserved