
Sharp Second-Order Pointwise Asymptotics for Lossless Compression with Side Information

Published version
Peer-reviewed

Authors

Gavalakis, Lampros
Kontoyiannis, Ioannis (ORCID: https://orcid.org/0000-0001-7242-6375)

Abstract

The problem of determining the best achievable performance of arbitrary lossless compression algorithms is examined when correlated side information is available at both the encoder and decoder. For arbitrary source-side information pairs, the conditional information density is shown to provide a sharp asymptotic lower bound for the description lengths achieved by an arbitrary sequence of compressors. This implies that for ergodic source-side information pairs, the conditional entropy rate is the best achievable asymptotic lower bound to the compression rate, not just in expectation but with probability one. Under appropriate mixing conditions, a central limit theorem and a law of the iterated logarithm are proved, describing the inevitable fluctuations of the second-order asymptotically best possible rate. An idealised version of Lempel-Ziv coding with side information is shown to be universally first- and second-order asymptotically optimal, under the same conditions. These results are in part based on a new almost-sure invariance principle for the conditional information density, which may be of independent interest.
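
For context, the two quantities at the centre of the abstract can be sketched in standard information-theoretic notation; the symbols below are an assumption on our part and are not taken from this record. For a source-side information pair (X, Y) with joint law P, the conditional information density of the first n symbols and the conditional entropy rate are

\[
\imath(x_1^n \mid y_1^n) \;=\; -\log P\bigl(X_1^n = x_1^n \,\big|\, Y_1^n = y_1^n\bigr),
\qquad
H(X \mid Y) \;=\; \lim_{n \to \infty} \frac{1}{n}\, H\bigl(X_1^n \,\big|\, Y_1^n\bigr).
\]

The first-order results in the abstract concern the almost-sure convergence of the normalised description lengths to H(X | Y); the second-order results (central limit theorem and law of the iterated logarithm) describe their fluctuations around nH(X | Y) at scale of order the square root of n, in terms of the conditional varentropy listed in the keywords below.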

Keywords

central limit theorem, conditional entropy, conditional varentropy, entropy, law of the iterated logarithm, lossless data compression, side information

Journal Title

Entropy (Basel)

Journal ISSN

1099-4300

Volume

22

Publisher

MDPI AG

Sponsorship

Engineering and Physical Sciences Research Council (RG94782)
Hellenic Foundation for Research and Innovation (-)