Title :
Entropy based merging of context models for efficient arithmetic coding
Author_Institution :
Inst. of Commun. Eng., Leipzig Univ. of Telecommun., Leipzig, Germany
Abstract :
Contextual coding of data generally requires a step that reduces the vast variety of possible contexts to a feasible number. This paper presents a new method for non-uniform quantisation of contexts, which adaptively merges adjacent intervals as long as the increase in contextual entropy is negligible. The method is incorporated into a framework for lossless image compression. Combined with an automatic determination of model sizes for histogram-tail truncation, the proposed approach yields a significant gain in compression performance across a wide range of natural images.
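The paper's exact merging criterion and traversal order are not reproduced in this record; the following is a minimal sketch of the general idea of entropy-based merging of adjacent context intervals, assuming per-context symbol-count histograms and a hypothetical threshold parameter max_increase (bits/symbol) on the tolerated rise in conditional entropy.

```python
import numpy as np

def entropy(counts):
    """Shannon entropy (bits/symbol) of a symbol-count histogram."""
    counts = np.asarray(counts, dtype=float)
    total = counts.sum()
    if total == 0:
        return 0.0
    p = counts[counts > 0] / total
    return float(-(p * np.log2(p)).sum())

def conditional_entropy(histograms):
    """Contextual (conditional) entropy: count-weighted average of per-context entropies."""
    totals = np.array([h.sum() for h in histograms], dtype=float)
    grand = totals.sum()
    return sum(t / grand * entropy(h) for t, h in zip(totals, histograms) if t > 0)

def merge_contexts(histograms, max_increase=1e-3):
    """Greedily merge adjacent context intervals while the rise in conditional
    entropy stays below max_increase (bits/symbol).

    histograms: list of per-context symbol-count arrays, ordered along the
    context axis so that neighbouring entries are 'adjacent' intervals."""
    hists = [np.asarray(h, dtype=float) for h in histograms]
    while len(hists) > 1:
        base = conditional_entropy(hists)
        # Evaluate every adjacent merge and keep the cheapest one.
        best_i, best_cost = None, None
        for i in range(len(hists) - 1):
            trial = hists[:i] + [hists[i] + hists[i + 1]] + hists[i + 2:]
            cost = conditional_entropy(trial) - base
            if best_cost is None or cost < best_cost:
                best_i, best_cost = i, cost
        if best_cost > max_increase:
            break  # any further merge would raise the contextual entropy too much
        hists = hists[:best_i] + [hists[best_i] + hists[best_i + 1]] + hists[best_i + 2:]
    return hists

# Hypothetical usage: 8 context bins over some ordered context axis.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ctx_hists = [rng.integers(0, 50, size=16) for _ in range(8)]
    merged = merge_contexts(ctx_hists, max_increase=0.005)
    print(len(ctx_hists), "->", len(merged), "context intervals")
```

The merged histograms would then drive the probability models of an adaptive arithmetic coder; fewer, better-populated contexts reduce the model-learning cost at a negligible entropy penalty.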
Keywords :
arithmetic codes; data compression; entropy codes; image coding; arithmetic coding; contextual entropy based merging coding; histogram-tail truncation; lossless image compression; nonuniform context quantisation; Adaptation models; Context; Context modeling; Encoding; Entropy; Image coding; Quantization (signal); arithmetic coding; context quantisation; image compression; modelling
Conference_Title :
2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Florence
DOI :
10.1109/ICASSP.2014.6853947