• DocumentCode
    178174
  • Title
    Entropy based merging of context models for efficient arithmetic coding
  • Author
    Strutz, Tilo
  • Author_Institution
    Institute of Communications Engineering, Leipzig University of Telecommunications, Leipzig, Germany
  • fYear
    2014
  • fDate
    4-9 May 2014
  • Firstpage
    1991
  • Lastpage
    1995
  • Abstract
    The contextual coding of data generally requires a step that reduces the vast variety of possible contexts to a feasible number. This paper presents a new method for the non-uniform quantisation of contexts, which adaptively merges adjacent intervals as long as the increase in contextual entropy is negligible. The method is incorporated into a framework for lossless image compression. Combined with an automatic determination of model sizes for histogram-tail truncation, the proposed approach yields a significant gain in compression performance across a wide range of natural images.
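    The merging idea in the abstract can be sketched as follows. This is a minimal illustration under assumed details, not the paper's actual criterion or implementation: the threshold `max_increase`, the greedy left-to-right scan, and the use of a weighted Shannon entropy over per-context histograms are all assumptions for the sake of the sketch.

    ```python
    import math

    def entropy(hist):
        """Shannon entropy (bits/symbol) of a symbol-count histogram."""
        n = sum(hist)
        if n == 0:
            return 0.0
        return -sum((c / n) * math.log2(c / n) for c in hist if c)

    def contextual_entropy(groups):
        """Count-weighted average entropy over context groups (bits/symbol)."""
        total = sum(sum(h) for h in groups)
        return sum(sum(h) / total * entropy(h) for h in groups)

    def merge_adjacent(hists, max_increase=0.01):
        """Greedily merge adjacent context intervals while the resulting
        rise in contextual entropy stays below max_increase bits/symbol
        (threshold value is an assumption, not from the paper)."""
        groups = [list(h) for h in hists]
        merged = True
        while merged and len(groups) > 1:
            merged = False
            base = contextual_entropy(groups)
            for i in range(len(groups) - 1):
                fused = [a + b for a, b in zip(groups[i], groups[i + 1])]
                cand = groups[:i] + [fused] + groups[i + 2:]
                if contextual_entropy(cand) - base <= max_increase:
                    groups = cand
                    merged = True
                    break
        return groups
    ```

    On a toy input, two adjacent contexts with identical symbol statistics fuse at zero entropy cost, while a context with a different distribution stays separate: `merge_adjacent([[8, 1, 1], [8, 1, 1], [1, 1, 8]])` yields the two groups `[16, 2, 2]` and `[1, 1, 8]`.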
  • Keywords
    arithmetic codes; data compression; entropy codes; image coding; arithmetic coding; contextual entropy based merging coding; histogram-tail truncation; lossless image compression; nonuniform context quantisation; Adaptation models; Context; Context modeling; Encoding; Entropy; Image coding; Quantization (signal); arithmetic coding; context quantisation; image compression; modelling;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Acoustics, Speech and Signal Processing (ICASSP), 2014 IEEE International Conference on
  • Conference_Location
    Florence
  • Type
    conf
  • DOI
    10.1109/ICASSP.2014.6853947
  • Filename
    6853947