Title :
Context selection and quantization for lossless image coding
Author_Institution :
Dept. of Comput. Sci., Univ. of Western Ontario, London, Ont., Canada
Abstract :
Summary form only given. Even after context quantization, an entropy coder that maintains L·2^K conditional probability distributions (where L is the number of quantized levels and K is the number of texture bits) remains impractical. Instead, only the expectations of the prediction errors are approximated, by their sample means with respect to the different quantized contexts. Computing the sample means involves only accumulating the error terms within each quantized context C(d,t) and keeping a count of the occurrences of C(d,t); thus the time and space complexities of the described context-based modeling of the prediction errors are O(L·2^K). Based on the quantized context C(d,t), the encoder makes a DPCM prediction I, adds to I the most likely prediction error, and thereby arrives at an adaptive, context-based, nonlinear prediction. The resulting error e is then entropy coded using L conditional probabilities. Results of the proposed context-based lossless image compression technique are included.
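The following is a minimal sketch (not the paper's implementation) of the per-context sample-mean modeling described above, assuming a simple west-pixel DPCM predictor, an activity measure d, a 1-bit texture pattern t, and a uniform quantizer into L levels; the predictor, the quantizer, and all names here are illustrative assumptions.

import numpy as np

L = 8   # assumed number of quantized activity levels
K = 3   # assumed number of texture bits

def quantize_context(d, t, d_max=255):
    # Map activity d (quantized to L levels) and a K-bit texture
    # pattern t to a single context index in [0, L * 2**K).
    q = min(int(d * L / (d_max + 1)), L - 1)
    return q * (1 << K) + (t & ((1 << K) - 1))

def encode_errors(img):
    img = img.astype(np.int32)
    h, w = img.shape
    # One error sum and one count per quantized context C(d,t):
    # O(L * 2**K) space, O(1) work per pixel.
    err_sum = np.zeros(L * (1 << K), dtype=np.int64)
    count = np.zeros(L * (1 << K), dtype=np.int64)
    errors = []
    for y in range(1, h):
        for x in range(1, w):
            west, north = img[y, x - 1], img[y - 1, x]
            pred = int(west)                 # plain DPCM prediction I
            d = abs(int(west) - int(north))  # assumed activity measure
            t = int(west > north)            # 1-bit texture pattern (K bits in general)
            c = quantize_context(d, t)
            if count[c] > 0:
                # Add the conditional sample mean of past errors to I,
                # giving the adaptive, context-based nonlinear prediction.
                pred += int(round(err_sum[c] / count[c]))
            e = int(img[y, x]) - pred        # error e to be entropy coded
            errors.append((c, e))
            err_sum[c] += int(img[y, x]) - int(west)  # accumulate raw DPCM error
            count[c] += 1
    return errors

Only a running sum and a count are stored per context, rather than a full conditional probability table, which is the practicality argument made in the abstract; the actual entropy coding of e with L conditional probabilities is omitted from this sketch.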
Keywords :
data compression; differential pulse code modulation; image coding; prediction theory; probability; quantisation (signal); DPCM prediction; adaptive nonlinear prediction; conditional probabilities; context based modeling; context quantization; context selection; entropy coder; error terms; expectations; lossless image coding; lossless image compression; prediction errors; quantized contexts; sample means; space complexity; time complexity; Computer science; Context modeling; Entropy; Gold; Image coding; Linear regression; Pixel; Predictive models; Quantization;
Conference_Title :
Data Compression Conference, 1995. DCC '95. Proceedings
Conference_Location :
Snowbird, UT
Print_ISBN :
0-8186-7012-6
DOI :
10.1109/DCC.1995.515563