Title :
A locally adaptive perceptual masking threshold model for image coding
Author_Institution :
Dept. of Electr. & Comput. Eng., Wisconsin Univ., WI, USA
Abstract :
This paper describes the design, implementation, and testing of a locally adaptive perceptual masking threshold model for image compression. Based on the content of the original image, the model computes the maximum amount of noise energy that can be injected at each transform coefficient while still yielding perceptually distortion-free still images or image sequences. The adaptive perceptual masking threshold model can be used as a pre-processor to a standard JPEG image coder: DCT coefficients smaller than their corresponding perceptual thresholds are set to zero before the normal JPEG quantization and Huffman coding steps. The result is an image-dependent reduction in the bit rate needed for transparent coding. In an informal subjective test involving 318 still images from the AT&T Bell Laboratories image database, the model provided bit-rate savings on the order of 10 to 30%.
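The pre-processing step described above, zeroing DCT coefficients that fall below their perceptual thresholds before quantization, can be sketched as follows. This is a minimal illustration, not the paper's threshold model: the threshold values are hypothetical placeholders, and a real coder would operate on full 8x8 DCT blocks.

```python
def threshold_block(dct_block, thresholds):
    """Zero any DCT coefficient whose magnitude is below its
    corresponding perceptual masking threshold."""
    return [
        [c if abs(c) >= t else 0.0 for c, t in zip(row_c, row_t)]
        for row_c, row_t in zip(dct_block, thresholds)
    ]

# Toy 2x2 example with made-up threshold values (a JPEG coder uses 8x8 blocks):
block = [[120.0, 3.5], [-2.0, 15.0]]
thr = [[1.0, 4.0], [4.0, 4.0]]
print(threshold_block(block, thr))  # [[120.0, 0.0], [0.0, 15.0]]
```

The zeroed coefficients then pass through the normal JPEG quantization and Huffman coding stages unchanged, which is where the bit-rate saving arises: runs of zeros compress efficiently under JPEG's run-length/Huffman entropy coding.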
Keywords :
data compression; discrete cosine transforms; filtering theory; image coding; image sequences; transform coding; visual perception; DCT coefficients; JPEG compression standard image coder; cortex filters; image compression; image-dependent gain; informal subjective test; locally adaptive perceptual masking threshold model; noise energy; perceptually distortion-free still images; transform coefficient; transparent coding; Bit rate; Code standards; Discrete cosine transforms; Huffman coding; Image coding; Image databases; Masking threshold; Quantization; Testing; Transform coding;
Conference_Titel :
1996 IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP-96), Conference Proceedings
Conference_Location :
Atlanta, GA
Print_ISBN :
0-7803-3192-3
DOI :
10.1109/ICASSP.1996.544817