DocumentCode :
2937612
Title :
Generalized vector quantization: jointly optimal quantization and estimation
Author :
Rao, Ajit ; Miller, David ; Rose, Kenneth ; Gersho, Allen
Author_Institution :
Dept. of Electr. & Comput. Eng., California Univ., Santa Barbara, CA, USA
fYear :
1995
fDate :
17-22 Sep 1995
Firstpage :
432
Abstract :
Given a pair of random vectors X, Y, we study the problem of finding an efficient or optimal estimator of Y given X when the range of the estimator is constrained to be a finite set of values. A generalized vector quantizer (GVQ) with input dimension k, output dimension m, and size N maps an input X ∈ ℛ^k to an output V(X) ∈ ℛ^m. The output V(X) is constrained to be one of the estimation codevectors in the codebook {y_1, y_2, ..., y_N}. The performance of the GVQ is measured by the average distortion D = E[d(Y, V(X))] for a suitable output-space distortion measure d(·,·). A GVQ reduces to a conventional vector quantizer in the special case where X = Y. The GVQ problem has been approached in the information theory literature from many different standpoints. In particular, it appears in the context of noisy source coding, which is the special case where we quantize X, the observable, noisy version of a source Y.
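The following is a minimal sketch of the GVQ setup described in the abstract, not the paper's jointly optimal design procedure. It assumes a squared-error distortion, a hypothetical noisy-source-coding scenario (X is a noisy observation of Y), and a simple illustrative codebook design: partition the X-space with a k-means-style iteration, then set each estimation codevector to the conditional mean of Y over its cell. It then estimates the average distortion D = E[d(Y, V(X))] empirically.

```python
# Illustrative GVQ sketch (assumptions: squared-error distortion, X = Y + noise,
# crude k-means-style codebook design; not the paper's jointly optimal method).
import numpy as np

rng = np.random.default_rng(0)

k = m = 2          # input and output dimensions
N = 8              # codebook size
n_train, n_test = 5000, 2000

# Hypothetical source and noisy observation.
Y = rng.normal(size=(n_train, m))
X = Y + 0.3 * rng.normal(size=(n_train, k))

# Partition X-space with a few k-means-style iterations.
centers = X[rng.choice(n_train, N, replace=False)]
for _ in range(20):
    cells = np.argmin(((X[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
    for j in range(N):
        if np.any(cells == j):
            centers[j] = X[cells == j].mean(axis=0)

# Estimation codevectors: conditional mean of Y over each cell.
codebook = np.array([Y[cells == j].mean(axis=0) if np.any(cells == j)
                     else np.zeros(m) for j in range(N)])

def gvq_encode(x):
    """Map an input x in R^k to the index of its cell in X-space."""
    return np.argmin(((x - centers) ** 2).sum(-1))

# Empirical average distortion D = E[||Y - V(X)||^2] on fresh data.
Y_test = rng.normal(size=(n_test, m))
X_test = Y_test + 0.3 * rng.normal(size=(n_test, k))
V = codebook[[gvq_encode(x) for x in X_test]]
D = np.mean(((Y_test - V) ** 2).sum(-1))
print(f"empirical average distortion D = {D:.4f}")
```

With X = Y the same sketch collapses to a conventional vector quantizer, matching the special case noted in the abstract.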
Keywords :
estimation theory; noise; optimisation; random processes; source coding; vector quantisation; average distortion; codebook; estimation codevectors; generalized vector quantization; information theory; input dimension; noisy source coding; optimal estimation; optimal quantization; output dimension; output-space distortion measure; performance; Books; Constraint optimization; Decoding; Distortion measurement; Entropy; Information theory; Lagrangian functions; Prototypes; Source coding; Vector quantization;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1995 IEEE International Symposium on Information Theory
Conference_Location :
Whistler, BC
Print_ISBN :
0-7803-2453-6
Type :
conf
DOI :
10.1109/ISIT.1995.550419
Filename :
550419