DocumentCode :
1109173
Title :
Gradient algorithms for designing predictive vector quantizers
Author :
Chang, Pao-Chi ; Gray, Robert M.
Author_Institution :
Stanford University, Stanford, CA
Volume :
34
Issue :
4
fYear :
1986
fDate :
8/1/1986
Firstpage :
679
Lastpage :
690
Abstract :
A predictive vector quantizer (PVQ) is a vector extension of a predictive quantizer. It consists of two parts: a conventional memoryless vector quantizer (VQ) and a vector predictor. Two gradient algorithms for designing a PVQ are developed in this paper: the steepest descent (SD) algorithm and the stochastic gradient (SG) algorithm. Both jointly improve the quantizer and the predictor in the sense of minimizing the distortion as measured by the average mean-squared error. The two design approaches differ in the period and the step size used in each iteration to update the codebook and the predictor: the SG algorithm updates once for each input training vector and uses a small step size, while the SD algorithm updates only once per long period, possibly one pass over the entire training sequence, and uses a relatively large step size. Code designs and tests are simulated for both Gauss-Markov sources and sampled speech waveforms, and the results are compared to codes designed using techniques that attempt to optimize only the quantizer for the predictor and not vice versa.
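The following is a minimal sketch (not the authors' code) of the stochastic gradient style of PVQ design described above: a residual codebook and a linear vector predictor are each updated once per training vector with a small step size, in the direction that reduces the squared reconstruction error. The first-order predictor form, the step sizes, and the initialization are illustrative assumptions, not values taken from the paper, and the usual SG simplification of ignoring the gradient contribution of the prediction feedback loop is used.

```python
import numpy as np

def design_pvq_sg(training, codebook_size=16, mu_cb=0.01, mu_pred=0.001, seed=0):
    """Jointly adapt a residual codebook and a linear vector predictor by
    stochastic gradient descent on squared reconstruction error, updating
    once per input training vector (small step size)."""
    rng = np.random.default_rng(seed)
    n, k = training.shape                      # n training vectors of dimension k
    A = np.zeros((k, k))                       # first-order vector predictor matrix
    codebook = rng.standard_normal((codebook_size, k)) * training.std()
    prev_rec = np.zeros(k)                     # previous reconstructed vector

    for x in training:
        x_hat = A @ prev_rec                   # predict from past reconstruction
        e = x - x_hat                          # prediction residual
        i = np.argmin(((codebook - e) ** 2).sum(axis=1))  # nearest codeword
        err = e - codebook[i]                  # reconstruction error for this vector
        # Gradient steps: move the selected codeword toward the residual and
        # nudge the predictor to reduce the squared reconstruction error.
        codebook[i] += mu_cb * err
        A += mu_pred * np.outer(err, prev_rec)
        prev_rec = x_hat + codebook[i]         # closed-loop reconstruction
    return codebook, A

# Example usage on a synthetic source (illustrative only):
# codebook, A = design_pvq_sg(np.random.default_rng(1).standard_normal((10000, 4)))
```

The SD variant described in the abstract would accumulate these same gradient terms over a long period, possibly a full pass through the training sequence, and then apply a single, relatively large update to the codebook and predictor.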
Keywords :
Algorithm design and analysis; Distortion measurement; Feedback; Prediction algorithms; Quantization; Signal design; Signal processing algorithms; Stochastic processes; Testing; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Acoustics, Speech, and Signal Processing
Publisher :
IEEE
ISSN :
0096-3518
Type :
jour
DOI :
10.1109/TASSP.1986.1164905
Filename :
1164905