DocumentCode :
991830
Title :
Sequential and parallel neural network vector quantizers
Author :
Parhi, Keshab K. ; Wu, Frank H. ; Genesan, Kalyan
Author_Institution :
US West Adv. Technol. Inc., Boulder, CO, USA
Volume :
43
Issue :
1
fYear :
1994
fDate :
1/1/1994 12:00:00 AM
Firstpage :
104
Lastpage :
109
Abstract :
Presents novel sequential and parallel learning techniques for codebook design in vector quantizers using neural network approaches. These techniques are used in the training phase of vector quantizer design. They combine the split-and-cluster methodology of traditional vector quantizer design with neural learning and lead to better codebooks (with lower distortion). The sequential learning approach overcomes the codeword underutilization problem of the competitive learning network; as a result, the network requires only partial or zero updating, as opposed to the full neighbor updating needed in the self-organizing feature map. The parallel learning network, while retaining these properties, also allows the codewords to be learned in parallel, so it can be used for faster codebook design in a multiprocessor environment. It is shown that the sequential learning scheme can sometimes outperform the traditional LBG algorithm, while the parallel learning scheme performs very close to both the LBG and the sequential learning algorithms.
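To make the competitive-learning setting concrete, below is a minimal Python sketch of a generic winner-take-all vector quantizer codebook trainer, in which only the winning codeword is updated toward each input (no neighborhood update, in contrast to a self-organizing feature map). This is an illustrative assumption-laden sketch only: the paper's sequential and parallel schemes additionally use split-and-cluster initialization and partial-update rules that are not reproduced here, and all function names and parameters are hypothetical.

```python
import numpy as np

def train_vq_competitive(data, codebook_size, epochs=20, lr0=0.1, seed=0):
    """Generic winner-take-all (competitive learning) codebook training.

    Illustrative sketch only; not the paper's sequential/parallel algorithm.
    """
    rng = np.random.default_rng(seed)
    # Initialize codewords from randomly chosen training vectors.
    codebook = data[rng.choice(len(data), codebook_size, replace=False)].copy()
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)  # decaying learning rate
        for x in rng.permutation(data):
            # Find the best-matching (winning) codeword ...
            winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
            # ... and move only that codeword toward the input
            # (no neighbor updating, unlike a self-organizing feature map).
            codebook[winner] += lr * (x - codebook[winner])
    return codebook

if __name__ == "__main__":
    # Hypothetical data: quantize 2-D vectors with a 16-word codebook.
    samples = np.random.default_rng(1).normal(size=(1000, 2))
    cb = train_vq_competitive(samples, codebook_size=16)
    print(cb.shape)  # (16, 2)
```

A scheme of this winner-take-all form can leave some codewords rarely or never selected, which is the codeword underutilization problem the sequential learning approach in the paper is designed to overcome.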
Keywords :
learning (artificial intelligence); neural nets; parallel algorithms; codebook design; competitive learning; neural learning; parallel learning; parallel learning techniques; parallel neural network; self organizing feature map; sequential learning; vector quantizers; Availability; Clustering algorithms; Data compression; Frequency; Neural networks; Signal processing; Speech; Transmitters; Vector quantization; Video compression;
fLanguage :
English
Journal_Title :
Computers, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9340
Type :
jour
DOI :
10.1109/12.250614
Filename :
250614