DocumentCode :
1442069
Title :
A class of competitive learning models which avoids neuron underutilization problem
Author :
Choy, Clifford Sze-Tsan ; Siu, Wan-chi
Author_Institution :
Dept. of Electron. & Inf. Eng., Hong Kong Polytech., Hung Hom, Hong Kong
Volume :
9
Issue :
6
fYear :
1998
fDate :
11/1/1998 12:00:00 AM
Firstpage :
1258
Lastpage :
1269
Abstract :
We study a qualitative property of a class of competitive learning (CL) models, called the multiplicatively biased competitive learning (MBCL) model, namely that it avoids neuron underutilization with probability one as time goes to infinity. In MBCL, the competition among neurons is biased by a multiplicative term, while only one weight vector is updated per learning step. This is of practical interest since MBCL instances have computational complexities among the lowest of existing CL models. In addition, in applications such as classification, vector quantizer design, and probability density function estimation, avoiding neuron underutilization is a necessary condition for optimal performance. Hence, it is possible to define instances of MBCL that achieve optimal performance in these applications.
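To make the mechanism described in the abstract concrete, the following is a minimal sketch of one MBCL learning step, assuming a frequency-sensitive bias (a common multiplicative-bias choice; the specific bias rule, function names, and learning rate here are illustrative assumptions, not the paper's exact formulation):

```python
import numpy as np

def mbcl_step(x, weights, bias, lr=0.05):
    """One MBCL learning step (frequency-sensitive instance, an assumption).

    The competition is biased by a multiplicative term: neuron i wins when
    bias[i] * ||x - w_i||^2 is minimal.  Only the winner's weight vector is
    updated, which keeps the per-step computational cost low.
    """
    dists = np.sum((weights - x) ** 2, axis=1)   # squared distances to x
    winner = int(np.argmin(bias * dists))        # multiplicatively biased competition
    weights[winner] += lr * (x - weights[winner])  # move only the winner toward x
    bias[winner] += 1.0                          # penalize frequent winners
    return winner

# Toy usage: two data clusters, three neurons all initialized between them.
# The growing bias of frequent winners lets every neuron win some inputs,
# illustrating how underutilization is avoided.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0, 0.1, (200, 2)),
                  rng.normal(3, 0.1, (200, 2))])
weights = rng.normal(1.5, 0.1, (3, 2))
bias = np.ones(3)
wins = np.zeros(3, dtype=int)
for x in rng.permutation(data):
    wins[mbcl_step(x, weights, bias)] += 1
```

Under plain (unbiased) winner-take-all competition, a neuron initialized far from all data could win nothing; the multiplicative bias steadily raises the effective distance of frequent winners, so every neuron is eventually utilized.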
Keywords :
computational complexity; probability; unsupervised learning; classification; computational complexities; multiplicatively biased competitive learning model; necessary condition; neuron underutilization; probability density function estimation; qualitative property; vector quantizer design; weight vector; Clustering algorithms; Computational complexity; Context modeling; Councils; H infinity control; Neurons; Probability density function; Prototypes; Speech recognition; Vector quantization;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.728374
Filename :
728374