DocumentCode :
2695593
Title :
Single layer potential function neural network for unsupervised learning
Author :
Dajani, A.L. ; Kamel, M. ; Elmastry, M.I.
fYear :
1990
fDate :
17-21 June 1990
Firstpage :
273
Abstract :
A variant of the generalized delta rule training algorithm is presented for unsupervised learning in a single-layer network using Gaussian function nonlinearities. An alternative to direct competition between the output nodes is proposed that allows linear, rather than quadratic, complexity in the connections of the output nodes. The connecting weights are adaptively modified with each presentation of an input pattern and converge towards values that are representative of the clustering structure of the input data. To calculate the weight increment, the training algorithm uses only information locally available at the synapse and its connecting neuron. Coupled with the alternative competition approach, the training algorithm eliminates the need for an orienting subsystem such as that required in the ART1 model of G.A. Carpenter and S. Grossberg (1988).
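The abstract describes a local, activation-driven weight update for Gaussian units that avoids pairwise competition between output nodes. The sketch below is an illustrative reconstruction under assumptions, not the authors' exact algorithm: the function names, the learning rate eta, the width sigma, and the specific update rule W[j] += eta * g * (x - W[j]) are hypothetical choices meant only to show a per-unit update that uses nothing beyond the unit's own activation and its incoming input, so the cost of "competition" stays linear in the number of output nodes.

```python
# Hedged sketch (assumed details, not the paper's exact method): a single-layer
# network of Gaussian units whose weight vectors drift toward cluster centres
# using only information local to each unit (its own activation and the input).
import numpy as np


def gaussian_activation(x, w, sigma=1.0):
    """Output of one Gaussian unit centred at weight vector w."""
    return np.exp(-np.linalg.norm(x - w) ** 2 / (2.0 * sigma ** 2))


def train(patterns, n_units=4, sigma=1.0, eta=0.1, epochs=20, seed=0):
    """Unsupervised training: each input presentation nudges every unit's
    weights toward the input, scaled by that unit's own Gaussian response,
    so no pairwise (quadratic) competition between output nodes is needed."""
    rng = np.random.default_rng(seed)
    # Initialise prototypes from randomly chosen input patterns (assumption).
    W = patterns[rng.choice(len(patterns), n_units, replace=False)].copy()
    for _ in range(epochs):
        for x in patterns:
            for j in range(n_units):
                g = gaussian_activation(x, W[j], sigma)  # local information only
                W[j] += eta * g * (x - W[j])             # move prototype toward x
    return W


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Two synthetic clusters; the learned weights should approach their centres.
    data = np.vstack([rng.normal(0.0, 0.2, (50, 2)),
                      rng.normal(3.0, 0.2, (50, 2))])
    rng.shuffle(data)
    print(train(data, n_units=2, sigma=1.0))
```

In this sketch, each unit's increment depends only on its own response and the current input, so adding an output node adds a constant amount of work per pattern, which is the linear-complexity property the abstract attributes to the alternative competition scheme.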
Keywords :
computational complexity; learning systems; neural nets; parallel algorithms; ART1 model; Gaussian function nonlinearities; alternative competition approach; connecting weights; generalized delta rule training algorithm; locally available information; output nodes; single-layer network; synapse; training algorithm; unsupervised learning; weight increment;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
Type :
conf
DOI :
10.1109/IJCNN.1990.137726
Filename :
5726685