DocumentCode :
2618151
Title :
Hard-limiting nonlinear functions in artificial neural networks
Author :
Lai, W.K. ; Coghill, G.G.
Author_Institution :
Dept. of Electr. & Electron. Eng., Auckland Univ., New Zealand
fYear :
1991
fDate :
18-21 Nov 1991
Firstpage :
1747
Abstract :
The authors show how the optimum hard-limiter can be found and what the optimum operating point of this type of nonlinear function should be, by illustrating the performance of the optimum hard-limiter when it is used with a simple neural network as a content-addressable memory. It is demonstrated that there is a narrow band of threshold values for normal operation of the hard-limiting function, outside which the network cannot accurately recall any of the stored patterns. Mathematical analysis of the theoretical bounds of this parameter shows that the band narrows if the network is expected to work with noisier data. When the noise ratio in the test patterns is low, small deviations in the threshold cause no deterioration in the quality of recall; however, the margin of safe operation narrows when the noise ratio of the test patterns is high. Other types of nonlinear functions with offsets have been shown to improve the performance of this type of neural network in accurately recovering the original patterns.
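The following is a minimal illustrative sketch, not the authors' implementation: a Hopfield-style content-addressable memory with Hebbian weights and a hard-limiting activation whose threshold `theta` can be swept, loosely mirroring the abstract's point that recall degrades once the threshold leaves a narrow operating band. The network size, noise level, and threshold values are assumptions chosen for demonstration only.

```python
# Sketch of a Hopfield-style associative memory with a hard-limiting
# nonlinearity and adjustable threshold (illustrative assumptions only).
import numpy as np

def hard_limiter(x, theta=0.0):
    """Hard-limiting nonlinearity: +1 if the input exceeds theta, else -1."""
    return np.where(x > theta, 1, -1)

def train_hebbian(patterns):
    """Build a symmetric weight matrix from bipolar patterns via Hebb's rule."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0)          # no self-connections
    return W

def recall(W, probe, theta=0.0, steps=20):
    """Iterate synchronous updates through the hard-limiter until stable."""
    state = probe.copy()
    for _ in range(steps):
        new_state = hard_limiter(W @ state, theta)
        if np.array_equal(new_state, state):
            break
        state = new_state
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stored = rng.choice([-1, 1], size=(3, 100))     # three random bipolar patterns
    W = train_hebbian(stored)

    noisy = stored[0].copy()
    flip = rng.choice(100, size=10, replace=False)  # 10% noise in the test pattern
    noisy[flip] *= -1

    # Sweep the hard-limiter threshold to observe how recall accuracy changes.
    for theta in (0.0, 0.2, 0.5):
        recovered = recall(W, noisy, theta)
        accuracy = np.mean(recovered == stored[0])
        print(f"theta={theta:.1f}  recall accuracy={accuracy:.2f}")
```

Running the sweep with higher noise ratios in the probe pattern would, in the spirit of the paper's analysis, shrink the range of thresholds for which recall remains accurate.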
Keywords :
content-addressable storage; neural nets; artificial neural networks; content-addressable memories; nonlinear function; optimum hard-limiter; Artificial neural networks; Associative memory; Delay; Hopfield neural networks; Intelligent networks; Neural networks; Neurons; Pattern classification; Symmetric matrices; Tin;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks (IJCNN 1991)
Print_ISBN :
0-7803-0227-3
Type :
conf
DOI :
10.1109/IJCNN.1991.170368
Filename :
170368