Title :
Probabilistic neural networks for classification, mapping, or associative memory
Author :
Specht, Donald F.
Author_Institution :
Lockheed Palo Alto Res. Lab., CA, USA
Abstract :
It can be shown that by replacing the sigmoid activation function often used in neural networks with an exponential function, a neural network can be formed that computes nonlinear decision boundaries. This technique yields decision surfaces which approach the Bayes optimal under certain conditions. The linearity of the decision boundaries is continuously controllable, from linear for small training sets to any degree of nonlinearity justified by larger training sets. A four-layer neural network of the type proposed can map any input pattern to any number of classifications. The input variables can be either continuous or binary. Modification of the decision boundaries based on new data can be accomplished in real time simply by defining a set of weights equal to the new training vector. The decision boundaries can be implemented using analog "neurons", which operate entirely in parallel. The organization proposed takes into account the projected pin limitations of neural-net chips of the near future. By a change in architecture, these same components could be used as associative memories, to compute nonlinear multivariate regression surfaces, or to compute a posteriori probabilities of an event.
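The abstract's classifier can be sketched as follows: each stored training vector acts as the weights of one exponential ("pattern") unit, a summation unit per class adds the activations of its pattern units, and the output picks the largest sum. This is a minimal illustrative sketch, not the paper's implementation; the function names and the smoothing parameter `sigma` are assumptions chosen for the example.

```python
import numpy as np

def pnn_classify(x, train_X, train_y, sigma=0.5):
    """Minimal probabilistic-neural-network (Parzen-window) sketch.

    Each row of train_X is stored as the weights of one pattern unit;
    the sigmoid is replaced by an exponential (Gaussian) activation,
    as described in the abstract. sigma is an assumed smoothing width.
    """
    # Pattern layer: one exponential unit per stored training vector.
    d2 = np.sum((train_X - x) ** 2, axis=1)
    activations = np.exp(-d2 / (2.0 * sigma ** 2))
    # Summation layer: one unit per class sums its pattern units.
    classes = np.unique(train_y)
    scores = np.array([activations[train_y == c].sum() for c in classes])
    # Output layer: choose the class with the largest summed density.
    return classes[np.argmax(scores)]

# "Training" is just storing vectors; real-time adaptation amounts to
# appending a new row (a new set of weights) to train_X.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
y = np.array([0, 0, 1, 1])
print(pnn_classify(np.array([0.05, 0.1]), X, y))  # near the class-0 cluster
```

With a small `sigma` the boundary follows the training data closely; as `sigma` grows the decision surface smooths out, reflecting the continuous linearity control the abstract mentions.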
Keywords :
Bayes methods; artificial intelligence; content-addressable storage; neural nets; pattern recognition; Bayes optimal; architecture; associative memory; classification; exponential function; four-layer neural network; mapping; nonlinear decision boundaries; probabilistic neural networks; training sets
Conference_Title :
IEEE International Conference on Neural Networks, 1988
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/ICNN.1988.23887