Title :
Constructive proof of efficient pattern storage in the multi-layer perceptron
Author :
Gopalakrishnan, Arunachalam ; Jiang, Xiangping ; Chen, Mu-Song ; Manry, Michael T.
Author_Institution :
Dept. of Electr. Eng., Texas Univ., Arlington, TX, USA
Abstract :
We show that the pattern storage capability of the Gabor polynomial is much higher than the commonly used lower bound on multi-layer perceptron (MLP) pattern storage. We also show that MLP networks with second- and third-degree polynomial activations can be constructed to implement Gabor polynomials efficiently, and therefore attain the same high pattern storage capability. These polynomial networks can be mapped to conventional sigmoidal MLPs of the same efficiency. It is shown that training techniques such as output weight optimization and conjugate gradient attain only the lower bound of pattern storage; they are therefore not the final solution to the MLP training problem.
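For orientation, a minimal sketch of the parameter-counting argument behind these capacity claims, assuming the standard definitions of pattern storage and of a degree-P Gabor (polynomial) model; the symbols N, P, N_w, and M below are illustrative and not taken from the paper:
\[
y(\mathbf{x}) \;=\; \sum_{\substack{p_1,\dots,p_N \ge 0 \\ p_1+\cdots+p_N \le P}} w_{p_1 \cdots p_N}\, x_1^{p_1} \cdots x_N^{p_N},
\qquad
\text{free coefficients} \;=\; \binom{N+P}{P},
\]
\[
\text{MLP with } N_w \text{ weights and } M \text{ outputs:}\qquad
\text{commonly used storage lower bound} \;\approx\; \left\lceil \tfrac{N_w}{M} \right\rceil \text{ patterns},
\]
since each stored pattern imposes one equation per output on the free weights. Under this counting, a degree-P Gabor polynomial can memorize on the order of \(\binom{N+P}{P}\) patterns, which exceeds the usual MLP lower bound for comparably sized networks; an MLP constructed to implement the polynomial would inherit that higher storage.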
Keywords :
approximation theory; feedforward neural nets; learning (artificial intelligence); multilayer perceptrons; polynomials; Gabor polynomial; MLP training problem; conjugate gradient; lower bound; multi-layer perceptron; multi-layer perceptron networks; output weight optimization; pattern storage; second degree polynomial activation; sigmoidal MLP; third degree polynomial activation; training techniques; Differential equations; Joining processes; Multilayer perceptrons; Neural networks; Polynomials; Terminology; Upper bound;
Conference_Title :
Conference Record of the Twenty-Seventh Asilomar Conference on Signals, Systems and Computers, 1993
Conference_Location :
Pacific Grove, CA
Print_ISBN :
0-8186-4120-7
DOI :
10.1109/ACSSC.1993.342540