Title :
Enhanced robustness of multilayer perceptron training
Author :
Delashmit, Walter H. ; Manry, Michael T.
Author_Institution :
Lockheed Martin Missiles & Fire Control, Dallas, TX, USA
Abstract :
Because multilayer perceptron training is chaotic in nature, the training error usually fails to be a monotonically non-increasing function of the number of hidden units. An initialization and training methodology is developed that significantly increases the probability that the training error is monotonically non-increasing. First, a structured initialization generates the random initial weights in a particular order. Second, larger networks are initialized using the weights of smaller trained networks. Finally, the required number of training iterations is calculated as a function of network size.
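The abstract's first two steps lend themselves to a short illustration. The Python sketch below is an illustrative assumption, not the authors' exact procedure: it shows a "structured" initialization in which hidden unit k always draws its initial weights from the same position in a fixed random sequence (so networks of different sizes share the initial weights of their common units), and the initialization of a larger network from a smaller trained one, with the new unit's output weights set to zero so the larger network starts at the smaller network's final error. The class and function names (MLP, grow_from) and the per-unit seeding scheme are hypothetical.

```python
# Illustrative sketch only -- MLP, grow_from, and the per-unit seeding scheme
# are assumptions for demonstration, not the paper's exact procedure.
import numpy as np


class MLP:
    """Single-hidden-layer perceptron with sigmoid hidden units and linear outputs."""

    def __init__(self, n_in, n_hidden, n_out, base_seed=0):
        self.n_in, self.n_hidden, self.n_out = n_in, n_hidden, n_out
        self.base_seed = base_seed
        # Structured initialization: hidden unit k's input weights depend only on
        # (base_seed, k), so nets with different n_hidden share their common units.
        self.W = np.vstack([
            np.random.default_rng(base_seed + k).normal(0.0, 0.5, n_in + 1)
            for k in range(n_hidden)
        ])                                             # shape (n_hidden, n_in + 1)
        self.V = np.zeros((n_out, n_hidden + 1))       # output weights start at zero

    def _hidden(self, X):
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias input
        return 1.0 / (1.0 + np.exp(-Xb @ self.W.T))    # sigmoid activations

    def forward(self, X):
        H = self._hidden(X)
        Hb = np.hstack([H, np.ones((H.shape[0], 1))])
        return Hb @ self.V.T

    def train(self, X, Y, n_iters, lr=0.05):
        """Batch gradient descent on mean squared error; returns final MSE."""
        Xb = np.hstack([X, np.ones((X.shape[0], 1))])
        for _ in range(n_iters):
            H = self._hidden(X)
            Hb = np.hstack([H, np.ones((H.shape[0], 1))])
            E = Hb @ self.V.T - Y                      # output error, (N, n_out)
            D = (E @ self.V[:, :-1]) * H * (1.0 - H)   # backprop through sigmoids
            self.V -= lr * E.T @ Hb / X.shape[0]
            self.W -= lr * D.T @ Xb / X.shape[0]
        return float(np.mean((self.forward(X) - Y) ** 2))


def grow_from(small, n_hidden_new):
    """Initialize a larger net from a smaller trained one (n_hidden_new > small.n_hidden)."""
    big = MLP(small.n_in, n_hidden_new, small.n_out, base_seed=small.base_seed)
    big.W[:small.n_hidden] = small.W                        # reuse trained hidden weights
    big.V[:, :small.n_hidden] = small.V[:, :small.n_hidden] # reuse trained output weights
    big.V[:, -1] = small.V[:, -1]                           # reuse output bias
    return big                                              # new units' output weights stay zero


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, (200, 2))
    Y = np.sin(3.0 * X[:, :1]) + X[:, 1:]

    net3 = MLP(n_in=2, n_hidden=3, n_out=1)
    mse3 = net3.train(X, Y, n_iters=3000)
    net4 = grow_from(net3, n_hidden_new=4)                  # starts at net3's solution
    mse4 = net4.train(X, Y, n_iters=4000)                   # more iterations for the larger net
    print(f"3 hidden units: MSE {mse3:.5f}   4 hidden units: MSE {mse4:.5f}")
```

Because the grown network begins exactly at the smaller network's solution, continued training tends not to increase the training MSE, which is the monotone non-increasing behaviour the paper aims to make probable. The abstract's third step, scaling the required iteration count with network size, is only approximated here by passing a larger n_iters to the bigger net.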
Keywords :
error analysis; learning (artificial intelligence); multilayer perceptrons; enhanced robustness; hidden units; initialization methodology; multilayer perceptron training; network size; random weights generation; structured initialization; trained networks; training error; training methodology; chaos; mean square error methods; robustness
Conference_Title :
Conference Record of the Thirty-Sixth Asilomar Conference on Signals, Systems and Computers, 2002
Conference_Location :
Pacific Grove, CA, USA
Print_ISBN :
0-7803-7576-9
DOI :
10.1109/ACSSC.2002.1196940