DocumentCode :
3423812
Title :
Enhanced robustness of multilayer perceptron training
Author :
Delashmit, Walter H. ; Manry, Michael T.
Author_Institution :
Lockheed Martin Missiles & Fire Control, Dallas, TX, USA
Volume :
2
fYear :
2002
fDate :
3-6 Nov. 2002
Firstpage :
1029
Abstract :
Due to the chaotic nature of multilayer perceptron training, training error usually fails to be a monotonically non-increasing function of the number of hidden units. An initialization and training methodology is developed to significantly increase the probability that the training error is monotonically non-increasing. First, a structured initialization generates the random weights in a particular order. Second, larger networks are initialized using weights from smaller trained networks. Third, the required number of iterations is calculated as a function of network size.
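The second step (initializing a larger network from a smaller trained one) can be sketched as below for a single-hidden-layer perceptron. This is an illustrative reconstruction, not the paper's implementation: the function name, the zero initialization of new output weights, and the Gaussian scale for new input weights are assumptions.

```python
import numpy as np

def grow_mlp(W_in_small, W_out_small, n_hidden_new, rng=None):
    """Initialize a larger single-hidden-layer MLP from a smaller
    trained one: trained weights are copied, and only the weights
    attached to the added hidden units are drawn at random."""
    rng = np.random.default_rng(rng)
    n_hidden_old, n_inputs = W_in_small.shape
    n_outputs = W_out_small.shape[0]
    assert n_hidden_new >= n_hidden_old

    # Input-to-hidden weights: reuse trained rows, randomize new ones.
    # (The 0.1 standard deviation is an illustrative choice.)
    W_in = np.empty((n_hidden_new, n_inputs))
    W_in[:n_hidden_old] = W_in_small
    W_in[n_hidden_old:] = rng.normal(
        0.0, 0.1, (n_hidden_new - n_hidden_old, n_inputs))

    # Hidden-to-output weights: reuse trained columns and start the
    # new columns at zero, so the grown network initially computes the
    # same mapping as the smaller trained network. Training error then
    # starts at the smaller network's value and cannot jump upward,
    # which is what makes error non-increasing in network size likely.
    W_out = np.zeros((n_outputs, n_hidden_new))
    W_out[:, :n_hidden_old] = W_out_small
    return W_in, W_out
```

Because the added hidden units contribute nothing at initialization, each larger network begins training exactly where the smaller one finished.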
Keywords :
error analysis; learning (artificial intelligence); multilayer perceptrons; enhanced robustness; hidden units; initialization methodology; multilayer perceptron training; network size; random weights generation; structured initialization; trained networks; training error; training methodology; Chaos; Chebyshev approximation; Error correction; Fires; Mean square error methods; Missiles; Multilayer perceptrons; Process control; Reactive power; Robustness;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Signals, Systems and Computers, 2002. Conference Record of the Thirty-Sixth Asilomar Conference on
Conference_Location :
Pacific Grove, CA, USA
ISSN :
1058-6393
Print_ISBN :
0-7803-7576-9
Type :
conf
DOI :
10.1109/ACSSC.2002.1196940
Filename :
1196940