DocumentCode
409673
Title
New training algorithms for dependency initialized multilayer perceptrons
Author
Delashmit, W.H.; Manry, Michael T.
Author_Institution
Lockheed Martin, Dallas, TX, USA
Volume
1
fYear
2003
fDate
9-12 Nov. 2003
Firstpage
581
Abstract
Because multilayer perceptron training is chaotic in nature, training error usually fails to be a monotonically nonincreasing function of the number of hidden units. New training algorithms are developed in which the weights and thresholds of a well-trained smaller network are used to initialize a larger network. Methods are also developed to reduce the total amount of training required. It is shown that this technique yields an error curve that is a monotonically nonincreasing function of the number of hidden units and significantly reduces training complexity. Additional results are presented for different probability distributions used to generate the initial weights.
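The monotonicity property described in the abstract follows from one observation: if the grown network copies all of the smaller network's weights and thresholds and gives the new hidden units zero output weights, it initially computes exactly the same mapping as the smaller network, so its error starts at the smaller network's level and further training can only reduce it. The following is a minimal NumPy sketch of that kind of dependency-style growth, based only on this reading of the abstract; the function name grow_hidden_layer, the single-hidden-layer layout, and the zero-mean Gaussian initial-weight distribution are illustrative assumptions, not details taken from the paper.

import numpy as np

def grow_hidden_layer(W1, b1, W2, b2, n_new, init_scale=0.1, rng=None):
    # W1: (n_hidden, n_in)  input-to-hidden weights of the trained smaller net
    # b1: (n_hidden,)       hidden-unit thresholds
    # W2: (n_out, n_hidden) hidden-to-output weights
    # b2: (n_out,)          output thresholds
    if rng is None:
        rng = np.random.default_rng()
    n_in = W1.shape[1]
    n_out = W2.shape[0]

    # New hidden units draw input weights and thresholds from a chosen
    # probability distribution (a zero-mean Gaussian is assumed here).
    W1_new = rng.normal(0.0, init_scale, size=(n_new, n_in))
    b1_new = rng.normal(0.0, init_scale, size=n_new)

    # Zero output weights for the new units make the grown network's
    # output identical to the smaller network's before retraining, so
    # training error cannot increase as hidden units are added.
    W1g = np.vstack([W1, W1_new])
    b1g = np.concatenate([b1, b1_new])
    W2g = np.hstack([W2, np.zeros((n_out, n_new))])
    return W1g, b1g, W2g, b2.copy()

# Example: grow a net with 4 hidden units to 6 before further training.
W1g, b1g, W2g, b2g = grow_hidden_layer(
    np.ones((4, 3)), np.zeros(4), np.ones((2, 4)), np.zeros(2), n_new=2)

Swapping the Gaussian draw for another distribution (e.g. uniform) corresponds to the abstract's comparison of different initial-weight distributions.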
Keywords
learning (artificial intelligence); multilayer perceptrons; probability; chaotic nature; dependency initialized multilayer perceptron; probability distribution; training algorithm; training error; Chaos; Complex networks; Error correction; Mean square error methods; Multilayer perceptrons; Probability distribution; Training data
fLanguage
English
Publisher
IEEE
Conference_Titel
Conference Record of the Thirty-Seventh Asilomar Conference on Signals, Systems and Computers, 2003
Print_ISBN
0-7803-8104-1
Type
conf
DOI
10.1109/ACSSC.2003.1291977
Filename
1291977