Title :
Evaluation and improvement of two training algorithms
Author :
Kim, Tae-Hoon ; Li, Jiang ; Manry, Michael T.
Author_Institution :
Dept. of Electr. Eng., Texas Univ., Arlington, TX, USA
Abstract :
Two effective neural network training algorithms are output weight optimization - hidden weight optimization (OWO-HWO) and conjugate gradient (CG). The former performs better on correlated data, while the latter performs better on random data. Based on these and other observations, we develop a procedure for testing general neural network training algorithms. Since a good training algorithm should perform well on all kinds of data, we develop alternation algorithms, which execute runs of the different algorithms in turn. The alternation algorithm works well for both kinds of data.
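The alternation idea described above can be sketched as a training schedule that switches between two optimizers in fixed-length runs. This is a minimal, hypothetical illustration on a toy linear model: the two update rules below are simple gradient-step stand-ins, not the paper's OWO-HWO or conjugate gradient implementations, and the run lengths are arbitrary.

```python
# Hedged sketch of an alternation schedule: fixed-length runs of two
# training algorithms applied in turn. Stand-in updates, toy data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=100)

def step_a(w, lr=0.1):
    # stand-in for one pass of algorithm A (e.g. OWO-HWO in the paper)
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def step_b(w, lr=0.2):
    # stand-in for one pass of algorithm B (e.g. conjugate gradient)
    grad = X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def alternate(w, runs=10, run_len=10):
    # alternate fixed-length runs of the two algorithms
    for r in range(runs):
        step = step_a if r % 2 == 0 else step_b
        for _ in range(run_len):
            w = step(w)
    return w

w0 = np.zeros(4)
mse_before = np.mean((X @ w0 - y) ** 2)
w = alternate(w0)
mse_after = np.mean((X @ w - y) ** 2)
```

In this toy setting both stand-in algorithms are identical except for step size; the point is only the scheduling structure, where each algorithm gets uninterrupted runs rather than interleaving single steps.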
Keywords :
conjugate gradient methods; correlation methods; learning (artificial intelligence); multilayer perceptrons; optimisation; random processes; CG algorithm; MLP training algorithms; alternation algorithms; conjugate gradient; conjugate gradient algorithm; correlated data; hidden weight optimization; multilayer perceptron; neural network training algorithms; output weight optimization; random data; Artificial neural networks; Character generation; Multilayer perceptrons; Neural networks; Nonhomogeneous media; Random number generation; Testing; Training data; Upper bound; Vectors;
Conference_Title :
Conference Record of the Thirty-Sixth Asilomar Conference on Signals, Systems and Computers, 2002
Conference_Location :
Pacific Grove, CA, USA
Print_ISBN :
0-7803-7576-9
DOI :
10.1109/ACSSC.2002.1196938