DocumentCode :
2662933
Title :
Non-standard norms in genetically trained neural networks
Author :
Morales, Angel Kuri
Author_Institution :
Centro de Investigacion en Comput., Inst. Politecnico Nacional, Mexico City, Mexico
fYear :
2000
fDate :
2000
Firstpage :
43
Lastpage :
51
Abstract :
We discuss alternative norms for training neural networks (NNs), focusing on the so-called multilayer perceptrons (MLPs). To achieve this we rely on a genetic algorithm called the eclectic GA (EGA). By using the EGA we avoid the drawbacks of the standard training algorithm for this sort of NN: backpropagation. We define four measures of distance: the mean exponential error (MEE), the mean absolute error (MAE), the maximum square error (MSE), and the maximum (supremum) absolute error (SAE). We analyze the behavior of an MLP NN on two kinds of problems: classification and forecasting. We discuss the results of applying an EGA to train the NNs and show that the alternative norms yield better results than the traditional RMS norm. We also discuss the resilience of genetically trained NNs to changes in the transfer function of the output layer.
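The four distance measures named in the abstract can be sketched as follows. MAE and SAE follow their standard definitions; the formulas assumed here for MEE and for the "maximum square error" (the abstract's reading of MSE) are plausible reconstructions for illustration, not taken from the paper itself.

```python
import numpy as np

def training_norms(y_true, y_pred):
    """Compute the four error measures named in the abstract.

    MAE and SAE are standard; the MEE and 'maximum square error'
    formulas below are assumptions, not the paper's definitions.
    """
    e = np.abs(np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float))
    return {
        "MAE": e.mean(),            # mean absolute error
        "SAE": e.max(),             # maximum (supremum) absolute error
        "MSE": (e ** 2).max(),      # maximum square error, as named in the abstract
        "MEE": np.expm1(e).mean(),  # assumed: mean of exp(|error|) - 1
    }

# Example: errors of [0.5, 0.0, 1.0] over three outputs
print(training_norms([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))
```

Minimizing SAE or the maximum square error penalizes the single worst-fit pattern, which is one motivation for trying such norms when outliers dominate RMS-trained networks.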
Keywords :
genetic algorithms; learning (artificial intelligence); multilayer perceptrons; pattern classification; backpropagation; classification; eclectic genetic algorithm; forecasting; genetically trained neural networks; maximum absolute error; maximum square error; mean absolute error; mean exponential error; multilayer perceptrons; nonstandard norms; transfer function; Backpropagation algorithms; Convergence; Economic forecasting; Genetic algorithms; Intelligent networks; Multilayer perceptrons; Neural networks; Neurons; Resilience; Transfer functions;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Combinations of Evolutionary Computation and Neural Networks, 2000 IEEE Symposium on
Conference_Location :
San Antonio, TX
Print_ISBN :
0-7803-6572-0
Type :
conf
DOI :
10.1109/ECNN.2000.886218
Filename :
886218