Title :
Generational versus steady-state evolution for optimizing neural network learning
Author :
Bullinaria, J.A.
Author_Institution :
School of Computer Science, The University of Birmingham
Abstract :
The use of simulated evolution is now a commonplace technique for optimizing the learning abilities of neural network systems. Neural network details such as architecture, initial weight distributions, gradient descent learning rates, and regularization parameters have all been successfully evolved to improve performance. The author investigates which evolutionary approaches work best in this field. In particular, he compares the traditional generational approach with a more biologically realistic steady-state approach.
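The sketch below is not the author's implementation; it is a minimal Python illustration of the distinction the abstract draws between the two schemes. It assumes the evolved quantity is a single gradient-descent learning rate and uses a hypothetical placeholder fitness function standing in for "performance of a network trained with that rate".

```python
# Minimal sketch contrasting generational and steady-state evolution of a
# single learning rate. The fitness function is a hypothetical stand-in for
# actual network training performance.
import random

def fitness(learning_rate):
    # Hypothetical fitness: assume an ideal rate near 0.1; higher is better.
    return -abs(learning_rate - 0.1)

def mutate(rate, sigma=0.02):
    # Small Gaussian perturbation, kept positive.
    return max(1e-6, rate + random.gauss(0.0, sigma))

def generational(pop_size=20, generations=50):
    # Generational scheme: the entire population is replaced each generation
    # by offspring of the fitter half.
    pop = [random.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        pop = [mutate(random.choice(parents)) for _ in range(pop_size)]
    return max(pop, key=fitness)

def steady_state(pop_size=20, births=1000):
    # Steady-state scheme: one offspring at a time replaces the worst member,
    # so the population changes incrementally rather than in whole generations.
    pop = [random.uniform(0.0, 1.0) for _ in range(pop_size)]
    for _ in range(births):
        parent = max(random.sample(pop, 3), key=fitness)  # tournament selection
        child = mutate(parent)
        worst = min(range(pop_size), key=lambda i: fitness(pop[i]))
        if fitness(child) > fitness(pop[worst]):
            pop[worst] = child
    return max(pop, key=fitness)

if __name__ == "__main__":
    print("generational best rate:", generational())
    print("steady-state best rate:", steady_state())
```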
Keywords :
gradient methods; learning (artificial intelligence); neural net architecture; optimisation; biologically realistic steady state method; generational approach; gradient descent learning rates; initial weight distributions; neural network learning; neural network systems; optimization; steady state evolution; Computational modeling; Computer architecture; Computer science; Cost function; Electronic mail; Equations; Evolution (biology); Feedforward systems; Neural networks; Steady-state
Conference_Title :
2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004), Proceedings
Print_ISBN :
0-7803-8359-1
DOI :
10.1109/IJCNN.2004.1380984