DocumentCode :
2663275
Title :
Specifying intrinsically adaptive architectures
Author :
Lucas, Simon
Author_Institution :
Dept. of Electron. Syst. Eng., Essex Univ., Colchester, UK
fYear :
2000
fDate :
2000
Firstpage :
224
Lastpage :
231
Abstract :
The paper describes a method for specifying (and evolving) intrinsically adaptive neural architectures. These architectures have back-propagation-style gradient descent behavior built into them at a cellular level. The significance of this is that we can now use back-propagation to train evolved feedforward networks of any structure (provided that individual nodes are differentiable). Networks evolved in this way can potentially adapt to their environment in situ. This is in contrast to more conventional techniques such as using a genetic algorithm or simulated annealing to train the network. The method can be seamlessly integrated with any method for evolving neural network architectures. The performance of the method is investigated on the simple synthetic benchmarks of the parity and intertwined spiral problems.
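The abstract's central claim is that back-propagation can train an evolved feedforward network of any structure, so long as each node is differentiable. The following is a minimal illustrative sketch of that idea, not the paper's actual cellular specification: gradient descent over an arbitrary feedforward DAG of sigmoid units (here with skip connections, to show the topology need not be layered), applied to the 2-bit parity (XOR) benchmark mentioned in the abstract. The topology and hyperparameters are invented for illustration.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class DAGNet:
    """Feedforward net over an arbitrary DAG of sigmoid units.

    Nodes 0..n-1 are in topological order; edges[j] lists the nodes
    feeding node j. The first n_inputs nodes are inputs; the last
    node is the output.
    """
    def __init__(self, edges, n_inputs):
        self.edges, self.n_inputs, self.n = edges, n_inputs, len(edges)
        self.w = {(j, i): random.uniform(-1, 1)
                  for j in range(self.n) for i in edges[j]}
        self.b = [random.uniform(-1, 1) for _ in range(self.n)]

    def forward(self, x):
        a = [0.0] * self.n
        for j in range(self.n):
            if j < self.n_inputs:
                a[j] = x[j]
            else:
                s = self.b[j] + sum(self.w[(j, i)] * a[i]
                                    for i in self.edges[j])
                a[j] = sigmoid(s)
        return a

    def train_step(self, x, target, lr=1.0):
        a = self.forward(x)
        out = self.n - 1
        delta = [0.0] * self.n  # dE/d(pre-activation) per node
        delta[out] = (a[out] - target) * a[out] * (1.0 - a[out])
        # Reverse topological sweep: every consumer of node j has a
        # higher index, so delta[j] is complete when j is reached.
        for j in range(self.n - 1, self.n_inputs - 1, -1):
            for i in self.edges[j]:
                if i >= self.n_inputs:  # input nodes carry no delta
                    delta[i] += delta[j] * self.w[(j, i)] * a[i] * (1.0 - a[i])
                self.w[(j, i)] -= lr * delta[j] * a[i]
            self.b[j] -= lr * delta[j]
        return 0.5 * (a[out] - target) ** 2

# XOR (2-bit parity) on an irregular topology: inputs 0,1 feed hidden
# nodes 2,3; output node 4 also sees the raw inputs (skip connections).
edges = {0: [], 1: [], 2: [0, 1], 3: [0, 1], 4: [0, 1, 2, 3]}
net = DAGNet(edges, n_inputs=2)
data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]

loss0 = sum(0.5 * (net.forward(x)[-1] - t) ** 2 for x, t in data)
for _ in range(4000):
    for x, t in data:
        net.train_step(x, t)
loss_final = sum(0.5 * (net.forward(x)[-1] - t) ** 2 for x, t in data)
```

Because the backward pass only needs a reverse topological ordering, the same training code works for any evolved DAG, which is the structural point the abstract makes; the paper's contribution is to build this gradient behavior into the cells themselves rather than into an external trainer.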
Keywords :
adaptive systems; backpropagation; evolutionary computation; feedforward neural nets; formal specification; neural net architecture; back-propagation style gradient descent behavior; cellular level; feedforward networks; intertwined spiral problems; intrinsically adaptive architecture specification; intrinsically adaptive neural architectures; neural network architecture evolution; parity problems; synthetic benchmarks; Feedforward neural networks; Feedforward systems; Function approximation; Genetic programming; Modeling; Neural networks; Phase measurement; Simulated annealing; Spirals; Systems engineering and theory;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2000 IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks
Conference_Location :
San Antonio, TX
Print_ISBN :
0-7803-6572-0
Type :
conf
DOI :
10.1109/ECNN.2000.886238
Filename :
886238