DocumentCode :
636050
Title :
Optimization of multi-layer artificial neural networks using delta values of hidden layers
Author :
Wagarachchi, N.M. ; Karunananda, A.S.
Author_Institution :
Dept. of Comput. Math., Univ. of Moratuwa, Moratuwa, Sri Lanka
fYear :
2013
fDate :
16-19 April 2013
Firstpage :
80
Lastpage :
86
Abstract :
The number of hidden layers is crucial in multilayer artificial neural networks. In general, the generalization power of the solution can be improved by increasing the number of layers. This paper presents a new method for determining the optimal architecture by using a pruning technique. Unimportant neurons are identified by using the delta values of the hidden layers. The modified network contains fewer neurons and shows better generalization. Moreover, it trains faster than standard back-propagation training. Experiments have been carried out on a number of test problems to verify the effectiveness of the new approach.
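The abstract does not spell out the exact pruning criterion, so the following is only a minimal illustrative sketch of the general idea it describes: train a small multilayer network with back-propagation, accumulate the delta (error) values of each hidden neuron, and prune neurons whose accumulated delta is small. The network sizes, the XOR-style toy data, and the pruning threshold are assumptions for illustration, not the authors' settings.

```python
# Sketch (assumed details, not the authors' exact algorithm): prune hidden
# neurons of a one-hidden-layer MLP by their accumulated |delta| values.
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR data (assumed test problem for illustration)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hidden, n_out = 2, 8, 1           # deliberately oversized hidden layer
W1 = rng.normal(0, 1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 1, (n_hidden, n_out)); b2 = np.zeros(n_out)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr, epochs = 0.5, 5000
delta_sum = np.zeros(n_hidden)            # accumulated |delta| per hidden neuron

for _ in range(epochs):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (standard delta rule for sigmoid units)
    delta_out = (out - y) * out * (1 - out)
    delta_hidden = (delta_out @ W2.T) * h * (1 - h)
    delta_sum += np.abs(delta_hidden).mean(axis=0)

    # Gradient-descent weight updates
    W2 -= lr * h.T @ delta_out;   b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hidden; b1 -= lr * delta_hidden.sum(axis=0)

# Prune hidden neurons whose accumulated delta is small relative to the largest
importance = delta_sum / delta_sum.max()
keep = importance > 0.1                   # assumed threshold
W1, b1, W2 = W1[:, keep], b1[keep], W2[keep, :]
print(f"kept {keep.sum()} of {n_hidden} hidden neurons")
print("pruned-network outputs:", sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).ravel())
```

The design choice mirrored here is that a hidden neuron whose delta stays near zero throughout training contributes little to reducing the error and can be removed, yielding a smaller network; how the paper extends this to multiple hidden layers and selects the cut-off is described in the full text.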
Keywords :
backpropagation; multilayer perceptrons; optimisation; back propagation training; delta values; generalization power; hidden layers; multilayer artificial neural networks; neurons; optimal architecture; optimization; pruning technique; Algorithm design and analysis; Artificial neural networks; Computer architecture; Correlation; Heuristic algorithms; Neurons; Training; Artificial Neural networks; Delta values; Hidden layers; Hidden neurons; Multilayer;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Computational Intelligence, Cognitive Algorithms, Mind, and Brain (CCMB), 2013 IEEE Symposium on
Conference_Location :
Singapore
Type :
conf
DOI :
10.1109/CCMB.2013.6609169
Filename :
6609169