DocumentCode :
1446156
Title :
Hybrid Training Method for MLP: Optimization of Architecture and Training
Author :
Zanchettin, Cleber ; Ludermir, Teresa B. ; Almeida, Leandro Maciel
Author_Institution :
Center of Inf., Fed. Univ. of Pernambuco, Recife, Brazil
Volume :
41
Issue :
4
fYear :
2011
Firstpage :
1097
Lastpage :
1109
Abstract :
The performance of an artificial neural network (ANN) depends upon the selection of proper connection weights, network architecture, and cost function during network training. This paper presents a hybrid approach (GaTSa) to optimize the performance of the ANN in terms of architecture and weights. GaTSa is an extension of a previous method (TSa) proposed by the authors. GaTSa is based on the integration of the heuristics simulated annealing (SA), tabu search (TS), and genetic algorithms (GA) with backpropagation, whereas TSa does not use GA. The main advantages of GaTSa are the following: a constructive process to add new nodes to the architecture based on GA, the ability to escape from local minima with uphill moves (SA feature), and faster convergence through the evaluation of a set of solutions (TS feature). The performance of GaTSa is investigated through an empirical evaluation of 11 public-domain data sets using different cost functions in the simultaneous optimization of the multilayer perceptron ANN architecture and weights. Experiments demonstrated that GaTSa can also be used for relevant feature selection. GaTSa presented statistically significant results in comparison with other global and local optimization techniques.
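The abstract's three ingredients can be illustrated with a toy hybrid optimizer. This is a minimal sketch of the general SA + TS + GA combination described above, not the authors' GaTSa implementation: all function names, parameters, and the toy objective are assumptions for illustration, and backpropagation and the constructive node-addition step are omitted.

```python
import math
import random

def hybrid_minimize(f, dim=3, pop_size=8, iters=500, t0=1.0, alpha=0.99,
                    tabu_len=20, seed=0):
    """Minimize f over R^dim with a toy SA+TS+GA hybrid:
    - a population of candidates is recombined and mutated (GA feature),
    - worse moves are accepted with Boltzmann probability (SA feature),
    - recently visited points are kept on a tabu list (TS feature)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    tabu = []                       # recently visited (rounded) points
    best = min(pop, key=f)
    t = t0                          # SA temperature
    for _ in range(iters):
        # GA step: average two tournament-selected parents, then mutate.
        a, b = sorted(rng.sample(pop, 4), key=f)[:2]
        child = [(x + y) / 2 + rng.gauss(0, t) for x, y in zip(a, b)]
        key = tuple(round(x, 2) for x in child)
        if key in tabu:             # TS step: skip recently visited points
            continue
        tabu.append(key)
        if len(tabu) > tabu_len:
            tabu.pop(0)
        worst = max(pop, key=f)
        delta = f(child) - f(worst)
        # SA step: always accept improvements; accept uphill moves
        # with probability exp(-delta / t), enabling escape from local minima.
        if delta < 0 or rng.random() < math.exp(-delta / max(t, 1e-9)):
            pop[pop.index(worst)] = child
        if f(child) < f(best):
            best = child
        t *= alpha                  # geometric cooling schedule
    return best

# Toy objective: sphere function, with its minimum at the origin.
sol = hybrid_minimize(lambda v: sum(x * x for x in v))
print(sol)
```

In GaTSa these pieces operate on MLP weight vectors and architectures rather than on a generic real vector, with backpropagation refining the weights locally.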
Keywords :
genetic algorithms; neural nets; search problems; ANN; GA; GaTSa; MLP; SA; TS; artificial neural network; cost function; hybrid training method; multilayer perceptron; network architecture; connection weights; public-domain data sets; simulated annealing; tabu search; Algorithm design and analysis; Artificial neural networks; Biological cells; Cost function; Genetic algorithms; Training; Genetic algorithms (GAs); multilayer perceptron (MLP); optimization; simulated annealing; tabu search (TS)
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
Publisher :
IEEE
ISSN :
1083-4419
Type :
jour
DOI :
10.1109/TSMCB.2011.2107035
Filename :
5710589