Title :
Use of Backpropagation and Differential Evolution Algorithms to Train MLPs
Author :
Camargo, Luiz Carlos ; Correa Tissot, Hegler ; Ramirez Pozo, Aurora Trinidad
Author_Institution :
Dept. de Inf. (INF), Univ. Fed. do Parana (UFPR), Curitiba, Brazil
Abstract :
Artificial Neural Networks (ANNs) are often trained to find a general solution to problems in which a pattern must be extracted, such as data classification. The feedforward neural network (FFNN) is one of the main ANN architectures, and the multilayer perceptron (MLP) is a type of FFNN. Backpropagation (BP), based on gradient descent, is one of the most widely used algorithms for MLP training. Evolutionary algorithms, including the Differential Evolution (DE) algorithm, can also be used to train MLPs. In this paper, BP and DE are used to train MLPs and are compared across four different approaches: (a) backpropagation, (b) DE with fixed parameter values, (c) DE with adaptive parameter values, and (d) a hybrid alternative combining both DE and BP.
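Illustrative_Example :
A minimal sketch of approach (b), DE with fixed parameter values, applied to training an MLP's weights is shown below. The network size, the DE parameters (NP, F, CR), and the XOR toy task are assumptions chosen for illustration only; they are not taken from the paper and do not reproduce the authors' experimental setup.

# Illustrative sketch, not the authors' implementation: a tiny one-hidden-layer
# MLP trained on XOR by evolving its flattened weight vector with a basic
# DE/rand/1/bin loop using fixed F and CR.
import numpy as np

rng = np.random.default_rng(0)

# Toy classification data (XOR)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

N_IN, N_HID, N_OUT = 2, 4, 1
DIM = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # total weights + biases

def unpack(w):
    """Split a flat vector into the MLP's weight matrices and bias vectors."""
    i = 0
    W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
    b1 = w[i:i + N_HID]; i += N_HID
    W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
    b2 = w[i:i + N_OUT]
    return W1, b1, W2, b2

def mse(w):
    """Fitness: mean squared error of the MLP encoded by weight vector w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)                       # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))     # sigmoid output
    return np.mean((out.ravel() - y) ** 2)

# DE with fixed parameter values (assumed settings, for illustration)
NP, F, CR, GENS = 30, 0.8, 0.9, 500
pop = rng.uniform(-1, 1, size=(NP, DIM))
fit = np.array([mse(ind) for ind in pop])

for _ in range(GENS):
    for i in range(NP):
        idx = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        a, b, c = pop[idx]
        mutant = a + F * (b - c)                   # DE/rand/1 mutation
        cross = rng.random(DIM) < CR
        cross[rng.integers(DIM)] = True            # keep at least one mutant gene
        trial = np.where(cross, mutant, pop[i])    # binomial crossover
        f_trial = mse(trial)
        if f_trial <= fit[i]:                      # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best MSE:", fit.min())

In the hybrid approach (d) described in the abstract, such a DE loop would be combined with BP, e.g. by refining candidate weight vectors with gradient steps; the sketch above covers only the fixed-parameter DE case.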
Keywords :
backpropagation; evolutionary computation; gradient methods; multilayer perceptrons; neural net architecture; ANN architectures; ANNs; DE algorithm; FFNN; MLP training; artificial neural networks; backpropagation; data classification; differential evolution algorithms; feedforward; gradient descent; multilayer perceptron; pattern extraction; Artificial neural networks; Backpropagation; Databases; Sociology; Statistics; Training; Vectors; Artificial Neural Network; Backpropagation (BP) algorithm; Differential Evolution (DE) algorithm; Multilayer Perceptron;
Conference_Title :
2012 31st International Conference of the Chilean Computer Science Society (SCCC)
Conference_Location :
Valparaiso
Print_ISBN :
978-1-4799-2937-5
DOI :
10.1109/SCCC.2012.17