DocumentCode :
3752591
Title :
Moth-flame optimization for training Multi-Layer Perceptrons
Author :
Waleed Yamany;Mohammed Fawzy;Alaa Tharwat;Aboul Ella Hassanien
Author_Institution :
Fayoum University, Faculty of Computers and Information, Egypt
fYear :
2015
Firstpage :
267
Lastpage :
272
Abstract :
The Multi-Layer Perceptron (MLP) is a type of Feed-Forward Neural Network (FFNN). Finding suitable weights and biases for an MLP is essential to minimizing its training error. In this paper, the Moth-Flame Optimizer (MFO) is used to train MLPs: the resulting MFO-MLP searches for the weights and biases that minimize the training error and maximize the classification rate. Five standard classification datasets are used to evaluate the proposed method, and three function-approximation datasets are used to further test its performance. MFO-MLP is compared with four well-known optimization algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Evolution Strategy (ES). The experimental results show that the MFO algorithm is highly competitive, avoids entrapment in local optima, and achieves high accuracy.
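The training scheme the abstract describes can be sketched as follows: the MLP's weights and biases are flattened into a single vector, the fitness of a vector is the mean squared training error, and a moth-flame-style spiral search minimizes that fitness. This is a minimal illustrative sketch, not the authors' implementation: the tiny one-hidden-layer network, the XOR stand-in dataset, and all parameter values (`n_moths`, `iters`, the spiral constant `b`) are assumptions chosen for brevity, and the flame-update bookkeeping is simplified relative to the full MFO algorithm.

```python
import math
import random

random.seed(0)

# Assumed toy architecture: 2 inputs, H hidden tanh units, 1 sigmoid output.
H = 4
DIM = 2 * H + H + H + 1  # input weights + hidden biases + output weights + output bias

def mlp_forward(w, x):
    """Forward pass; w is the flat weight/bias vector the optimizer searches."""
    hid = [math.tanh(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[2 * H + j])
           for j in range(H)]
    o = sum(w[3 * H + j] * hid[j] for j in range(H)) + w[4 * H]
    o = max(-60.0, min(60.0, o))          # clamp to avoid exp overflow
    return 1.0 / (1.0 + math.exp(-o))

# XOR as a stand-in task (the paper uses five standard classification datasets).
DATA = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def fitness(w):
    """Mean squared training error -- the quantity MFO minimizes."""
    return sum((mlp_forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def mfo_train(n_moths=30, iters=300, b=1.0):
    """Simplified moth-flame search over the flat weight vector."""
    moths = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(n_moths)]
    flames = sorted([list(m) for m in moths], key=fitness)  # best-so-far positions
    for it in range(iters):
        # The number of flames shrinks linearly so the swarm converges.
        n_flames = round(n_moths - it * (n_moths - 1) / iters)
        a = -1.0 - it / iters             # t is drawn from [a, 1]; a decays -1 -> -2
        for i, m in enumerate(moths):
            f = flames[min(i, n_flames - 1)]
            for d in range(DIM):
                dist = abs(f[d] - m[d])
                t = (a - 1.0) * random.random() + 1.0
                # Logarithmic spiral flight of the moth around its flame.
                m[d] = dist * math.exp(b * t) * math.cos(2 * math.pi * t) + f[d]
        # Elitist flame update: keep the best positions seen so far.
        flames = sorted(flames + [list(m) for m in moths], key=fitness)[:n_moths]
    return flames[0]

best = mfo_train()
print("final training MSE:", fitness(best))
```

Encoding the whole network as one real-valued vector is what lets any population-based metaheuristic (GA, PSO, ACO, ES, or MFO) act as the MLP trainer: only the `fitness` function touches the network, so the optimizer is interchangeable.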
Keywords :
"Artificial neural networks","Iris","Heart","Manganese","Optimization","Topology","Breast cancer"
Publisher :
ieee
Conference_Title :
2015 11th International Computer Engineering Conference (ICENCO)
Type :
conf
DOI :
10.1109/ICENCO.2015.7416360
Filename :
7416360