DocumentCode :
2381499
Title :
Enhanced conjugate gradient methods for training MLP-networks
Author :
Izzeldin, Huzaifa ; Asirvadam, Vijanth S. ; Saad, Nordin
Author_Institution :
Dept. of Electr. & Electron. Eng., Univ. Teknol. PETRONAS, Bandar Seri Iskandar, Malaysia
fYear :
2010
fDate :
13-14 Dec. 2010
Firstpage :
139
Lastpage :
143
Abstract :
The paper investigates enhancements to various conjugate gradient training algorithms applied to a multilayer perceptron (MLP) neural network architecture. Seven conjugate gradient algorithms, proposed by different researchers between 1952 and 2005, are compared with classical batch back-propagation and with the full-memory and memoryless BFGS (Broyden, Fletcher, Goldfarb and Shanno) algorithms. The algorithms are tested on predicting fluid height in two different control-tank benchmark problems. Simulation results show that full-memory BFGS achieves the best overall performance (lowest prediction error), but at the cost of higher memory usage and longer computation time than the conjugate gradient methods.
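The abstract compares conjugate gradient training against back-propagation and BFGS. As a minimal illustrative sketch (not the authors' code), the following trains a one-hidden-layer MLP on a toy regression task with the Fletcher-Reeves nonlinear conjugate gradient update and an Armijo backtracking line search; the network sizes, dataset, and hyperparameters are assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative): predict y = sin(x) on [-pi, pi]
X = rng.uniform(-np.pi, np.pi, size=(64, 1))
y = np.sin(X)

n_in, n_hid, n_out = 1, 8, 1  # assumed small architecture

def unpack(w):
    """Split the flat weight vector into layer matrices and biases."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid); i += n_in * n_hid
    b1 = w[i:i + n_hid]; i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out); i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def loss_and_grad(w):
    """Batch MSE loss and its gradient via back-propagation."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)      # hidden layer activations
    out = h @ W2 + b2             # linear output layer
    err = out - y
    loss = 0.5 * np.mean(err ** 2)
    d_out = err / len(X)          # gradient of the mean through the output
    gW2 = h.T @ d_out
    gb2 = d_out.sum(axis=0)
    d_h = (d_out @ W2.T) * (1 - h ** 2)   # tanh derivative
    gW1 = X.T @ d_h
    gb1 = d_h.sum(axis=0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def backtracking(w, d, g, loss, alpha=1.0, c=1e-4, tau=0.5):
    """Armijo backtracking line search along direction d."""
    while True:
        new_loss, _ = loss_and_grad(w + alpha * d)
        if new_loss <= loss + c * alpha * (g @ d) or alpha < 1e-10:
            return alpha
        alpha *= tau

# Initial weights; first search direction is steepest descent
w = 0.5 * rng.standard_normal(n_in * n_hid + n_hid + n_hid * n_out + n_out)
loss, g = loss_and_grad(w)
d = -g
for k in range(200):
    alpha = backtracking(w, d, g, loss)
    w = w + alpha * d
    new_loss, new_g = loss_and_grad(w)
    beta = (new_g @ new_g) / (g @ g)      # Fletcher-Reeves beta
    d = -new_g + beta * d
    if d @ new_g >= 0:                    # not a descent direction: restart
        d = -new_g
    loss, g = new_loss, new_g

print(f"final training loss: {loss:.4f}")
```

The same loop accommodates the other conjugate gradient variants the paper surveys by swapping the beta formula (e.g. Polak-Ribiere uses `new_g @ (new_g - g) / (g @ g)`); full-memory BFGS would instead maintain an approximate inverse Hessian, which is the memory/accuracy trade-off the abstract reports.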
Keywords :
gradient methods; learning (artificial intelligence); multilayer perceptrons; BFGS; Broyden Fletcher Goldfarb and Shanno; MLP; MLP networks; conjugate gradient methods enhancement; fluid height prediction; gradient training algorithms; multilayer perceptron; neural network architecture; tank benchmark problems; Conjugate Gradient; Neural Network; Offline learning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2010 IEEE Student Conference on Research and Development (SCOReD)
Conference_Location :
Putrajaya
Print_ISBN :
978-1-4244-8647-2
Type :
conf
DOI :
10.1109/SCORED.2010.5703989
Filename :
5703989