DocumentCode :
262054
Title :
Enhanced Gradient Descent Algorithms for Complex-Valued Neural Networks
Author :
Popa, Calin-Adrian
Author_Institution :
Dept. of Comput. & Software Eng., Polytech. Univ. Timisoara, Timisoara, Romania
fYear :
2014
fDate :
22-25 Sept. 2014
Firstpage :
272
Lastpage :
279
Abstract :
In this paper, enhanced gradient descent learning algorithms for complex-valued feedforward neural networks are proposed. The best-known such enhanced algorithms for real-valued neural networks are quickprop, resilient backpropagation, delta-bar-delta, and SuperSAB, so it is natural to extend these learning methods to complex-valued neural networks as well. The complex variants of these four algorithms are presented and then exemplified on several function approximation problems, as well as on channel equalization and time series prediction applications. Experimental results show a significant improvement in training and testing error over the classical gradient descent and gradient descent with momentum algorithms.
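The enhanced algorithms in the paper all build on plain complex-valued gradient descent, where the steepest-descent direction of a real-valued loss with respect to complex weights is the conjugate (Wirtinger) gradient. As a minimal illustrative sketch (not the paper's own code), the baseline method with momentum can be written for a linear complex model as follows; the toy data, learning rate, and momentum constant are assumptions for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: recover complex weights w_true from y = X @ w_true.
w_true = np.array([1 + 2j, -0.5 + 1j])
X = rng.standard_normal((64, 2)) + 1j * rng.standard_normal((64, 2))
y = X @ w_true

# For the real loss L(w) = mean |X w - y|^2, the steepest-descent
# direction is the Wirtinger gradient dL/d(conj(w)) = X^H (X w - y) / N.
w = np.zeros(2, dtype=complex)
v = np.zeros_like(w)            # momentum buffer
lr, beta = 0.1, 0.9             # assumed hyperparameters
for _ in range(200):
    err = X @ w - y
    grad = X.conj().T @ err / len(y)
    v = beta * v - lr * grad    # gradient descent with momentum
    w = w + v

loss = np.mean(np.abs(X @ w - y) ** 2)
```

The enhanced variants discussed in the paper (quickprop, resilient backpropagation, delta-bar-delta, SuperSAB) replace the fixed learning rate `lr` with per-weight, adaptively updated step sizes.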
Keywords :
backpropagation; feedforward neural nets; function approximation; gradient methods; time series; SuperSAB; channel equalization; complex-valued feedforward neural networks; delta-bar-delta; enhanced gradient descent learning algorithms; function approximation problems; quickprop; real-valued neural networks; resilient backpropagation; testing error; time series prediction applications; training error; Approximation algorithms; Biological neural networks; Heuristic algorithms; Neurons; Signal processing algorithms; Testing; Training; Channel equalization; Complex-valued neural networks; Delta-bar-delta; Quickprop; Resilient backpropagation; SuperSAB; Time series prediction;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), 2014 16th International Symposium on
Conference_Location :
Timisoara
Print_ISBN :
978-1-4799-8447-3
Type :
conf
DOI :
10.1109/SYNASC.2014.44
Filename :
7034694