DocumentCode :
3495637
Title :
A fast learning Fully Complex-valued Relaxation Network (FCRN)
Author :
Suresh, S. ; Savitha, R. ; Sundararajan, N.
Author_Institution :
Sch. of Comput. Eng., Nanyang Technol. Univ., Singapore, Singapore
fYear :
2011
fDate :
July 31 2011-Aug. 5 2011
Firstpage :
1372
Lastpage :
1377
Abstract :
This paper presents a fast learning algorithm for a single-hidden-layer complex-valued neural network, the “Fully Complex-valued Relaxation Network (FCRN)”. FCRN employs a fully complex-valued Gaussian-like activation function (sech) in the hidden layer and an exponential activation function in the output layer. FCRN estimates the minimum energy state of a logarithmic error function, which represents both the magnitude and phase errors explicitly, to compute the optimum output weights for randomly chosen hidden-layer parameters. Since the weights are computed by inverting a nonsingular matrix, FCRN requires less computational effort during training. Performance studies on a synthetic function approximation problem and a QAM equalization problem show the improved approximation ability of the proposed FCRN.
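The training scheme summarized in the abstract (randomly chosen hidden-layer parameters, a sech hidden activation, an exponential output activation, and output weights obtained in one linear solve) can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's exact algorithm: the function names are hypothetical, a least-squares solve stands in for the explicit matrix inversion, and fitting the linear part against log-domain targets is an assumed reading of the exponential output activation with a logarithmic error.

```python
import numpy as np

rng = np.random.default_rng(42)

def sech(z):
    # Fully complex-valued Gaussian-like hidden activation.
    return 1.0 / np.cosh(z)

def fcrn_fit(X, Y, n_hidden=60):
    """FCRN-style training sketch (illustrative, not the paper's code).

    X : (n_samples, n_in) complex inputs
    Y : (n_samples, n_out) nonzero complex targets
    """
    n_in = X.shape[1]
    # Randomly chosen complex hidden-layer parameters, fixed during training.
    W = 0.5 * (rng.standard_normal((n_in, n_hidden))
               + 1j * rng.standard_normal((n_in, n_hidden)))
    b = 0.5 * (rng.standard_normal(n_hidden)
               + 1j * rng.standard_normal(n_hidden))
    H = sech(X @ W + b)  # hidden-layer responses
    # Exponential output activation: fit the linear part to log-targets.
    # A least-squares solve replaces the paper's matrix inversion step.
    beta, *_ = np.linalg.lstsq(H, np.log(Y), rcond=None)
    return W, b, beta

def fcrn_predict(X, W, b, beta):
    # Exponential activation in the output layer.
    return np.exp(sech(X @ W + b) @ beta)

# Toy complex-valued function approximation.
X = rng.standard_normal((50, 2)) + 1j * rng.standard_normal((50, 2))
Y = np.exp(0.3 * X[:, :1] * X[:, 1:])  # smooth, everywhere-nonzero target
W, b, beta = fcrn_fit(X, Y)
err = np.linalg.norm(fcrn_predict(X, W, b, beta) - Y) / np.linalg.norm(Y)
```

With more hidden units than training samples, the linear solve interpolates the log-targets, so the training error is essentially zero; this mirrors the single-shot (non-iterative) character of the training described in the abstract.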
Keywords :
function approximation; learning (artificial intelligence); matrix inversion; neural nets; QAM equalization problem; exponential activation function; fast learning algorithm; fully complex-valued Gaussian like activation function; fully complex-valued relaxation network; hidden layer complex-valued neural network; logarithmic error function; nonsingular matrix inversion; synthetic function approximation problem; Approximation algorithms; Energy states; Function approximation; Neurons; Quadrature amplitude modulation; Training;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
ISSN :
2161-4393
Print_ISBN :
978-1-4244-9635-8
Type :
conf
DOI :
10.1109/IJCNN.2011.6033384
Filename :
6033384