Title :
A novel multi-type architecture for FANNs
Author :
Ho, K.C. ; Ponnapalli, P.V.S. ; Thomson, M.
Author_Institution :
Dept. of Electr. & Electron. Eng., Manchester Metropolitan Univ., UK
Abstract :
This paper presents a novel multi-type architecture for feedforward artificial neural networks (FANNs) which offers improved speed of convergence, reduced computational complexity and improved generalization ability. The proposed architecture incorporates at least one linear node in the hidden layer. A theoretical analysis is presented to compare the rate of change of weights associated with nonlinear and linear hidden node connections. Simulation results demonstrate that the new architecture can significantly improve convergence and reduce the computational time of FANN training while providing better generalization capability. Such an architecture can be extremely useful for on-line training of FANNs.
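Illustration: the abstract describes a hidden layer that mixes nonlinear nodes with at least one linear node. The following is a minimal sketch of such a forward pass, assuming sigmoid activations for the nonlinear nodes and an identity activation for the linear node; it is not the authors' implementation, and all function and parameter names here are illustrative only.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W_h, b_h, W_o, b_o, n_linear=1):
    """Single hidden layer in which the last `n_linear` hidden nodes
    are linear (identity activation) and the rest are sigmoidal."""
    a = W_h @ x + b_h                  # hidden pre-activations
    h = sigmoid(a)                     # nonlinear hidden nodes
    h[-n_linear:] = a[-n_linear:]      # linear hidden node(s)
    return W_o @ h + b_o               # output layer

# Tiny example: 2 inputs, 4 hidden nodes (3 sigmoid + 1 linear), 1 output
rng = np.random.default_rng(0)
x = rng.standard_normal(2)
W_h, b_h = rng.standard_normal((4, 2)), np.zeros(4)
W_o, b_o = rng.standard_normal((1, 4)), np.zeros(1)
print(forward(x, W_h, b_h, W_o, b_o))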
Keywords :
computational complexity; FANN training; FANNs; backpropagation; computational time; convergence; feedforward artificial neural networks; generalization; hidden layer; hidden node connections; multi-type architecture;
Conference_Titel :
Artificial Neural Networks, Fifth International Conference on (Conf. Publ. No. 440)
Conference_Location :
Cambridge
Print_ISBN :
0-85296-690-3
DOI :
10.1049/cp:19970733