DocumentCode :
2871131
Title :
A new method to construct FNN with linear output neurons
Author :
Wang, Xiangdong ; Chen, Yongmei ; Shi, Linchu
Author_Institution :
Artificial Neural Network Group, Beijing, China
Volume :
2
fYear :
1998
fDate :
1998
Firstpage :
1319
Abstract :
In this paper, a new network-growing method for multilayer feedforward neural networks (FNNs) is proposed. It has two distinctive features: 1) training starts with a small network that gradually grows its hidden layer by adding neurons, and 2) the activation function of the output neurons is linear. Its application to pattern recognition is also discussed. Simulation results show that the new algorithm achieves a higher recognition rate and converges faster than the conventional backpropagation algorithm, and that it can escape local minima by adding hidden neurons.
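The paper itself gives the full growing procedure; as a rough illustration only, the idea described in the abstract (start small, sigmoid hidden layer, linear output neurons, add a hidden neuron when training stalls) can be sketched as below. All function names, growth criteria, and hyperparameters here are assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def train_growing_fnn(X, Y, max_hidden=16, epochs=500, lr=0.05, tol=1e-3, seed=0):
    """Illustrative sketch: FNN with sigmoid hidden neurons and LINEAR
    output neurons. Starts with one hidden neuron and adds another
    (hypothetical growth criterion) whenever the loss stops improving."""
    rng = np.random.default_rng(seed)
    n_in, n_out = X.shape[1], Y.shape[1]
    h = 1  # start with a small network
    W1 = rng.normal(0, 0.5, (n_in, h)); b1 = np.zeros(h)
    W2 = rng.normal(0, 0.5, (h, n_out)); b2 = np.zeros(n_out)

    def forward(X):
        H = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))  # sigmoid hidden layer
        return H, H @ W2 + b2                     # linear output layer

    prev_loss = np.inf
    loss = np.inf
    for _ in range(epochs):
        H, out = forward(X)
        err = out - Y
        loss = 0.5 * np.mean(err ** 2)
        # backpropagation for linear output / sigmoid hidden units
        dW2 = H.T @ err / len(X); db2 = err.mean(0)
        dH = (err @ W2.T) * H * (1.0 - H)
        dW1 = X.T @ dH / len(X); db1 = dH.mean(0)
        W1 -= lr * dW1; b1 -= lr * db1
        W2 -= lr * dW2; b2 -= lr * db2
        if prev_loss - loss < tol and h < max_hidden:
            # growth step: append one freshly initialised hidden neuron
            h += 1
            W1 = np.hstack([W1, rng.normal(0, 0.5, (n_in, 1))])
            b1 = np.append(b1, 0.0)
            W2 = np.vstack([W2, rng.normal(0, 0.5, (1, n_out))])
        prev_loss = loss
    return forward, loss, h
```

The linear output layer means no squashing at the output, so the final mapping is a weighted sum of sigmoid features, and growth (rather than restarting) lets training continue from the current weights when progress stalls.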
Keywords :
convergence; feedforward neural nets; learning (artificial intelligence); pattern recognition; activation function; feedforward neural network; hidden neurons; linear function; linear output neurons; local minimum; network-growing method; output neurons; recognition rate; supervised training; Artificial neural networks; Backpropagation algorithms; Feedforward neural networks; Gradient methods; Multi-layer neural network; Network topology; Neural networks; Neurons; Pattern recognition; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1998 Fourth International Conference on Signal Processing (ICSP '98) Proceedings
Conference_Location :
Beijing
Print_ISBN :
0-7803-4325-5
Type :
conf
DOI :
10.1109/ICOSP.1998.770862
Filename :
770862