DocumentCode :
3598743
Title :
An observation concerning a classification problem and back-propagation for the feedforward neural network
Author :
Sakk, Eric ; Belina, John ; Thomas, Robert J.
Author_Institution :
Sch. of Electr. Eng., Cornell Univ., Ithaca, NY, USA
Volume :
3
fYear :
1992
Firstpage :
948
Abstract :
A simple classification problem using a single-layer feedforward neural network in conjunction with the backpropagation training algorithm (BPTA) is examined. It has been observed that, for such a problem, the values of the input weights are closely related to the input training set. An implication of this observation is that, rather than choosing initially random weights for the BPTA, one may choose initial weights that are actually quite close to a global minimum of the BP error function. An advantage of such a choice would be faster convergence times based on knowledge of the incoming training data.
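The following is a minimal sketch of the idea described in the abstract: for a single-layer sigmoid network trained by batch gradient descent on squared error, weights seeded from the training patterns (here, the class prototypes) can start closer to a good minimum than small random weights. The toy data, hyperparameters, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train(X, T, W0, lr=0.5, epochs=200):
    """Batch gradient descent on sum-of-squares error; returns final weights and error history."""
    W = W0.copy()
    errors = []
    for _ in range(epochs):
        Y = sigmoid(X @ W)                    # forward pass of the single-layer network
        E = Y - T                             # output error
        grad = X.T @ (E * Y * (1.0 - Y))      # delta rule for sigmoid output units
        W -= lr * grad
        errors.append(0.5 * np.sum(E ** 2))
    return W, errors

# Hypothetical two-class problem: each class clusters around a prototype pattern.
protos = np.array([[1.0, 0.0, 1.0, 0.0],
                   [0.0, 1.0, 0.0, 1.0]])
X = np.vstack([p + 0.1 * rng.standard_normal((20, 4)) for p in protos])
T = np.vstack([np.tile(t, (20, 1)) for t in np.eye(2)])

# Random initialization vs. weights seeded from the training prototypes,
# reflecting the observation that trained input weights resemble the training set.
W_rand = 0.1 * rng.standard_normal((4, 2))
W_data = protos.T.copy()    # each output unit's weight vector ~ its class prototype

_, err_rand = train(X, T, W_rand)
_, err_data = train(X, T, W_data)
print("error after 50 epochs  random init: %.4f  data-based init: %.4f"
      % (err_rand[49], err_data[49]))
```

In this sketch the data-seeded run typically reaches a lower error in the same number of epochs, which is the kind of convergence advantage the abstract argues for; it is a demonstration of the initialization idea only, not a reproduction of the paper's experiment.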
Keywords :
backpropagation; feedforward neural nets; pattern recognition; backpropagation; classification problem; error function; feedforward neural network; initially random weights; input weights; training algorithm; Acceleration; Algorithm design and analysis; Backpropagation algorithms; Convergence; Electrocardiography; Erbium; Feedforward neural networks; Neural networks; Neurons; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.227077
Filename :
227077