DocumentCode :
3497595
Title :
Proving the efficacy of complementary inputs for multilayer neural networks
Author :
Andersen, Timothy L.
Author_Institution :
Comput. Sci. Dept., Boise State Univ., Boise, ID, USA
fYear :
2011
fDate :
July 31, 2011 - Aug. 5, 2011
Firstpage :
2062
Lastpage :
2066
Abstract :
This paper proposes and discusses a backpropagation-based training approach for multilayer networks that counteracts the tendency of typical backpropagation-based training algorithms to “favor” examples with large input feature values. This problem can occur in any real-valued input space, and can create a surprising degree of skew in the learned decision surface even with relatively simple training sets. The proposed method modifies the original input feature vectors in the training set by appending complementary inputs, which essentially doubles the number of inputs to the network. This paper proves that this modification does not increase the network complexity, by showing that it is possible to map the network with complementary inputs back into the original feature space.
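The augmentation step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes features are scaled to [0, 1] and that the complement of a feature x is taken to be 1 - x (one common convention; the paper's exact definition of a complementary input may differ).

```python
import numpy as np

def append_complementary_inputs(X):
    """Augment each feature vector with its complement.

    Assumes features are scaled to [0, 1]; the complement of a
    feature x is taken here to be 1 - x (an assumption -- the
    paper may define complements differently). This doubles the
    input dimensionality, as described in the abstract.
    """
    X = np.asarray(X, dtype=float)
    return np.hstack([X, 1.0 - X])

# Example: a 2-feature training set becomes a 4-feature one.
X = np.array([[0.9, 0.1],
              [0.2, 0.8]])
X_aug = append_complementary_inputs(X)
# X_aug has shape (2, 4): the original features followed by their complements.
```

The augmented matrix would then be fed to the multilayer network in place of the original inputs, with the network's input layer widened accordingly.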
Keywords :
backpropagation; multilayer perceptrons; backpropagation-based training approach; complementary inputs; decision surface; input feature vectors; multilayer neural networks; network complexity; training sets; Accuracy; Backpropagation; Complexity theory; Encoding; Equations; Surface treatment; Training;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
ISSN :
2161-4393
Print_ISBN :
978-1-4244-9635-8
Type :
conf
DOI :
10.1109/IJCNN.2011.6033480
Filename :
6033480