Title :
Orthogonal transformation of output principal components for improved tolerance to error
Author :
Mann, T.P. ; Eggen, C. ; Fox, W. ; Krout, D. ; Anderson, Goran ; El Sharkawi, M.A. ; Marks, R.J., II
Author_Institution :
Dept. of Electr. Eng., Univ. of Washington, Seattle, WA, USA
Abstract :
Data to be learned by a neural network are typically preprocessed to improve network performance. Output processing is especially important, since it directly affects how error in the hidden layers influences the error at the network output. Principal component analysis is a commonly used preprocessing method that can improve performance by reducing the output dimensionality and thereby the number of parameters in the neural network model. Transforming the principal components of the outputs with an orthonormal matrix prior to scaling can further improve network performance.
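A minimal sketch of the output pipeline the abstract describes, assuming NumPy: project the output targets onto their leading principal components, apply an orthonormal transform, then scale to the network's output range. The random orthogonal matrix, the component count k, and the [-1, 1] scaling are illustrative assumptions, not the authors' specific choices.

```python
import numpy as np

# Toy target matrix: n_samples x n_outputs (values are illustrative only)
rng = np.random.default_rng(0)
Y = rng.normal(size=(500, 10))

# --- Principal component analysis of the outputs ---
Y_mean = Y.mean(axis=0)
Yc = Y - Y_mean
# Rows of Vt are principal directions; keep the k leading components
U, S, Vt = np.linalg.svd(Yc, full_matrices=False)
k = 4
P = Yc @ Vt[:k].T            # n_samples x k principal-component scores

# --- Orthonormal transform of the principal components ---
# Any k x k orthonormal matrix works; here a random one from a QR factorization
# (an assumption for illustration -- the paper's choice of matrix may differ).
Q, _ = np.linalg.qr(rng.normal(size=(k, k)))
T = P @ Q                    # transformed components (orthonormal change of basis)

# --- Scale to the network's output range (e.g., [-1, 1] for tanh outputs) ---
lo, hi = T.min(axis=0), T.max(axis=0)
targets = 2.0 * (T - lo) / (hi - lo) - 1.0   # these become the training targets

def decode(t):
    """Map network outputs back to the original output space by inverting each step."""
    T_hat = (t + 1.0) / 2.0 * (hi - lo) + lo
    return T_hat @ Q.T @ Vt[:k] + Y_mean
```

Because the transform is orthonormal, it is trivially invertible and preserves the component energies; the decoding step simply applies the transposes in reverse order.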
Keywords :
multilayer perceptrons; principal component analysis; error tolerance; network performance; neural network; orthogonal transformation; orthonormal matrix; output processing; Dynamic range; Error correction; Laboratories; Multi-layer neural network; Neural networks; Physics; Redundancy; Vectors;
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks, 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1223881