  • DocumentCode
    396761
  • Title
    Orthogonal transformation of output principal components for improved tolerance to error
  • Author
    Mann, T.P. ; Eggen, C. ; Fox, W. ; Krout, D. ; Anderson, Goran ; El Sharkawi, M.A. ; Marks, R.J., II
  • Author_Institution
    Dept. of Electr. Eng., Univ. of Washington, Seattle, WA, USA
  • Volume
    2
  • fYear
    2003
  • fDate
    20-24 July 2003
  • Firstpage
    1290
  • Abstract
    Data to be learned by a neural network are typically preprocessed to improve network performance. Output processing is especially important, since it directly affects how error in the hidden layers influences the error of the network output. Principal component analysis is a commonly used preprocessing method that can improve network performance by reducing the output dimensionality and the number of parameters in the neural network model. Transforming the principal components of the outputs with an orthonormal matrix prior to scaling can further improve network performance.
  • Keywords
    multilayer perceptrons; principal component analysis; error tolerance; network performance; neural network; orthogonal transformation; orthonormal matrix; output processing; principal component analysis; Dynamic range; Error correction; Laboratories; Multi-layer neural network; Multilayer perceptrons; Neural networks; Physics; Principal component analysis; Redundancy; Vectors;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Title
    Proceedings of the International Joint Conference on Neural Networks, 2003
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7898-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2003.1223881
  • Filename
    1223881