• DocumentCode
    3147013
  • Title

    Back-propagation as the solution of differential-algebraic equations for artificial neural network training

  • Author

    Sanchez-Gasca, J.J.; Klapper, D.B.; Yoshizawa, J.

  • Author_Institution
    GE Industrial & Power Systems, Power Syst. Eng. Dept., Schenectady, NY, USA
  • fYear
    1991
  • fDate
    23-26 Jul 1991
  • Firstpage
    242
  • Lastpage
    244
  • Abstract
    The backpropagation algorithm for neural network training is formulated as the solution of a set of sparse differential-algebraic equations (DAEs), which are then solved as a function of time using an implicit integrator with an adjustable time step. The topology of the Jacobian matrix associated with the DAEs is illustrated, and a training example is included.
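    To picture the idea behind the abstract, here is a minimal sketch (not the paper's actual DAE formulation, and all data and parameter values are hypothetical): training is treated as continuous-time gradient flow, dw/dt = -dE/dw, and integrated with an implicit (backward Euler) method whose time step adapts — shrinking when the implicit solve fails to converge and growing when it succeeds.

    ```python
    import math

    def sigmoid(z):
        # Overflow-safe logistic function.
        if z >= 0:
            return 1.0 / (1.0 + math.exp(-z))
        ez = math.exp(z)
        return ez / (1.0 + ez)

    # Tiny hypothetical training set: samples of y = sigmoid(3*x).
    DATA = [(-1.0, sigmoid(-3.0)), (0.5, sigmoid(1.5)), (2.0, sigmoid(6.0))]

    def loss(w):
        # Squared-error cost E(w) for a single-weight "network".
        return 0.5 * sum((sigmoid(w * x) - y) ** 2 for x, y in DATA)

    def grad(w):
        # dE/dw via the chain rule (scalar backpropagation).
        g = 0.0
        for x, y in DATA:
            s = sigmoid(w * x)
            g += (s - y) * s * (1.0 - s) * x
        return g

    def train_implicit(w0=0.0, h=0.5, t_end=50.0):
        """Integrate dw/dt = -grad E(w) by backward Euler with an adaptive step."""
        w, t = w0, 0.0
        while t < t_end:
            # Implicit step: solve w_next = w - h*grad(w_next) by fixed-point iteration.
            w_next, converged = w, False
            for _ in range(50):
                w_new = w - h * grad(w_next)
                if abs(w_new - w_next) < 1e-10:
                    w_next, converged = w_new, True
                    break
                w_next = w_new
            if not converged:
                h *= 0.5            # implicit solve failed: shrink the time step
                continue
            w, t = w_next, t + h
            h = min(h * 1.3, 5.0)   # solve succeeded: cautiously grow the step
        return w

    w_final = train_implicit()
    ```

    The paper's formulation is richer (a sparse DAE system with an associated Jacobian, solved by a production implicit integrator), but the control flow — an implicit step accepted or rejected, with the step size adjusted accordingly — follows the same pattern as this toy loop.
    
    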
  • Keywords
    backpropagation; differential equations; matrix algebra; neural nets; AI; Jacobian matrix; algorithm; artificial neural network training; integrator; learning; sparse differential algebraic equations; topology; Artificial neural networks; Differential algebraic equations; Differential equations; Industrial power systems; Industrial training; Jacobian matrices; Minimization methods; Network topology; Neural networks; Nonlinear equations;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the First International Forum on Applications of Neural Networks to Power Systems, 1991
  • Conference_Location
    Seattle, WA
  • Print_ISBN
    0-7803-0065-3
  • Type

    conf

  • DOI
    10.1109/ANN.1991.213470
  • Filename
    213470