• DocumentCode
    349205
  • Title
    Undersampling for the training of feedback neural networks on large sequences; application to the modeling of an induction machine
  • Author
    Constant, L.; Dagues, B.; Rivals, I.; Personnaz, L.

  • Author_Institution
    Lab. d'Electrotech. et d'Electron. Ind., CNRS, Toulouse, France
  • Volume
    2
  • fYear
    1999
  • fDate
    5-8 Sep 1999
  • Firstpage
    1025
  • Abstract
    This paper proposes an economical method for the nonlinear modeling of dynamic processes with feedback neural networks, based on undersampling the training sequences. The undersampling (i) allows a better exploration of the operating range of the process for a given size of the training sequences, and (ii) speeds up the training of the feedback networks. The method is successfully applied to the training of a neural model of the electromagnetic part of an induction machine, whose sampling period must be small enough (i.e., smaller than 1 μs) to take fast variations of the input voltage into account.
  • Keywords
    asynchronous machines; electric machine analysis computing; learning (artificial intelligence); machine theory; recurrent neural nets; dynamic processes; feedback neural networks; induction machine; input voltage variations; neural model; nonlinear modeling; operating range; sampling period; training; undersampling; Electromagnetic modeling; Electronic mail; Induction machines; Industrial economics; Industrial training; Neural networks; Neurofeedback; Sampling methods; Stators; Voltage
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of ICECS '99, the 6th IEEE International Conference on Electronics, Circuits and Systems, 1999
  • Conference_Location
    Pafos
  • Print_ISBN
    0-7803-5682-9
  • Type
    conf
  • DOI
    10.1109/ICECS.1999.813408
  • Filename
    813408