• DocumentCode
    309302
  • Title
    A new method in neural network supervised training with imprecision
  • Author
    Magoulas, G.D.; Vrahatis, M.N.; Androulakis, G.S.
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Patras Univ., Greece
  • Volume
    1
  • fYear
    1996
  • fDate
    13-16 Oct 1996
  • Firstpage
    287
  • Abstract
    We propose a method that proceeds using only minimal information about the error function and its gradient, namely their algebraic signs, and takes minimization steps along each weight direction. This approach appears practically useful, especially when training is affected by technology imperfections and environmental changes that cause unpredictable deviations of parameter values from the designed configuration; in such cases it may be difficult or impossible to obtain very precise values for the error function and its gradient during training.
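    For illustration only, a minimal Python sketch of a sign-only weight update in the spirit of the abstract. It is not the authors' algorithm: the function name sign_only_step, the fixed per-weight step sizes, and the toy quadratic error are assumptions introduced here.

    import numpy as np

    def sign_only_step(weights, grad_sign, step_sizes):
        # Move each weight by its own step size, using only the algebraic
        # sign of the corresponding gradient component (hypothetical rule).
        return weights - step_sizes * grad_sign

    # Toy example: one update on the quadratic error E(w) = sum(w**2).
    w = np.array([0.8, -1.5, 0.3])
    grad = 2.0 * w                      # exact gradient of the toy error
    steps = np.full_like(w, 0.1)        # fixed per-weight steps (assumption)
    w_new = sign_only_step(w, np.sign(grad), steps)
    print(w_new)                        # each weight moves 0.1 toward zero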
  • Keywords
    learning (artificial intelligence); neural nets; algebraic sign; error function; error gradient; imprecision; minimization; neural network; supervised training; Computer errors; Ear; Feedforward neural networks; Feeds; Intelligent networks; Mathematics; Minimization methods; Neural networks; Neurons; Numerical simulation;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the Third IEEE International Conference on Electronics, Circuits, and Systems (ICECS '96), 1996
  • Conference_Location
    Rodos
  • Print_ISBN
    0-7803-3650-X
  • Type
    conf
  • DOI
    10.1109/ICECS.1996.582805
  • Filename
    582805