  • DocumentCode
    799167
  • Title
    Deterministic convergence of an online gradient method for BP neural networks
  • Author
    Wu, Wei; Feng, Guorui; Li, Zhengxue; Xu, Yuesheng
  • Author_Institution
    Appl. Math. Dept., Dalian Univ. of Technol., China
  • Volume
    16
  • Issue
    3
  • fYear
    2005
  • fDate
    5/1/2005
  • Firstpage
    533
  • Lastpage
    540
  • Abstract
    Online gradient methods are widely used for training feedforward neural networks. In this paper, we prove a convergence theorem for an online gradient method with variable step size for backward propagation (BP) neural networks with a hidden layer. Unlike most existing convergence results, which are probabilistic and nonmonotone in nature, the convergence result established here is deterministic and monotone.
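    The abstract concerns an online (sample-by-sample) gradient method with a variable step size for a network with one hidden layer. The Python sketch below illustrates that general scheme only; it is not the authors' formulation, and the function name train_online, the decreasing step-size schedule eta0/(1 + epoch), the sigmoid hidden layer, and the network sizes are illustrative assumptions.

    # Minimal sketch of an online gradient method for a one-hidden-layer network.
    # Weights are updated after each training sample, with a step size that
    # varies (here, decreases) across training cycles.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def train_online(X, Y, n_hidden=5, n_epochs=100, eta0=0.5):
        """X: (n_samples, n_in) inputs; Y: (n_samples,) scalar targets."""
        rng = np.random.default_rng(0)
        n_in = X.shape[1]
        W = rng.normal(scale=0.1, size=(n_hidden, n_in))  # input-to-hidden weights
        v = rng.normal(scale=0.1, size=n_hidden)          # hidden-to-output weights
        for epoch in range(n_epochs):
            eta = eta0 / (1.0 + epoch)  # variable step size; one possible schedule (assumption)
            for x, y in zip(X, Y):
                h = sigmoid(W @ x)      # hidden-layer activations
                out = v @ h             # linear output unit
                err = out - y           # per-sample error
                # gradients of the per-sample squared error 0.5 * err**2
                grad_v = err * h
                grad_W = np.outer(err * v * h * (1.0 - h), x)
                v -= eta * grad_v       # online update: weights change after every sample
                W -= eta * grad_W
        return W, v

    # tiny usage example on synthetic data (hypothetical, for illustration only)
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.uniform(-1, 1, size=(50, 2))
        Y = np.sin(X[:, 0]) + 0.5 * X[:, 1]
        W, v = train_online(X, Y)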
  • Keywords
    backpropagation; convergence; deterministic algorithms; feedforward neural nets; gradient methods; backward propagation neural network; deterministic convergence; feedforward neural network; nonmonotone nature; online gradient method; probabilistic nature; Computer networks; Feedforward neural networks; Gradient methods; Learning systems; Mathematics; Neural networks; Online gradient methods; backward propagation (BP) neural networks; Algorithms; Computer Simulation; Computer Systems; Neural Networks (Computer); Numerical Analysis, Computer-Assisted; Online Systems; Signal Processing, Computer-Assisted
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks
  • Publisher
    IEEE
  • ISSN
    1045-9227
  • Type
    jour
  • DOI
    10.1109/TNN.2005.844903
  • Filename
    1427759