  • DocumentCode
    65005
  • Title
    Non-Divergence of Stochastic Discrete Time Algorithms for PCA Neural Networks
  • Author
    Jian Cheng Lv; Zhang Yi; Yunxia Li
  • Author_Institution
    Machine Intell. Lab., Sichuan Univ., Chengdu, China
  • Volume
    26
  • Issue
    2
  • fYear
    2015
  • fDate
    Feb. 2015
  • Firstpage
    394
  • Lastpage
    399
  • Abstract
    Learning algorithms play an important role in the practical application of neural networks based on principal component analysis (PCA), and often determine the success or failure of these applications. Such algorithms must not diverge, yet their convergence properties are difficult to study directly because they are described by stochastic discrete time (SDT) equations. This brief analyzes the original SDT algorithms directly and derives invariant sets that guarantee the nondivergence of these algorithms in a stochastic environment, provided the learning parameters are chosen properly. The theoretical results are verified by a series of simulation examples.
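    As a minimal sketch of the class of algorithms the abstract refers to (Oja's single-neuron rule, a standard SDT algorithm for PCA, rather than the specific analysis in this brief), the Python snippet below shows how the learning rate acts as the "learning parameter" whose choice keeps the weight vector bounded; the data, learning rate, and initialization are illustrative assumptions.

    ```python
    import numpy as np

    # Illustrative sketch: Oja's single-neuron rule, a classic stochastic
    # discrete time (SDT) PCA learning algorithm of the kind the brief studies.
    # The learning rate eta and the initial weight are assumed values chosen
    # so the iterates stay in a bounded (invariant) region.

    rng = np.random.default_rng(0)

    # Synthetic zero-mean data whose principal component we want to extract.
    cov = np.array([[3.0, 1.0],
                    [1.0, 1.0]])
    samples = rng.multivariate_normal(np.zeros(2), cov, size=5000)

    w = rng.standard_normal(2)
    w /= np.linalg.norm(w)    # start with unit norm, inside a bounded set
    eta = 0.01                # small constant learning rate

    for x in samples:
        y = w @ x                        # neuron output
        w = w + eta * y * (x - y * w)    # Oja's SDT update

    # The learned direction should align with the top eigenvector of cov,
    # and ||w|| should remain near 1 rather than diverging.
    eigvals, eigvecs = np.linalg.eigh(cov)
    print("learned w:", w, " ||w|| =", np.linalg.norm(w))
    print("top eigenvector:", eigvecs[:, -1])
    ```

    With a much larger eta (e.g., 1.0), the same update can blow up on this data, which is the divergence behavior that proper selection of the learning parameters is meant to rule out.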
  • Keywords
    discrete time systems; learning (artificial intelligence); neural nets; stochastic processes; PCA neural networks; SDT algorithm; learning algorithm; stochastic discrete time algorithm nondivergence; Algorithm design and analysis; Approximation algorithms; Convergence; Heuristic algorithms; Neural networks; Signal processing algorithms; nondivergence; principal component analysis (PCA); stochastic discrete time (SDT) method
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Neural Networks and Learning Systems
  • Publisher
    IEEE
  • ISSN
    2162-237X
  • Type
    jour
  • DOI
    10.1109/TNNLS.2014.2312421
  • Filename
    6783730