• DocumentCode
    1757781
  • Title
    Learning Topology and Dynamics of Large Recurrent Neural Networks
  • Author
    Yiyuan She; Yuejia He; Dapeng Wu
  • Author_Institution
    Dept. of Stat., Florida State Univ., Tallahassee, FL, USA
  • Volume
    62
  • Issue
    22
  • fYear
    2014
  • fDate
    Nov. 15, 2014
  • Firstpage
    5881
  • Lastpage
    5891
  • Abstract
    Large-scale recurrent networks have drawn increasing attention recently because of their capabilities in modeling a large variety of real-world phenomena and physical mechanisms. This paper studies how to identify all authentic connections and estimate the system parameters of a recurrent network, given a sequence of node observations. This task becomes extremely challenging in modern network applications, because the available observations are usually very noisy and limited, and the associated dynamical system is strongly nonlinear. By formulating the problem as multivariate sparse sigmoidal regression, we develop simple-to-implement network learning algorithms, with rigorous theoretical convergence guarantees, for a variety of sparsity-promoting penalty forms. A quantile variant of progressive recurrent network screening is proposed for efficient computation; it allows for direct cardinality control of the network topology in estimation. Moreover, we investigate recurrent network stability conditions in Lyapunov's sense, and integrate such stability constraints into sparse network learning. Experiments show excellent performance of the proposed algorithms in network topology identification and forecasting.
  • Keywords
    Lyapunov methods; learning systems; neurocontrollers; nonlinear dynamical systems; parameter estimation; recurrent neural nets; regression analysis; stability; Lyapunov sense; associated nonlinear dynamical system; convergence guarantee; direct cardinality control; large recurrent neural networks; learning dynamics; learning topology; multivariate sparse sigmoidal regression; network topology identification; node observation sequence; physical mechanisms; progressive recurrent network screening; quantile variant; simple-to-implement network learning algorithms; sparse network learning; sparsity-promoting penalty forms; stability constraints; system parameter estimation; Estimation; Mathematical model; Network topology; Signal processing algorithms; Stability analysis; Terrorism; Topology; Dynamical systems; Lyapunov stability; recurrent networks; shrinkage estimation; topology learning; variable selection
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Signal Processing
  • Publisher
    ieee
  • ISSN
    1053-587X
  • Type
    jour
  • DOI
    10.1109/TSP.2014.2358956
  • Filename
    6914572