• DocumentCode
    1941663
  • Title
    Quantifying the Effect of Learning on Recurrent Spiking Neurons
  • Author
    Brodu, Nicolas
  • Author_Institution
    Dept. of Comput. Sci. & Software Eng., Concordia Univ., Montreal, QC
  • fYear
    2007
  • fDate
    12-17 Aug. 2007
  • Firstpage
    512
  • Lastpage
    517
  • Abstract
    This work is concerned with measuring the response of recurrent spiking neurons when a learning rule is applied to them, in a liquid state machine context. Two indicators are considered for monitoring the effect of learning online: the separation property, which has already been studied in previous works, and an incremental version of the statistical complexity measure that is introduced expressly for this purpose. It is found that while separation increases, a neuron's average statistical complexity decreases when a learning rule is applied. This means that neurons become more predictable and their behavior is simplified as an effect of learning. A key feature of this work is to provide a quantification of this phenomenon.
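    For orientation, the separation property in the liquid state machine literature (Maass et al.) is commonly quantified as the distance between the liquid states produced by two different input streams; the exact formulation used in this paper may differ:

        Sep(t) = || x_u(t) - x_v(t) ||_2

    where x_u(t) and x_v(t) denote the vectors of neuron states at time t under input streams u and v, so larger values indicate that the recurrent network maps distinct inputs to more distinguishable states.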
  • Keywords
    learning (artificial intelligence); recurrent neural nets; statistical analysis; incremental version; liquid state machine context; machine learning; recurrent spiking neuron; statistical complexity measure; Neural networks; Neurons;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 2007. IJCNN 2007. International Joint Conference on
  • Conference_Location
    Orlando, FL
  • ISSN
    1098-7576
  • Print_ISBN
    978-1-4244-1379-9
  • Electronic_ISBN
    1098-7576
  • Type
    conf
  • DOI
    10.1109/IJCNN.2007.4371009
  • Filename
    4371009