• DocumentCode
    423536
  • Title
    Context discerning multifunction networks: reformulating fixed weight neural networks
  • Author
    Santiago, Roberto A.

  • Author_Institution
    NW Comput. Intelligence Lab., Portland State Univ., OR, USA
  • Volume
    1
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Lastpage
    194
  • Abstract
    Research in recurrent neural networks has produced a genre of networks referred to as fixed weight neural networks (FWNNs), which have the ability to adapt without changing explicit weights. FWNNs are unique in that they adapt their processing to the spatiotemporal characteristics of the incoming signal, with no need for weight change. As a result, a single FWNN is able to model and control many families of disparate systems without weight changes. FWNNs thus pose an interesting model for contextual memory in neural systems. The work reported here takes an FWNN, decomposes it, and analyzes its internal workings. Using this new insight, FWNNs are reformulated into a simpler structure, the context discerning multifunction network (CDMN).
  • Keywords
    learning (artificial intelligence); recurrent neural nets; context discerning multifunction network; contextual memory; fixed weight neural network reformulation; neural systems; recurrent neural networks; spatiotemporal characteristics; Computational intelligence; Context modeling; Electronic mail; Feedforward neural networks; Machine learning; Neural networks; Recurrent neural networks; Signal processing; Spatiotemporal phenomena;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2004.1379896
  • Filename
    1379896