• DocumentCode
    303222
  • Title
    Cascade error projection: a new learning algorithm
  • Author
    Duong, Tuan A.; Stubberud, Allen R.; Daud, Taher; Thakoor, A.
  • Author_Institution
    Center for Space Microelectron. Technol., California Inst. of Technol., Pasadena, CA, USA
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Jun 1996
  • Firstpage
    229
  • Abstract
    Artificial neural networks, with their massive parallelism, have been shown to efficiently solve ill-defined problems in pattern recognition, classification, and optimization. It is further recognized that for real-time, low-power operation, their highly parallel connectivity must be harnessed in VLSI silicon. However, such an implementation, especially an analog embodiment, suffers from two drawbacks: limited precision compounded by circuit noise, and the lack of a suitable hardware-implementable learning algorithm. This paper addresses these issues and offers an architecture along with a new hardware-implementable learning algorithm called cascade error projection (CEP). CEP's methodology is tolerant of limited hardware precision, particularly limited synapse resolution. A concise mathematical analysis is presented. The framework can also be used to obtain the cascade correlation (CC) learning algorithm as a special case of CEP by choosing a particular set of parameters. Among its attractive features, one important attribute of CEP is that it operates on one layer of input weights at a time while the other connected weights are either frozen or deterministically calculable, hence providing fast learning. Finally, with a limited synaptic weight resolution of only 3 to 4 bits, the technique reliably learns 5- to 8-bit parity problems by incorporating additional hidden units.
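    The sketch below is a minimal, illustrative rendering (not the paper's exact procedure) of the constructive training loop the abstract describes: a cascade network grown one hidden unit at a time, where only the new unit's input weights are trained, previously learned weights stay frozen, and the output weights are obtained deterministically. The least-squares output step, the residual-driven hidden-unit update, and all function names are assumptions for illustration only.

    # Minimal sketch in Python/NumPy. Assumptions: a single binary target,
    # sigmoid hidden units, least-squares output weights, and a residual-driven
    # gradient step for the new unit's input weights. The paper's actual CEP
    # projection rule is not reproduced here.
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def cascade_train(X, t, n_hidden=8, epochs=200, lr=0.5, seed=0):
        """X: (n_samples, n_inputs); t: (n_samples,) targets in [0, 1]."""
        rng = np.random.default_rng(seed)
        feats = np.hstack([X, np.ones((len(X), 1))])      # inputs plus a bias column
        for _ in range(n_hidden):
            # Output weights are computed deterministically from the frozen features.
            w_out = np.linalg.lstsq(feats, t, rcond=None)[0]
            residual = t - feats @ w_out                  # error still to be explained
            # Train only the new hidden unit's input weights; everything else is frozen.
            w_in = rng.normal(scale=0.1, size=feats.shape[1])
            for _ in range(epochs):
                h = sigmoid(feats @ w_in)
                grad = feats.T @ ((h - residual) * h * (1.0 - h)) / len(X)
                w_in -= lr * grad
            # Freeze the unit and cascade its output in as an additional feature.
            feats = np.hstack([feats, sigmoid(feats @ w_in)[:, None]])
        w_out = np.linalg.lstsq(feats, t, rcond=None)[0]  # final deterministic output layer
        return feats @ w_out

    As a usage sketch, one could train on a 5-bit parity problem by letting X enumerate all 32 bit patterns (shape (32, 5)), setting t to the parity of each row, and thresholding the returned outputs at 0.5.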
  • Keywords
    VLSI; analogue integrated circuits; analogue processing circuits; cascade systems; correlation methods; learning (artificial intelligence); neural chips; optimisation; pattern recognition; VLSI silicon; artificial neural networks; cascade correlation learning algorithm; cascade error projection; circuit noise; classification; highly parallel connectivity; ill-defined problems; learning algorithm; massive parallelism; optimization; parity problems; pattern recognition; real-time low-power operation; synapse resolution; Convergence; Mathematical analysis; Neural network hardware; Neural networks; Neurons; Pattern recognition; Silicon; Space technology; Transfer functions; Very large scale integration;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Conference_Location
    Washington, DC
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.548896
  • Filename
    548896