• DocumentCode
    288359
  • Title
    A note on backpropagation, projection learning, and feedback in neural systems
  • Author
    Weigl, Konrad
  • Author_Institution
    INRIA, Sophia Antipolis, France
  • Volume
    1
  • fYear
    1994
  • fDate
    27 Jun-2 Jul 1994
  • Firstpage
    391
  • Abstract
    We show instances where parts of algorithms similar to backpropagation and to projection learning have been implemented via feedback in neural systems. The corresponding algorithms, with the same or a similar mathematical expression, minimize an error not in the output space of the network but in its input space, via a comparison between the function to be approximated and the current approximation computed by the network, which is fed back to the input space. We argue that numerous interlayer and intracortical feedback connections, e.g. in the primary visual system of mammals, could serve exactly this purpose. We introduce the paradigm with linear operators for illustration purposes, show the extension to nonlinear operators in function space, introduce projection learning, and discuss future work.
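    The input-space error minimization described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the Gaussian basis (a stand-in for e.g. Gabor filters), the sampling grid, and the learning rate are all assumptions. A target function is approximated as a linear combination of fixed basis functions; the current approximation is fed back and compared with the target in the input (function) space, and the coefficients are updated by gradient descent on that input-space residual.

    ```python
    import numpy as np

    # Illustrative sketch only (basis, grid, and learning rate are assumptions,
    # not taken from the paper): approximate a target function f by a linear
    # combination G @ c of fixed basis functions. The current approximation is
    # fed back and subtracted from f in the *input* (function) space, and the
    # coefficients follow gradient descent on the input-space error ||f - G c||^2.

    x = np.linspace(0.0, 1.0, 200)              # sampling grid of the input space
    f = np.sin(2.0 * np.pi * x)                 # target function to approximate

    centers = np.linspace(0.0, 1.0, 10)         # fixed basis: Gaussian bumps
    sigma = 0.05                                # (stand-ins for e.g. Gabor filters)
    G = np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * sigma ** 2))

    c = np.zeros(centers.size)                  # coefficients to learn
    lr = 2.0
    for _ in range(1000):
        f_hat = G @ c                           # current approximation, fed back
        e = f - f_hat                           # error formed in the input space
        c += lr * (G.T @ e) / x.size            # gradient step on the residual

    mse = float(np.mean((f - G @ c) ** 2))      # small: f is well approximated
    ```

    Note that nothing here compares network outputs to output-space targets; the residual `e` lives entirely in the space where `f` is defined, which is the point of the feedback construction.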
  • Keywords
    backpropagation; function approximation; learning (artificial intelligence); neural nets; feedback; function space; input space; intracortical feedback connections; linear operators; mammals; mathematical expression; neural systems; nonlinear operators; output space; projection learning; visual primary system; Approximation algorithms; Backpropagation algorithms; Equations; Gabor filters; Neurofeedback; Neurons; Output feedback; Pixel; Vectors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
  • Conference_Location
    Orlando, FL
  • Print_ISBN
    0-7803-1901-X
  • Type
    conf
  • DOI
    10.1109/ICNN.1994.374194
  • Filename
    374194