• DocumentCode
    1748827
  • Title
    Meta-learning with backpropagation
  • Author
    Younger, A. Steven ; Hochreiter, Sepp ; Conwell, Peter R.
  • Author_Institution
    Dept. of Comput. Sci., Colorado Univ., Boulder, CO, USA
  • Volume
    3
  • fYear
    2001
  • fDate
    2001
  • Firstpage
    2001
  • Abstract
    This paper introduces gradient descent methods applied to meta-learning (learning how to learn) in neural networks. Meta-learning has been of interest in the machine learning field for decades because of its appealing applications to intelligent agents, non-stationary time series, autonomous robots, and improved learning algorithms. Many previous neural-network-based approaches to meta-learning have relied on evolutionary methods. We show how to use gradient descent for meta-learning in recurrent neural networks. Based on previous work on fixed-weight learning neural networks, we hypothesize that any recurrent network topology, together with its corresponding learning algorithm(s), is a potential meta-learning system. We tested several recurrent neural network topologies and their corresponding forms of backpropagation for their ability to meta-learn. One of our systems, based on the long short-term memory neural network, developed a learning algorithm that could learn any two-dimensional quadratic function (from a set of such functions) after only 30 training examples.
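  • Note
    A minimal sketch (not from the paper) of how the meta-learning episodes described in the abstract could be set up: a recurrent network with fixed weights receives, at each step, the current input together with the previous step's target, and must predict the current target of a freshly sampled two-dimensional quadratic; backpropagation across many such episodes trains the network to "learn how to learn". The quadratic family, episode construction, and all names below are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)

        def sample_quadratic():
            """Sample a random 2-D quadratic f(x) = x^T A x + b^T x + c.

            The exact family is an assumption; the paper only states that the
            targets are two-dimensional quadratic functions drawn from a set.
            """
            A = rng.normal(size=(2, 2))
            A = 0.5 * (A + A.T)              # symmetric quadratic term
            b = rng.normal(size=2)
            c = rng.normal()
            return lambda x: float(x @ A @ x + b @ x + c)

        def make_episode(n_steps=30):
            """Build one meta-learning episode of n_steps examples.

            At step t the recurrent net sees (x_t, y_{t-1}) and must output y_t;
            feeding the previous target back in is what lets a fixed-weight
            recurrent network implement a learning algorithm in its activations.
            """
            f = sample_quadratic()
            xs = rng.uniform(-1.0, 1.0, size=(n_steps, 2))
            ys = np.array([f(x) for x in xs])
            prev_y = np.concatenate([[0.0], ys[:-1]])
            inputs = np.column_stack([xs, prev_y])   # shape (n_steps, 3)
            return inputs, ys

        # Meta-training would run backpropagation (through time) over many such
        # episodes, each from a freshly sampled quadratic, so the recurrent net
        # (an LSTM in the paper's best system) learns the new function within
        # the 30-example episode rather than memorizing any single function.
        if __name__ == "__main__":
            inputs, targets = make_episode()
            print(inputs.shape, targets.shape)       # (30, 3) (30,)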
  • Keywords
    backpropagation; recurrent neural nets; fixed-weight learning neural networks; gradient descent methods; long short-term memory neural network; meta-learning; recurrent neural networks; two-dimensional quadratic function; Backpropagation algorithms; Cities and towns; Computer science; Educational institutions; Network topology; Neural networks; Physics; Recurrent neural networks; Robots; USA Councils
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
  • Conference_Location
    Washington, DC
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-7044-9
  • Type
    conf
  • DOI
    10.1109/IJCNN.2001.938471
  • Filename
    938471