• DocumentCode
    285141
  • Title
    Universal property of learning curves under entropy loss
  • Author
    Amari, Shun-Ichi

  • Author_Institution
    Fac. of Eng., Tokyo Univ., Japan
  • Volume
    2
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    368
  • Abstract
    A learning curve shows how fast a learning machine improves its behaviour as the number of training examples increases. A study of the universal asymptotic behaviour of learning curves for general dichotomy machines is presented. It is proved rigorously that the average predictive entropy <e*(t)> converges to zero as <e*(t)> ~ d/t as the number t of training examples increases, where d is the number of modifiable parameters of the machine, irrespective of its architecture.
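    The d/t law stated in the abstract can be checked numerically on the simplest dichotomy machine: a one-parameter threshold classifier (d = 1) with a uniform prior and Bayes-optimal prediction. The sketch below is an illustrative Monte-Carlo estimate, not the paper's derivation; the machine, prior, and helper names are assumptions for the demonstration.

    ```python
    import math
    import random

    def avg_predictive_entropy(t, trials=4000, rng=random.Random(0)):
        """Monte-Carlo estimate of the average predictive entropy <e*(t)>
        for a 1-parameter threshold machine (d = 1): y = 1 iff x > theta,
        with true theta* = 0.5 and a uniform prior on theta over [0, 1].
        (Illustrative sketch, not the construction used in the paper.)"""
        total = 0.0
        for _ in range(trials):
            xs = [rng.random() for _ in range(t)]
            ys = [x > 0.5 for x in xs]
            # The posterior over theta is uniform on [L, U].
            L = max((x for x, y in zip(xs, ys) if not y), default=0.0)
            U = min((x for x, y in zip(xs, ys) if y), default=1.0)
            # Bayes predictive probability P(y = 1 | x_new, data).
            x_new = rng.random()
            y_new = x_new > 0.5
            p1 = min(max((x_new - L) / (U - L), 0.0), 1.0)
            p = p1 if y_new else 1.0 - p1
            # Entropy (log) loss of the prediction, in nats.
            total += -math.log(max(p, 1e-12))
        return total / trials
    ```

    With d = 1 the theorem predicts decay like 1/t, so quadrupling t should shrink the estimated entropy by roughly a factor of four, up to Monte-Carlo noise.
    
    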
  • Keywords
    learning (artificial intelligence); neural nets; average predictive entropy; entropy loss; general dichotomy machines; learning curves; modifiable parameters; training examples; universal asymptotic behaviour; universal property; Annealing; Bayesian methods; Entropy; Error correction; Machine learning; Neural networks; Probability distribution; Stochastic processes;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Neural Networks, 1992. IJCNN., International Joint Conference on
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.226960
  • Filename
    226960