• DocumentCode
    2030489
  • Title
    Projection learning of the minimum variance type
  • Author
    Hirabayashi, A.; Ogawa, Hidemitsu
  • Author_Institution
    Dept. of Comput. Sci., Tokyo Inst. of Technol., Japan
  • Volume
    3
  • fYear
    1999
  • fDate
    1999
  • Firstpage
    1172
  • Abstract
    Proposes a new learning method for supervised learning, named minimum variance projection learning (MVPL). Due to noise in the training examples, the resultant functions are in general not uniquely determined and are distributed around the function obtained from noiseless training examples. The smaller the variance of this distribution, the more stable the results that can be obtained. MVPL is a learning method that, within the family of projection learnings, minimizes the variance of this distribution. We clarify the properties of MVPL and illustrate its effectiveness by computer simulation.
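    The following is a minimal toy sketch of the idea stated in the abstract, not the paper's MVPL algorithm: repeated noisy training sets yield a distribution of learned functions, and the estimator whose distribution has the smaller variance gives the more stable result. All names and parameters below (the polynomial degrees, ridge term, and noise level) are illustrative assumptions.

    # Toy simulation (illustrative only): noise in the training examples makes the
    # learned function vary from trial to trial; we measure that variance for two
    # hypothetical estimators and see which one is more stable.
    import numpy as np

    rng = np.random.default_rng(0)

    def true_function(x):
        return np.sin(2 * np.pi * x)

    # Fixed sample points and a polynomial basis defining the hypothesis space.
    x_train = np.linspace(0.0, 1.0, 10)
    x_test = np.linspace(0.0, 1.0, 200)

    def design(x, degree):
        return np.vander(x, degree + 1, increasing=True)

    def fit_and_predict(y_noisy, degree, ridge):
        # Least-squares fit onto a polynomial subspace, optionally regularized.
        # Both parameter choices below stand in for members of a family of
        # learning operators; they are not the paper's projection learnings.
        A = design(x_train, degree)
        coef = np.linalg.solve(A.T @ A + ridge * np.eye(A.shape[1]), A.T @ y_noisy)
        return design(x_test, degree) @ coef

    n_trials, noise_std = 500, 0.2
    y_clean = true_function(x_train)

    preds_a, preds_b = [], []
    for _ in range(n_trials):
        y_noisy = y_clean + noise_std * rng.standard_normal(x_train.shape)
        preds_a.append(fit_and_predict(y_noisy, degree=9, ridge=0.0))   # high-variance choice
        preds_b.append(fit_and_predict(y_noisy, degree=3, ridge=1e-3))  # lower-variance choice

    # Average pointwise variance of the learned functions over noise realizations:
    # the estimator with the smaller value is the more stable (reproducible) one.
    for name, preds in [("estimator A", preds_a), ("estimator B", preds_b)]:
        var = np.var(np.stack(preds), axis=0).mean()
        print(f"{name}: mean pointwise variance = {var:.4f}")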
  • Keywords
    learning (artificial intelligence); minimisation; noise; stability; virtual machines; MVPL; computer simulation; distribution variance minimization; minimum variance projection learning; noise; nonuniquely determined functions; projection learning; stable results; supervised learning; training examples; Additive noise; Computer science; Computer simulation; Function approximation; Hilbert space; Inverse problems; Kernel; Learning systems; Machine learning; Supervised learning;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Neural Information Processing, 1999. Proceedings. ICONIP '99. 6th International Conference on
  • Conference_Location
    Perth, WA
  • Print_ISBN
    0-7803-5871-6
  • Type
    conf
  • DOI
    10.1109/ICONIP.1999.844702
  • Filename
    844702