• DocumentCode
    2018925
  • Title
    Differential learning leads to efficient neural network classifiers
  • Author
    Hampshire, J.B., II; Kumar, B.V.K. Vijaya

  • Author_Institution
    Dept. of Electr. & Comput. Eng., Carnegie Mellon Univ., Pittsburgh, PA, USA
  • Volume
    1
  • fYear
    1993
  • fDate
    27-30 April 1993
  • Firstpage
    613
  • Abstract
    The authors outline a differential theory of learning for statistical pattern classification. The theory is based on classification figure-of-merit (CFM) objective functions, described by J. P. Hampshire II and A. H. Waibel (IEEE Trans. Neural Netw., vol. 1, no. 2, p. 216-218, June 1990). They prove that differential learning is efficient, requiring the least classifier complexity and the smallest training sample size necessary to achieve Bayesian (i.e., minimum error) discrimination. A practical application of the theory is included in which a simple differentially trained linear neural network classifier discriminates handwritten digits of the AT&T DB1 database with a 1.3% error rate. This error rate is less than one half of the best previous result for a linear classifier on this optical character recognition (OCR) task.
  • Keywords
    Bayes methods; computational complexity; learning (artificial intelligence); neural nets; optical character recognition; Bayesian discrimination; IEEE Trans. Neural Netw.; classifier complexity; differential theory of learning; error rate; handwritten digits; linear neural network classifier; objective functions; optical character recognition; statistical pattern classification; training sample size;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Acoustics, Speech, and Signal Processing, 1993. ICASSP-93., 1993 IEEE International Conference on
  • Conference_Location
    Minneapolis, MN, USA
  • ISSN
    1520-6149
  • Print_ISBN
    0-7803-7402-9
  • Type
    conf
  • DOI
    10.1109/ICASSP.1993.319193
  • Filename
    319193