• DocumentCode
    423649
  • Title
    On learning a function of perceptrons
  • Author
    Anthony, Martin
  • Author_Institution
    Dept. of Math., London Sch. of Econ., UK
  • Volume
    2
  • fYear
    2004
  • fDate
    25-29 July 2004
  • Firstpage
    967
  • Abstract
    This paper concerns the generalization accuracy when training a classifier that is a fixed Boolean function of the outputs of a number of perceptrons. The analysis involves the 'margins' achieved by the constituent perceptrons on the training data. A special case is that in which the fixed Boolean function is the majority function (where we have a 'committee of perceptrons'). Recent work of Auer et al. studied the computational properties of such networks (where they were called 'parallel perceptrons') and proposed an incremental learning algorithm for them. The results given here provide further motivation for the use of this learning rule. (A sketch of the committee-of-perceptrons architecture follows this record.)
  • Keywords
    Boolean functions; error analysis; generalisation (artificial intelligence); learning (artificial intelligence); multilayer perceptrons; pattern classification; Boolean function; classifier training; generalization error bounds; incremental learning algorithm; parallel perceptrons; Artificial neural networks; Boolean functions; Circuits; Computer networks; Concurrent computing; Mathematics; Training data;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 2004 IEEE International Joint Conference on Neural Networks (IJCNN 2004)
  • ISSN
    1098-7576
  • Print_ISBN
    0-7803-8359-1
  • Type
    conf
  • DOI
    10.1109/IJCNN.2004.1380064
  • Filename
    1380064
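
The classifier described in the abstract is a fixed Boolean function applied to the outputs of several perceptrons; when that Boolean function is majority, the result is a committee of perceptrons. The Python (numpy) sketch below is only an illustration of that architecture under assumed parameters: the class name, number of units, learning rate, and the simple perceptron-style incremental update are all hypothetical choices and do not reproduce the parallel-perceptron learning rule of Auer et al. analysed in the paper.

    # Minimal sketch (not the paper's algorithm) of a "committee of
    # perceptrons": k linear threshold units whose +/-1 outputs are
    # combined by majority vote.  The update is a generic perceptron-style
    # incremental rule applied to units that voted against the label.
    import numpy as np

    class PerceptronCommittee:
        def __init__(self, n_features, n_units=3, learning_rate=0.1, seed=0):
            rng = np.random.default_rng(seed)
            # One weight vector (with bias) per constituent perceptron.
            self.W = rng.normal(scale=0.01, size=(n_units, n_features + 1))
            self.lr = learning_rate

        def _augment(self, X):
            # Append a constant 1 so the bias is part of each weight vector.
            return np.hstack([X, np.ones((X.shape[0], 1))])

        def unit_outputs(self, X):
            # +/-1 output of each constituent perceptron on each example.
            return np.sign(self._augment(X) @ self.W.T)

        def predict(self, X):
            # Majority vote over the constituent perceptrons.
            return np.sign(self.unit_outputs(X).sum(axis=1))

        def fit(self, X, y, epochs=50):
            Xa = self._augment(X)
            for _ in range(epochs):
                for xi, yi in zip(Xa, y):
                    votes = np.sign(self.W @ xi)
                    if np.sign(votes.sum()) != yi:
                        # Committee is wrong: nudge every unit that voted
                        # against the label toward the correct side.
                        wrong = votes != yi
                        self.W[wrong] += self.lr * yi * xi
            return self

    # Usage: a toy linearly separable problem with labels in {-1, +1}.
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 2))
        y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
        clf = PerceptronCommittee(n_features=2, n_units=3).fit(X, y)
        print("training accuracy:", (clf.predict(X) == y).mean())

In this sketch a point is classified +1 when a majority of the three units output +1; the margins analysed in the paper correspond, roughly, to how far each constituent perceptron's weighted sum lies from its threshold on the training data.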