  • DocumentCode
    285138
  • Title
    Growing layers of perceptrons: introducing the Extentron algorithm
  • Author
    Baffes, Paul T.; Zelle, John M.
  • Author_Institution
    Dept. of Comput. Sci., Texas Univ., Austin, TX, USA
  • Volume
    2
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    392
  • Abstract
    Two observations about perceptrons are presented: when the perceptron learning algorithm cycles among hyperplanes, the hyperplanes can be compared in order to select the one that gives the best split of the examples; and it is always possible for the perceptron to build a hyperplane that separates at least one example from all the rest. The authors describe the Extentron, which grows multi-layer networks capable of distinguishing nonlinearly separable data using the simple perceptron rule for linear threshold units. The resulting algorithm is simple, very fast, scales well to large problems, retains the convergence properties of the perceptron, and can be completely specified using only two parameters. Results are presented comparing the Extentron to other neural network paradigms and to symbolic learning systems.
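    A minimal Python sketch (assuming NumPy) of the "best split" observation
    above: while the perceptron rule cycles among hyperplanes on nonseparable
    data, each candidate weight vector can be scored and the best one retained,
    a pocket-style strategy. The function name, parameters, and the {-1, +1}
    label convention are illustrative assumptions; this is not the authors'
    published Extentron, which additionally grows new layers from such units.

    ```python
    import numpy as np

    def best_split_perceptron(X, y, epochs=100, lr=1.0, seed=0):
        """Perceptron rule for one linear threshold unit, keeping the
        hyperplane (weight vector) that best splits the examples.

        X: (n, d) float inputs; y: (n,) labels in {-1, +1}.
        Returns the best weights (bias absorbed) and its correct count.
        """
        rng = np.random.default_rng(seed)
        Xb = np.hstack([X, np.ones((len(X), 1))])  # absorb bias as extra input
        w = rng.normal(scale=0.01, size=Xb.shape[1])
        best_w, best_correct = w.copy(), 0
        for _ in range(epochs):
            for xi, yi in zip(Xb, y):
                if yi * (w @ xi) <= 0:             # misclassified example
                    w = w + lr * yi * xi           # simple perceptron update
            correct = int(np.sum(np.where(Xb @ w > 0, 1, -1) == y))
            if correct > best_correct:             # keep the best split seen
                best_correct, best_w = correct, w.copy()
        return best_w, best_correct

    # XOR is not linearly separable, so no single hyperplane gets all four
    # examples right; the best split found should classify three of four.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([-1, 1, 1, -1])
    w, n_correct = best_split_perceptron(X, y)
    print(n_correct, "of", len(y), "correctly split")
    ```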
  • Keywords
    learning (artificial intelligence); neural nets; Extentron algorithm; best split; convergence properties; hyperplanes; learning algorithm; linear threshold units; multilayer networks; nonlinearly separable data; perceptrons; symbolic learning systems; Convergence; Joining processes; Learning systems; Multilayer perceptrons; Network topology; Neural networks
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.226956
  • Filename
    226956