• DocumentCode
    1816636
  • Title
    The geometrical learning of multi-layer artificial neural networks with guaranteed convergence
  • Author
    Kim, Jung H.; Park, Sung-Kwon

  • Author_Institution
    Center for Adv. Comput. Studies, Univ. of Southwestern Louisiana, Lafayette, LA, USA
  • Volume
    1
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    871
  • Abstract
    A learning algorithm called geometrical expanding learning (GEL) is proposed to train multilayer artificial neural networks (ANNs) with guaranteed convergence for an arbitrary function in a binary field. It is noted that no learning algorithm that guarantees convergence has yet been found for a three-layer ANN. The most significant contribution of the proposed research is a learning algorithm for multilayer ANNs that guarantees convergence and automatically determines the required number of neurons. For binary-field problems, the learning speed of the proposed GEL algorithm is much faster than that of the backpropagation learning algorithm. (An illustrative sketch of the binary-field learning task follows this record.)
  • Keywords
    feedforward neural nets; learning (artificial intelligence); arbitrary function; binary field; geometrical expanding learning; geometrical learning; guaranteed convergence; multilayer artificial neural networks; neurons; Artificial neural networks; Backpropagation algorithms; Convergence; Hardware; Input variables; Neurons; Out of order; Power line communications; Very large scale integration;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.287077
  • Filename
    287077
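
The abstract above does not reproduce the GEL procedure itself, so the following minimal Python sketch is purely illustrative: it trains a small multilayer network with plain backpropagation (the baseline the abstract compares against) on an arbitrary binary-field target, here 3-bit parity. The network width, learning rate, epoch count, and the choice of parity as the target are all assumptions made for this example; none of them come from the paper, and this is not the GEL algorithm.

    import numpy as np

    rng = np.random.default_rng(0)

    # Target task: an arbitrary Boolean (binary-field) function of 3 inputs.
    # 3-bit parity is used here only as an example; GEL claims convergence for
    # any such binary function.
    X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
    y = (X.sum(axis=1) % 2).reshape(-1, 1)

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # One hidden layer with a hand-picked width; per the abstract, GEL would
    # instead determine the required number of neurons automatically.
    W1 = rng.normal(size=(3, 6)); b1 = np.zeros(6)
    W2 = rng.normal(size=(6, 1)); b2 = np.zeros(1)
    lr = 0.5

    for _ in range(20000):
        h = sigmoid(X @ W1 + b1)          # hidden activations
        out = sigmoid(h @ W2 + b2)        # network output in (0, 1)
        # Backpropagation of squared error; no convergence guarantee applies
        # here, which is the gap the GEL algorithm is said to close.
        d_out = (out - y) * out * (1.0 - out)
        d_h = (d_out @ W2.T) * h * (1.0 - h)
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_h);  b1 -= lr * d_h.sum(axis=0)

    print("learned:", np.round(out.ravel(), 2))
    print("target: ", y.ravel())

With this seed and these settings, backpropagation typically drives the outputs toward the parity targets, but nothing guarantees it; the abstract's claim is that GEL removes exactly that uncertainty for binary-field problems while also sizing the hidden layer automatically.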