Title :
The convergent learning of three-layer artificial neural networks for any binary-to-binary mapping
Author :
Kim, In Sook ; Park, Sung-Kwon
Author_Institution :
Center for Adv. Comput. Studies, Southwestern Louisiana Univ., Lafayette, LA, USA
Date :
27 Jun-2 Jul 1994
Abstract :
In this paper, a learning algorithm called expand-and-truncate learning (ETL) is proposed to train three-layer binary neural networks (BNNs) with guaranteed convergence for any binary-to-binary mapping. The most significant contribution of this paper is the development of a learning algorithm for three-layer BNNs that guarantees convergence while automatically determining the required number of neurons in the hidden layer. Furthermore, the learning speed of the proposed ETL algorithm is much faster than that of the backpropagation learning algorithm in a binary field. Neurons in the proposed BNN employ a hard-limiter activation function with only integer weights and integer thresholds, which greatly facilitates hardware implementation of the proposed BNN using currently available digital VLSI technology.
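The neuron model described in the abstract (hard-limiter activation, integer weights, integer threshold) can be illustrated with a minimal sketch. This is not the ETL training procedure itself, which constructs the hidden layer; it only shows the kind of unit the trained network is built from, with hypothetical example parameters.

```python
def hard_limiter_neuron(inputs, weights, threshold):
    """Binary neuron: output 1 iff the integer weighted sum
    of the binary inputs reaches the integer threshold."""
    s = sum(x * w for x, w in zip(inputs, weights))
    return 1 if s >= threshold else 0

# Illustrative parameters (not from the paper): a neuron realizing
# 2-input AND with integer weights [1, 1] and threshold 2.
print(hard_limiter_neuron([1, 1], [1, 1], 2))  # 1
print(hard_limiter_neuron([1, 0], [1, 1], 2))  # 0
```

Because all quantities are small integers and the activation is a simple comparison, such a unit maps directly onto digital VLSI, which is the hardware advantage the abstract emphasizes.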
Keywords :
convergence; feedforward neural nets; learning (artificial intelligence); binary-to-binary mapping; convergent learning; expand-and-truncate learning; hard-limiter activation function; integer thresholds; integer weights; learning algorithm; three-layer neural networks; Artificial neural networks; Convergence; Guidelines; Hardware; Neural networks; Neurons; Power line communications; Space technology; Very large scale integration;
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
DOI :
10.1109/ICNN.1994.375044