DocumentCode :
1242173
Title :
The geometrical learning of binary neural networks
Author :
Kim, Jung H. ; Park, Sung-Kwon
Author_Institution :
Center for Adv. Comput. Studies, Southwestern Louisiana Univ., Lafayette, LA, USA
Volume :
6
Issue :
1
fYear :
1995
fDate :
1/1/1995
Firstpage :
237
Lastpage :
247
Abstract :
In this paper, a learning algorithm called expand-and-truncate learning (ETL) is proposed to train multilayer binary neural networks (BNNs) with guaranteed convergence for any binary-to-binary mapping. The most significant contribution of this paper is a learning algorithm for three-layer BNNs that guarantees convergence while automatically determining the required number of neurons in the hidden layer. Furthermore, the learning speed of the proposed ETL algorithm is much faster than that of the backpropagation learning algorithm in a binary field. Neurons in the proposed BNN employ a hard-limiter activation function with only integer weights and integer thresholds, which greatly facilitates hardware implementation of the proposed BNN using currently available digital VLSI technology.
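As a minimal sketch of the neuron model the abstract describes (not the ETL training procedure itself), the following assumes a single binary unit with integer weights, an integer threshold, and a hard-limiter activation, so it maps binary inputs to a binary output; the weight and threshold values are illustrative, not taken from the paper:

```python
def hard_limiter_neuron(inputs, weights, threshold):
    """Output 1 iff the integer-weighted sum of binary inputs
    reaches the integer threshold (hard-limiter activation)."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

# Illustrative integer weights and threshold realizing one
# binary-to-binary mapping on 3-bit inputs.
print(hard_limiter_neuron([1, 0, 1], [2, -1, 1], 2))  # weighted sum 3, outputs 1
print(hard_limiter_neuron([0, 1, 0], [2, -1, 1], 2))  # weighted sum -1, outputs 0
```

Because every quantity here is an integer and the activation is a simple comparison, such a unit maps directly onto digital hardware, which is the implementation advantage the abstract highlights.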
Keywords :
convergence; learning (artificial intelligence); neural nets; binary neural networks; binary-to-binary mapping; digital VLSI technology; expand-and-truncate learning; geometrical learning; guaranteed convergence; hard-limiter activation function; hidden layer; Convergence; Guidelines; Hardware; Multi-layer neural network; Neural networks; Neurons; Power line communications; Space technology; Very large scale integration;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.363432
Filename :
363432