DocumentCode :
2259646
Title :
Gradient descent in feed-forward networks with binary neurons
Author :
Costa, Mario ; Palmisano, Davide ; Pasero, Eros
Author_Institution :
Dipt. di Elettronica, Politecnico di Torino, Italy
Volume :
1
fYear :
2000
fDate :
2000
Firstpage :
311
Abstract :
In this paper we show how the familiar concept of gradient descent can be extended to feed-forward networks with binary neurons. The procedure we devised formally operates on generic feed-forward networks of logistic-like neurons whose activations are re-scaled by an arbitrarily large gauge. While the gradient itself decays exponentially as the gauge increases, the sign of each of its components eventually settles to a constant value. These signs are computed by means of a "twin" network of binary neurons. This allows the application of any "Manhattan" training algorithm, such as resilient propagation.
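Illustrative sketch (not from the paper itself): the update idea described above can be mimicked with a gauge-rescaled logistic network trained by a Manhattan-style rule that uses only the sign of each gradient component. The paper obtains those signs from a "twin" network of binary neurons; in this sketch, purely for illustration, they are taken from the ordinary back-propagated gradient at a large but finite gauge beta. All names and hyper-parameters below are assumptions, not the authors' settings.

import numpy as np

rng = np.random.default_rng(0)

def logistic(x, beta):
    # Logistic activation re-scaled by the gauge beta; as beta grows,
    # it approaches a binary (step) neuron.
    return 1.0 / (1.0 + np.exp(-beta * x))

def forward(W1, b1, W2, b2, X, beta):
    h = logistic(X @ W1 + b1, beta)   # hidden activations
    y = logistic(h @ W2 + b2, beta)   # output activations
    return h, y

def gradients(W1, b1, W2, b2, X, T, beta):
    # Back-propagated gradient of the squared error; only its sign is used.
    h, y = forward(W1, b1, W2, b2, X, beta)
    dy = (y - T) * beta * y * (1 - y)       # delta at the output layer
    dh = (dy @ W2.T) * beta * h * (1 - h)   # delta at the hidden layer
    return X.T @ dh, dh.sum(0), h.T @ dy, dy.sum(0)

# Toy data: XOR, a classic task that requires a hidden layer.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

beta = 8.0    # gauge: large enough that the neurons are nearly binary
step = 0.02   # fixed Manhattan step (Rprop would adapt it per weight)

for epoch in range(2000):
    g1, gb1, g2, gb2 = gradients(W1, b1, W2, b2, X, T, beta)
    # Manhattan update: move each parameter by a fixed step against the
    # gradient's sign, ignoring its (exponentially small) magnitude.
    W1 -= step * np.sign(g1); b1 -= step * np.sign(gb1)
    W2 -= step * np.sign(g2); b2 -= step * np.sign(gb2)

_, y = forward(W1, b1, W2, b2, X, beta)
print(np.round(y, 2))   # typically approaches the XOR targets [[0],[1],[1],[0]]

Because only the sign of each component is used, the exponential shrinkage of the gradient magnitude at large beta does not stall training, which is the property that makes sign-based ("Manhattan") rules such as resilient propagation applicable here.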
Keywords :
feedforward neural nets; gradient methods; learning (artificial intelligence); Manhattan training algorithm; binary neurons; feedforward networks; gradient descent; logistic-like neurons; resilient propagation; twin neural network; Computer architecture; Computer networks; Cost function; Feedforward systems; Intelligent networks; Logistics; Neurons; Parallel processing; Signal resolution; Turning;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
ISSN :
1098-7576
Print_ISBN :
0-7695-0619-4
Type :
conf
DOI :
10.1109/IJCNN.2000.857854
Filename :
857854