Title :
Gradient descent in feed-forward networks with binary neurons
Author :
Costa, Mario ; Palmisano, Davide ; Pasero, Eros
Author_Institution :
Dept. of Electronics, Politecnico di Torino, Italy
Abstract :
In this paper we show how the familiar concept of gradient descent can be extended to the presence of binary neurons. The procedure we devised formally operates on generic feedforward networks of logistic-like neurons whose activations are re-scaled by an arbitrarily large gauge. Whereas the magnitude of the gradient decays exponentially with increasing values of the gauge, the sign of each of its components eventually settles to a constant value. These constant values can be computed by means of a “twin” network of binary neurons. This allows the application of any “Manhattan” (sign-based) training algorithm, such as resilient propagation.
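The following is a minimal sketch, not the authors' construction, of the idea summarized above: a logistic activation re-scaled by a gauge beta, trained with a Manhattan-style (sign-only) weight update. The gauge value, learning rate, toy data, and single-neuron setup are assumptions introduced purely for illustration; the paper's "twin" binary network is only mirrored here by the hard-threshold check at the end.

```python
# Sketch (assumed, not the authors' method): a single logistic neuron whose
# activation is re-scaled by a gauge beta, trained with a Manhattan-style
# (sign-based) update. As beta grows, the gradient magnitude shrinks, but its
# sign settles, so stepping by a fixed amount in the sign direction still works.
import numpy as np

rng = np.random.default_rng(0)

def logistic(z, beta):
    """Logistic activation re-scaled by the gauge beta (hypothetical form)."""
    return 1.0 / (1.0 + np.exp(-beta * z))

# Toy data: 2-input AND-like binary targets (assumed for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([0, 0, 0, 1], dtype=float)

beta = 20.0   # large gauge: the activation approaches a hard threshold
eta = 0.05    # fixed Manhattan step size (assumption)
w = rng.normal(size=2)
b = 0.0

for epoch in range(200):
    z = X @ w + b
    y = logistic(z, beta)
    # Gradient of the squared error; its magnitude decays as beta grows,
    # but only its sign is used in the update below.
    delta = (y - t) * beta * y * (1.0 - y)
    grad_w = X.T @ delta
    grad_b = delta.sum()
    # Manhattan update: move each weight by a fixed step against the gradient sign.
    w -= eta * np.sign(grad_w)
    b -= eta * np.sign(grad_b)

# Hard-threshold ("binary neuron") readout, i.e. the beta -> infinity limit.
binary_out = (X @ w + b > 0).astype(float)
print("binary outputs:", binary_out, "targets:", t)
```

Resilient propagation (RPROP) replaces the fixed step eta above with a per-weight step size that is adapted from the history of gradient signs, which is why any such sign-based scheme fits the setting described in the abstract.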
Keywords :
feedforward neural nets; gradient methods; learning (artificial intelligence); Manhattan training algorithm; binary neurons; feedforward networks; gradient descent; logistic-like neurons; resilient propagation; twin neural network; Computer architecture; Computer networks; Cost function; Feedforward systems; Intelligent networks; Logistics; Neurons; Parallel processing; Signal resolution; Turning;
Conference_Title :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como
Print_ISBN :
0-7695-0619-4
DOI :
10.1109/IJCNN.2000.857854