DocumentCode :
1277741
Title :
Relative loss bounds for single neurons
Author :
Helmbold, David P. ; Kivinen, Jyrki ; Warmuth, Manfred K.
Author_Institution :
Dept. of Comput. Sci., California Univ., Santa Cruz, CA, USA
Volume :
10
Issue :
6
fYear :
1999
fDate :
11/1/1999 12:00:00 AM
Firstpage :
1291
Lastpage :
1304
Abstract :
We analyze and compare the well-known gradient descent algorithm and the more recent exponentiated gradient algorithm for training a single neuron with an arbitrary transfer function. Both algorithms are easily generalized to larger neural networks, and the generalization of gradient descent is the standard backpropagation algorithm. We prove worst-case loss bounds for both algorithms in the single neuron case. Since local minima make it difficult to prove worst-case bounds for gradient-based algorithms, we must use a loss function that prevents the formation of spurious local minima. We define such a matching loss function for any strictly increasing differentiable transfer function and prove worst-case loss bounds for any such transfer function and its corresponding matching loss. The different forms of the two algorithms' bounds indicate that exponentiated gradient outperforms gradient descent when the inputs contain a large number of irrelevant components. Simulations on synthetic data confirm these analytical results.
Keywords :
backpropagation; generalisation (artificial intelligence); gradient methods; neural nets; statistical analysis; transfer functions; backpropagation; exponentiated gradient algorithm; generalization; gradient descent algorithm; linear regression; matching loss; matching loss function; neural networks; relative loss bounds; sigmoid transfer function; worst-case loss bounds; Algorithm design and analysis; Backpropagation algorithms; Computer science; Input variables; Linear regression; Logistics; Neural networks; Neurons; Transfer functions; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.809075
Filename :
809075