DocumentCode :
1903287
Title :
Some notes on perceptron learning
Author :
Budinich, Marco
Author_Institution :
Dipartimento di Fisica, Trieste Univ., Italy
fYear :
1993
fDate :
1993
Firstpage :
371
Abstract :
Using a geometrical approach to the perceptron, it is shown that, given n examples, learning is of maximal difficulty when the number of inputs d is such that n=5d. A modified perceptron algorithm that takes advantage of the peculiarities of the cost function is presented; it is more than twice as fast as the standard algorithm. It has no fixed parameters, such as the usual learning constant η, but adapts them to the cost function. It is shown that there exists an optimal choice for β, the steepness of the transfer function. A brief systematic study of the parameters η and β of the standard perceptron algorithm is presented.
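The abstract contrasts the paper's adaptive variant with the standard perceptron algorithm, whose behavior is governed by the learning constant η and the transfer-function steepness β. As a point of reference, a minimal sketch of that standard algorithm (batch gradient descent on a quadratic cost with a tanh transfer function; the function and variable names are illustrative, not from the paper, and this is not the paper's modified adaptive algorithm):

```python
import numpy as np

def sigmoid(x, beta):
    """Transfer function with steepness beta: g(x) = tanh(beta * x)."""
    return np.tanh(beta * x)

def train_perceptron(X, y, eta=0.1, beta=1.0, epochs=200):
    """Standard gradient-descent perceptron on a quadratic cost.

    eta (learning constant) and beta (transfer-function steepness) are the
    two fixed parameters whose effect the paper studies systematically.
    """
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.1, size=X.shape[1])  # small random initial weights
    for _ in range(epochs):
        out = sigmoid(X @ w, beta)
        err = y - out                       # residual on each example
        grad_out = beta * (1.0 - out**2)    # derivative of tanh(beta * x)
        w += eta * X.T @ (err * grad_out)   # batch step on 0.5 * sum(err^2)
    return w

# Usage: learn the AND function with a bias input (first column of X).
X = np.array([[1, -1, -1], [1, -1, 1], [1, 1, -1], [1, 1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1], dtype=float)
w = train_perceptron(X, y, eta=0.1, beta=1.0)
```

In this fixed-parameter form, both η and β must be chosen by hand; the paper's point is that the shape of the cost function can instead be used to adapt them during learning.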
Keywords :
learning (artificial intelligence); neural nets; transfer functions; cost function; optimal choice; perceptron algorithm; perceptron learning; Error correction; Performance gain; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
Type :
conf
DOI :
10.1109/ICNN.1993.298585
Filename :
298585