DocumentCode :
910780
Title :
Adaptive Ho-Kashyap rules for perceptron training
Author :
Hassoun, Mohamad H. ; Song, Jing
Author_Institution :
Dept. of Electr. & Comput. Eng., Wayne State Univ., Detroit, MI, USA
Volume :
3
Issue :
1
fYear :
1992
fDate :
1/1/1992
Firstpage :
51
Lastpage :
61
Abstract :
Three adaptive versions of the Ho-Kashyap perceptron training algorithm are derived based on gradient descent strategies. These adaptive Ho-Kashyap (AHK) training rules are comparable in complexity to the LMS and perceptron training rules; they adaptively form linear discriminant surfaces that guarantee linear separability and position those surfaces for maximal classification robustness. In particular, the derived AHK II rule adaptively identifies critical input vectors lying close to class boundaries in linearly separable problems. The authors extend this algorithm as AHK III, which adds fast convergence to linear discriminant surfaces that are good approximations for nonlinearly separable problems. This is achieved by a simple built-in unsupervised strategy that adaptively grades and discards the input vectors causing nonseparability. Performance comparisons with LMS and perceptron training are presented.
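The abstract describes per-sample gradient-descent rules that adapt both the weight vector and per-pattern margins. As a rough illustration only, and not the paper's exact AHK I/II/III updates, here is a minimal Python sketch of an adaptive Ho-Kashyap-style rule; the function name ahk_train, the learning rates rho_w and rho_b, and the toy data are all assumptions made for this sketch:

```python
import numpy as np

def ahk_train(X, labels, rho_w=0.1, rho_b=0.05, epochs=100, b_init=1.0):
    """Illustrative per-sample adaptive Ho-Kashyap-style rule (a sketch,
    not the paper's exact AHK updates)."""
    # Augment patterns with a bias term and class-normalize by the +/-1
    # labels, so correct classification means w @ y > 0.
    Y = np.hstack([X, np.ones((len(X), 1))]) * labels[:, None]
    w = np.zeros(Y.shape[1])
    b = np.full(len(Y), b_init)      # per-pattern margins, kept positive
    for _ in range(epochs):
        for k in range(len(Y)):
            e = Y[k] @ w - b[k]      # error against the current margin
            # Margin step: e + |e| vanishes when e < 0, so b[k] can only
            # grow and every margin stays positive, as in Ho-Kashyap.
            b[k] += rho_b * (e + abs(e))
            # Weight step: LMS-style gradient descent on the squared
            # error against the freshly updated margin.
            w += rho_w * (b[k] - Y[k] @ w) * Y[k]
    return w, b

# Toy usage on a linearly separable AND-like problem (hypothetical data).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
labels = np.array([-1., -1., -1., 1.])
w, b = ahk_train(X, labels)
```

The design point this sketch tries to capture is the one the abstract emphasizes: unlike plain LMS, which regresses onto fixed targets, the margins b are themselves adapted under a positivity constraint, so the rule can trade error correction against margin placement.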
Keywords :
adaptive systems; learning systems; neural nets; AHK II; Ho-Kashyap rules; adaptive versions; class boundaries; critical input vectors; discarding; fast convergence; gradient descent strategies; linear discriminant surfaces; linear separability; perceptron training; unsupervised strategy; Convergence; Error analysis; Error correction; Least squares approximation; Linear approximation; Robustness; Signal design; Signal mapping; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.105417
Filename :
105417