DocumentCode :
288766
Title :
Classification by balanced, linearly separable representation
Author :
Baram, Yoram
Author_Institution :
Dept. of Comput. Sci., Technion-Israel Inst. of Technol., Haifa, Israel
Volume :
5
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
3032
Abstract :
Classifiers for binary and for real-valued data, consisting of a single internal layer of spherical threshold cells, are completely defined by two fundamental requirements: linear separability of the internal representations, which defines the cells' activation threshold, and input-space covering, which defines the minimal number of cells required. Class assignments are learnt by applying Rosenblatt's learning rule to the internal representations, which are balanced, having equally probable bit values. The separation capacity may be increased by increasing the number of cells, at a possible cost in generalization. Our analysis extends to the classification of binarized symbolic (or enumerated) data and explains an empirical observation made in the literature on the separability of such data.
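The following is a minimal sketch of how such a classifier could be realized, assuming one plausible reading of the abstract: each internal spherical threshold cell outputs 1 when the input falls within a fixed radius of its centre, and Rosenblatt's perceptron rule is then trained on the resulting binary internal representation. The names spherical_layer and rosenblatt_train, and the choice of centres and radius, are illustrative assumptions, not the paper's exact construction.

import numpy as np

def spherical_layer(X, centers, radius):
    # Binary internal representation: bit j is 1 iff ||x - c_j|| <= radius.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return (d <= radius).astype(float)

def rosenblatt_train(H, y, epochs=100, lr=1.0):
    # Rosenblatt's perceptron rule applied to the internal representations H;
    # it converges when those representations are linearly separable.
    w, b = np.zeros(H.shape[1]), 0.0
    for _ in range(epochs):
        errors = 0
        for h, t in zip(H, y):            # class labels t in {-1, +1}
            if t * (h @ w + b) <= 0:      # misclassified example
                w += lr * t * h
                b += lr * t
                errors += 1
        if errors == 0:
            break
    return w, b

# Illustrative usage (assumed setup): centres placed at the training points so
# the spheres cover the sampled input space; the radius is set to the median
# pairwise distance so roughly half the bits fire, giving a balanced
# (equally probable bit values) representation.
X = np.random.randn(200, 2)
y = np.where(X[:, 0] * X[:, 1] > 0, 1, -1)    # not linearly separable in input space
radius = np.median(np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2))
H = spherical_layer(X, centers=X, radius=radius)
w, b = rosenblatt_train(H, y)
print("training accuracy:", np.mean(np.sign(H @ w + b) == y))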
Keywords :
learning (artificial intelligence); neural nets; pattern classification; probability; symbol manipulation; Rosenblatt learning rule; binary data classification; class assignments; generalization; internal layer; linear separability; linearly separable representation; real-valued data classification; spherical threshold cells; Computer architecture; Costs; Learning systems; NASA; Probability; Space technology; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Title :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374716
Filename :
374716