Title :
Neural nonlinear classifiers with synaptic weight commitment
Author :
Diamantaras, Konstantinos I. ; Strintzis, Michael G.
Author_Institution :
Dept. of Electr. & Comput. Eng., Aristotelian Univ. of Thessaloniki, Greece
Abstract :
We develop a theory for immediately detecting patterns violating linear separability in a training set, as soon as they are presented to the classifier. The theory is based on the computation of the solution cone in the weight space of a single linear threshold unit (LTU). The separability-violating patterns can be skipped and, in the end, we obtain a linearly separable subset of the original training set together with the subset's solution cone. We further propose an iterative algorithm for computing the solution cone and we introduce the concept of garbage collection for removing redundancy from the cone representation. The algorithm can be implemented by a novel neural network architecture where the synaptic weights have constant values throughout the life of the synapse. We show that by combining multiple such models we can construct a larger network capable of learning the classification boundaries of convex classes, therefore solving an important class of nonlinearly separable problems.
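To illustrate the idea behind the abstract, here is a minimal sketch of the solution-cone test in the special case of a 2-D LTU with a boundary through the origin. This is not the paper's algorithm; it only shows the underlying principle: each signed pattern v = y*x restricts the weight cone to the half-plane {w : w·v > 0}, which in 2-D is an angular interval, and a pattern violates linear separability exactly when the intersection of these intervals becomes empty. All function names and the interval representation are assumptions for this sketch.

```python
import math

def filter_separable(patterns):
    """Incrementally keep 2-D labeled patterns (x, y) with y in {+1, -1},
    skipping any pattern that would make the kept subset non linearly
    separable by a line through the origin.

    The solution cone of weight directions is tracked as an open angular
    interval (lo, hi); each signed pattern v = y*x allows only weight
    angles within 90 degrees of the angle of v.
    """
    lo, hi = None, None                      # current cone (open interval)
    kept, skipped = [], []
    for x, y in patterns:
        vx, vy = y * x[0], y * x[1]          # signed pattern v = y*x
        t = math.atan2(vy, vx)               # direction of v
        a, b = t - math.pi / 2, t + math.pi / 2  # allowed weight angles
        if lo is None:                       # first pattern: cone is (a, b)
            lo, hi = a, b
            kept.append((x, y))
            continue
        # shift (a, b) by a multiple of 2*pi so it lies nearest the cone
        mid = (lo + hi) / 2
        shift = round((mid - t) / (2 * math.pi)) * 2 * math.pi
        a, b = a + shift, b + shift
        nlo, nhi = max(lo, a), min(hi, b)
        if nlo < nhi:                        # cone stays non-empty: keep
            lo, hi = nlo, nhi
            kept.append((x, y))
        else:                                # separability violator: skip
            skipped.append((x, y))
    return kept, skipped
```

For example, the patterns ((1,0),+1), ((0,1),+1), ((-1,0),-1) are jointly separable and are all kept, while adding ((-1,-1),+1) empties the cone and that pattern is skipped, mirroring the "skip the violators, return a separable subset plus its cone" behavior described above.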
Keywords :
neural nets; pattern classification; convex class; garbage collection; iterative algorithm; linear separability violation; linear threshold unit; neural network; nonlinear classifier; pattern detection; redundancy; solution cone; synaptic weight commitment; training set; Biological system modeling; Complex networks; Computer architecture; Evolution (biology); Hospitals; Information systems; Iterative algorithms; Neural networks; Robustness; Vectors;
Conference_Title :
Proceedings of the 1997 IEEE International Symposium on Circuits and Systems (ISCAS '97)
Print_ISBN :
0-7803-3583-X
DOI :
10.1109/ISCAS.1997.608914