Title :
Growing subspace pattern recognition methods and their neural-network models
Author :
Prakash, Mangal ; Murty, M. Narasimha
Author_Institution :
Dept. of Comput. Sci. & Autom., Indian Inst. of Sci., Bangalore, India
Date :
1/1/1997 12:00:00 AM
Abstract :
In statistical pattern recognition, the choice of which features to use is usually left to human judgment; where possible, automatic methods are preferable. Like multilayer perceptrons, learning subspace methods (LSMs) have the potential to integrate feature extraction and classification. In this paper, we propose two new algorithms, along with their neural-network implementations, to overcome certain limitations of earlier LSMs. By introducing one cluster at a time and adapting it as necessary, we remove the need to fix the number of clusters per class by trial and error. By combining this strategy with principal component analysis neural networks, we obtain neural-network models that also address a second limitation, scalability. Our results indicate that the proposed classifiers are comparable to classifiers such as the multilayer perceptron and the nearest-neighbor classifier in classification accuracy, while appearing superior in classification speed and design scalability for large-dimensional problems.
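To make the subspace idea concrete, the following is a minimal sketch of a classical (CLAFIC-style) subspace classifier, the baseline family the LSMs in this paper improve upon: each class is represented by the span of the top-k eigenvectors of its autocorrelation matrix, and a sample is assigned to the class whose subspace captures the most of its energy. This is illustrative only; it is not the paper's proposed growing algorithm, and the function names are hypothetical.

```python
import numpy as np

def fit_subspaces(X, y, k):
    """Fit one k-dimensional PCA subspace per class (CLAFIC-style).

    Returns a dict mapping class label -> (k x d) matrix whose rows are
    an orthonormal basis of that class's subspace."""
    bases = {}
    for c in np.unique(y):
        Xc = X[y == c]
        # Class autocorrelation matrix; its leading eigenvectors span the subspace.
        R = Xc.T @ Xc / len(Xc)
        vals, vecs = np.linalg.eigh(R)   # eigenvalues in ascending order
        bases[c] = vecs[:, -k:].T        # keep the top-k eigenvectors as rows
    return bases

def classify(x, bases):
    """Assign x to the class maximizing the squared projection norm
    ||P_c x||^2 = sum_j (u_j . x)^2 over that class's basis vectors u_j."""
    scores = {c: np.sum((U @ x) ** 2) for c, U in bases.items()}
    return max(scores, key=scores.get)
```

In this formulation the number of subspaces per class is fixed in advance; the growing methods proposed in the paper instead introduce clusters one at a time, so that count need not be guessed beforehand.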
Keywords :
feature extraction; neural nets; pattern classification; statistical analysis; classification speed; large-dimensional problems; learning subspace methods; multilayer perceptrons; nearest-neighbor classifier; neural-network models; principal component analysis neural networks; scalability; statistical pattern recognition; subspace pattern recognition methods; Artificial neural networks; Character recognition; Feature extraction; Learning systems; Multi-layer neural network; Multilayer perceptrons; Neural networks; Pattern recognition; Principal component analysis; Scalability;
Journal_Title :
Neural Networks, IEEE Transactions on