Title :
Data classification with a generalized Gaussian components based density estimation algorithm
Author :
Hsieh, Chih-Hung ; Chang, Darby Tien-Hao ; Oyang, Yen-Jen
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Abstract :
Data classification is an intensively studied machine learning problem, and classification algorithms fall into two major categories: the logic-based and the kernel-based. Logic-based classifiers, such as the decision tree and the rule-based classifier, have the advantage of presenting a good summary of the distinctive characteristics of different classes of data. Kernel-based classifiers, such as the neural network and the support vector machine (SVM), typically deliver higher prediction accuracy than logic-based classifiers. However, the user of a kernel-based classifier normally cannot obtain an overall picture of the distribution of the data set. For some applications, such an overall picture provides valuable insight into the distinctive characteristics of different classes of data and is therefore highly desirable. In this article, aiming to close the gap between logic-based and kernel-based classifiers, we propose a novel approach that carries out density estimation with a mixture model composed of a limited number of generalized Gaussian components. One favorable feature of a classifier constructed with the proposed approach is that the user can easily obtain an overall picture of the distributions in the data set by examining the eigenvectors and eigenvalues of the covariance matrices associated with the generalized Gaussian components. Experimental results show that the classifier constructed with the proposed approach delivers superior prediction accuracy in comparison with conventional logic-based classifiers and the EM (expectation maximization) based classifier. Although it cannot match the prediction accuracy delivered by the SVM, the proposed classifier enjoys one major advantage in that it provides the user with an overall picture of the underlying distributions.
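The abstract describes classification by per-class density estimation, with interpretability coming from the eigen-decomposition of the component covariance matrices. The following is a minimal illustrative sketch of that idea, not the paper's actual algorithm: it fits a single ordinary Gaussian density per class (standing in for the paper's mixture of generalized Gaussian components), classifies a point by the highest estimated log-density, and inspects the eigenvectors/eigenvalues of a class covariance matrix. All function names and data here are hypothetical.

```python
import numpy as np

def fit_class_density(X):
    """Estimate the mean vector and covariance matrix of one class's samples."""
    return X.mean(axis=0), np.cov(X, rowvar=False)

def log_gaussian(x, mu, cov):
    """Log-density of a multivariate Gaussian at point x."""
    d = len(mu)
    diff = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (d * np.log(2 * np.pi) + logdet
                   + diff @ np.linalg.inv(cov) @ diff)

def classify(x, models):
    """Assign x to the class whose estimated density is highest at x."""
    return max(models, key=lambda c: log_gaussian(x, *models[c]))

# Synthetic two-class data (illustration only).
rng = np.random.default_rng(0)
Xa = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2))
Xb = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(200, 2))
models = {"A": fit_class_density(Xa), "B": fit_class_density(Xb)}

print(classify(np.array([0.1, -0.2]), models))  # point near class A's mean

# Interpretability: the eigenvectors/eigenvalues of each covariance matrix
# summarize the principal spread directions and variances of that class.
eigvals, eigvecs = np.linalg.eigh(models["A"][1])
print(eigvals)
```

In the paper's setting each class would instead be modeled by a limited number of generalized Gaussian components, but the same eigen-analysis of each component's covariance matrix yields the "overall picture" of the class distributions that the abstract emphasizes.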
Keywords :
covariance matrices; data handling; decision trees; eigenvalues and eigenfunctions; expectation-maximisation algorithm; learning (artificial intelligence); neural nets; pattern classification; support vector machines; covariance matrices; data classification algorithms; data set distribution; decision tree; density estimation algorithm; eigenvalues; eigenvectors; expectation maximization based classifier; generalized Gaussian components; kernel based classifiers; logic based classifiers; machine learning problem; neural network; rule-based classifier; support vector machine; Accuracy; Classification algorithms; Classification tree analysis; Decision trees; Kernel; Logic; Machine learning; Machine learning algorithms; Support vector machine classification; Support vector machines;
Conference_Title :
Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Conference_Location :
Atlanta, GA
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
DOI :
10.1109/IJCNN.2009.5179000