Author_Institution :
Fac. of Comput. Sci., China Univ. of Geosci., Wuhan, China
Abstract :
Naive Bayes (NB for short) (Langley et al., 1992) has been widely used in machine learning and data mining as a simple and effective classification algorithm. Since its conditional independence assumption is rarely true in practice, researchers have made substantial efforts to improve naive Bayes. The related work can be broadly divided into two approaches, eager learning and lazy learning, depending on when the major computation occurs. Unlike the eager approach, the key idea of the lazy approach is to learn a naive Bayes classifier for each test example. In recent years, several lazy extensions of naive Bayes have been proposed, such as SNNB, LWNB, and LBR, all aiming to improve the classification accuracy of naive Bayes. In many real-world machine learning and data mining applications, however, an accurate ranking is more desirable than an accurate classification. Responding to this fact, we present in this paper a lazy learning algorithm called instance greedily cloning naive Bayes (IGCNB for short). Our motivation is to improve naive Bayes' ranking performance as measured by AUC (Bradley, 1997; Provost and Fawcett, 1997). We tested our algorithm experimentally on all 36 UCI datasets recommended by Weka and compared it to C4.4 (Provost and Domingos, 2003), NB (Langley et al., 1992), SNNB (Xie, 2002), and LWNB (Frank, 2003). The experimental results show that our algorithm significantly outperforms all the compared algorithms in yielding accurate rankings.
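To make the lazy idea concrete, the Python sketch below illustrates, in general terms, how a per-test-instance naive Bayes (in the spirit of LWNB-style locally weighted learning) can be scored by AUC. It is not the authors' IGCNB procedure, whose greedy cloning step is not detailed in the abstract; the synthetic dataset, the neighborhood size k, and the scikit-learn classes are all illustrative assumptions.

# Minimal sketch, assuming scikit-learn: fit a *local* naive Bayes per
# test instance (the lazy step) and evaluate ranking quality with AUC.
# This mirrors the LWNB-style neighborhood idea, not IGCNB's greedy
# cloning, which the abstract does not specify.
from sklearn.datasets import make_classification
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import NearestNeighbors

X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

k = 50  # neighborhood size: an illustrative choice, not from the paper
nn = NearestNeighbors(n_neighbors=k).fit(X_tr)

scores = []
for x in X_te:
    # Lazy learning: defer model construction until the test instance
    # arrives, then train only on its k nearest training examples.
    idx = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    local_nb = GaussianNB().fit(X_tr[idx], y_tr[idx])
    proba = local_nb.predict_proba(x.reshape(1, -1))[0]
    # The local fit may have seen only one class; map the probabilities
    # back to the positive class explicitly.
    scores.append(dict(zip(local_nb.classes_, proba)).get(1, 0.0))

# AUC: the probability that a randomly drawn positive example is
# ranked above a randomly drawn negative one.
print("AUC of lazy (local) NB:", roc_auc_score(y_te, scores))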
Keywords :
Bayes methods; data mining; learning (artificial intelligence); classification algorithms; eager learning; lazy learning; instance greedily cloning naive Bayes; machine learning; machine learning algorithms; Bayesian methods; Cloning; Computer science; Testing;