DocumentCode :
2302492
Title :
On learning multiple descriptions of a concept
Author :
Ali, Kamal ; Brunk, Clifford ; Pazzani, Michael
Author_Institution :
Dept. of Inf. & Comput. Sci., California Univ., Irvine, CA, USA
fYear :
1994
fDate :
6-9 Nov 1994
Firstpage :
476
Lastpage :
483
Abstract :
In sparse data environments, greater classification accuracy can be achieved by learning several concept descriptions of the data and combining their classifications. Stochastic searching can be used to generate many concept descriptions (rule sets) for each class in the data. We use a tractable approximation to the optimal Bayesian method for combining classifications from such descriptions. The primary result of this paper is that multiple concept descriptions are particularly helpful in “flat” hypothesis spaces in which there are many equally good ways to grow a rule, each having similar gain. Another result is experimental evidence that learning multiple rule sets yields more accurate classifications than learning multiple rules for some domains.
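The abstract does not spell out the approximation used (in the HYDRA system noted in the keywords), so the following is only a minimal sketch of the general idea of combining classifications from several learned rule sets: each rule set votes for its predicted class with a weight intended to approximate its posterior probability, here crudely estimated from Laplace-corrected training accuracy. All names, weights, and stub classifiers are illustrative assumptions, not the paper's implementation.
from collections import defaultdict

def combine_predictions(rule_sets, example):
    """Combine class predictions from several rule sets by weighted vote.

    Each element of rule_sets is a (predict_fn, correct, total) triple:
      predict_fn(example) -> predicted class label
      correct, total      -> training-set performance used to weight the vote
    """
    votes = defaultdict(float)
    for predict_fn, correct, total in rule_sets:
        # Laplace-corrected accuracy as a rough stand-in for the posterior weight
        weight = (correct + 1.0) / (total + 2.0)
        votes[predict_fn(example)] += weight
    return max(votes, key=votes.get)

# Illustrative usage with stub classifiers standing in for
# stochastically generated rule sets:
rule_sets = [
    (lambda x: "pos" if x["f1"] > 0 else "neg", 18, 20),
    (lambda x: "pos" if x["f2"] > 3 else "neg", 15, 20),
    (lambda x: "neg", 12, 20),
]
print(combine_predictions(rule_sets, {"f1": 1, "f2": 0}))  # -> "pos"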
Keywords :
Bayes methods; classification; data description; learning (artificial intelligence); search problems; HYDRA; classification accuracy; flat hypothesis spaces; gain; multiple concept descriptions learning; multiple rule sets; optimal Bayesian method; rule growth; rule sets; sparse data environments; stochastic searching; tractable approximation; Bayesian methods; Computer science; Decision trees; Finite element methods; Partitioning algorithms; Stochastic processes; Voting;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the Sixth International Conference on Tools with Artificial Intelligence, 1994
Conference_Location :
New Orleans, LA
Print_ISBN :
0-8186-6785-0
Type :
conf
DOI :
10.1109/TAI.1994.346454
Filename :
346454