DocumentCode
480753
Title
An Effective Evidence Theory Based K-Nearest Neighbor (KNN) Classification
Author
Wang, Lei ; Khan, Latifur ; Thuraisingham, Bhavani
Author_Institution
Dept. of Comput. Sci., Univ. of Texas at Dallas, Dallas, TX
Volume
1
fYear
2008
fDate
9-12 Dec. 2008
Firstpage
797
Lastpage
801
Abstract
In this paper, we study various K-nearest neighbor (KNN) algorithms and present a new KNN algorithm based on evidence theory. We introduce global frequency estimation of prior probability (GE) and local frequency estimation of prior probability (LE). The GE of a class is the prior probability of that class over the whole training data space, estimated by frequency; the LE of a class in a particular neighborhood is the prior probability of that class within that neighborhood space, likewise estimated by frequency. By considering the difference between the GE and the LE of each class, we address the imbalanced data problem to some degree without re-sampling. We compare our algorithm with other KNN algorithms on two benchmark datasets. Results show that our KNN algorithm outperforms the other KNN algorithms, including the basic evidence-based KNN.
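Illustrative sketch (not from the paper): the snippet below shows one plausible reading of the abstract in Python, assuming a distance-discounted Dempster-Shafer mass per neighbor (unnormalized conjunctive combination) and an imbalance adjustment that scales each class's support by its LE-GE gap. The function names, the gamma parameter, and the exact adjustment rule are hypothetical; the paper's own formulation may differ.

```python
import numpy as np
from collections import Counter

def global_estimates(y_train):
    """GE: class prior frequencies over the whole training set."""
    counts = Counter(y_train)
    n = len(y_train)
    return {c: counts[c] / n for c in counts}

def local_estimates(neighbor_labels, classes):
    """LE: class frequencies inside the K-neighborhood."""
    counts = Counter(neighbor_labels)
    k = len(neighbor_labels)
    return {c: counts.get(c, 0) / k for c in classes}

def evidential_knn_predict(X_train, y_train, x, k=5, gamma=1.0):
    """Hypothetical evidential KNN: each neighbor contributes a
    distance-discounted mass toward its class, masses are combined
    with the (unnormalized) conjunctive rule, and the LE-GE gap of
    each class adjusts its support to counter class imbalance."""
    ge = global_estimates(y_train)
    classes = sorted(ge)

    # distances to all training points, then the k nearest
    d = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(d)[:k]
    neighbor_labels = [y_train[i] for i in idx]
    le = local_estimates(neighbor_labels, classes)

    # start with vacuous belief: all mass on the ignorance set (frame)
    support = {c: 0.0 for c in classes}
    ignorance = 1.0
    for i in idx:
        c = y_train[i]
        # distance-discounted mass assigned to the neighbor's class
        m = np.exp(-gamma * d[i] ** 2)
        # combine with a simple mass function (m on {c}, 1-m on the frame);
        # conflict mass is dropped since only the argmax is needed
        new_support = {cl: support[cl] * (1 - m) for cl in classes}
        new_support[c] += support[c] * m + ignorance * m
        ignorance *= (1 - m)
        support = new_support

    # imbalance adjustment: boost classes that are locally more frequent
    # than their global prior (LE > GE), attenuate the opposite case
    adjusted = {c: support[c] * (1.0 + (le[c] - ge[c])) for c in classes}
    return max(adjusted, key=adjusted.get)
```

Usage would follow the ordinary KNN pattern, e.g. evidential_knn_predict(X_train, y_train, x_query, k=5) for each query point; the GE/LE adjustment is what distinguishes this sketch from a plain evidence-based KNN vote.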
Keywords
frequency estimation; learning (artificial intelligence); pattern classification; probability; data space training; evidence theory; global prior probability frequency estimation; k-nearest neighbor classification; local prior probability frequency estimation; Benchmark testing; Classification algorithms; Computer science; Frequency estimation; Intelligent agent; Nearest neighbor searches; Partitioning algorithms; Support vector machine classification; Support vector machines; Training data; Classification; Evidence Theory; KNN;
fLanguage
English
Publisher
ieee
Conference_Titel
2008 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology (WI-IAT '08)
Conference_Location
Sydney, NSW
Print_ISBN
978-0-7695-3496-1
Type
conf
DOI
10.1109/WIIAT.2008.411
Filename
4740552