Title :
High frequency rough set model based on database systems
Author :
Vaithyanathan, Kartik ; Lin, T.Y.
Author_Institution :
Dept. of Comput. Sci., San Jose State Univ., San Jose, CA
Abstract :
Rough set theory was proposed by Pawlak in the 1980s and has been applied successfully in many domains. One of the key concepts of the rough set model is the computation of the core and reducts. It has been shown that finding the minimal reduct is an NP-hard problem, and this computational complexity has implicitly restricted effective applications to small, clean data sets. To improve the efficiency of computing core attributes and reducts, many novel approaches have been developed, some of which attempt to integrate database technologies. This paper proposes a novel approach to computing reducts, called high frequency value reducts, using database system concepts. The method generates value reducts directly and also prunes the decision table by placing a lower bound on the frequency of equivalence values in the decision table.
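The frequency-based pruning described in the abstract can be illustrated with standard database operations. The sketch below is not the paper's actual algorithm; it only shows the underlying idea, assumed here to resemble a SQL GROUP BY with a HAVING threshold: equivalence classes on the condition attributes whose frequency falls below a lower bound are discarded. The table name, column names, and helper function are invented for this example.

```python
import sqlite3

def high_frequency_classes(rows, min_count):
    """Return (a, b, count) for each equivalence class on the condition
    attributes (a, b) that occurs at least min_count times.

    Illustrative only: mimics frequency-based pruning of a decision table
    with a SQL GROUP BY / HAVING, in the spirit of computing
    high-frequency value reducts via database operations."""
    con = sqlite3.connect(":memory:")
    # Hypothetical decision table: condition attributes a, b; decision d.
    con.execute("CREATE TABLE decision_table (a TEXT, b TEXT, d TEXT)")
    con.executemany("INSERT INTO decision_table VALUES (?, ?, ?)", rows)
    cur = con.execute(
        "SELECT a, b, COUNT(*) FROM decision_table "
        "GROUP BY a, b HAVING COUNT(*) >= ? ORDER BY a, b",
        (min_count,),
    )
    result = cur.fetchall()
    con.close()
    return result

rows = [
    ("x", "p", "yes"), ("x", "p", "yes"), ("x", "p", "no"),
    ("y", "q", "yes"), ("y", "q", "no"),
    ("z", "r", "yes"),   # frequency 1 -> pruned when min_count = 2
]
print(high_frequency_classes(rows, 2))
# -> [('x', 'p', 3), ('y', 'q', 2)]
```

Delegating the counting to the database engine is what makes this kind of pruning attractive for large decision tables: the GROUP BY runs inside the DBMS rather than in application memory.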
Keywords :
computational complexity; database management systems; decision tables; mathematics computing; rough set theory; NP-hard problem; database systems; decision table; equivalence value; high frequency rough set model; computational modeling; computer science; data mining; frequency; relational databases; rough sets; sorting
Conference_Titel :
NAFIPS 2008 - Annual Meeting of the North American Fuzzy Information Processing Society
Conference_Location :
New York City, NY
Print_ISBN :
978-1-4244-2351-4
Electronic_ISBN :
978-1-4244-2352-1
DOI :
10.1109/NAFIPS.2008.4531351