DocumentCode
1648789
Title
Inductive concept learning in the absence of labeled counter-examples
Author
Skabar, Andrew ; Biswas, Kousick ; Pham, Binh ; Maeder, Anthony
Author_Institution
Sch. of Eng., Ballarat Univ., Vic., Australia
fYear
2000
Firstpage
220
Lastpage
226
Abstract
Supervised machine learning techniques generally require that the training set on which learning is based contains sufficient examples representative of the target concept, as well as known counter-examples of the concept. However, in many application domains it is not possible to supply a set of labeled counter-examples. This paper presents a technique that combines supervised and unsupervised learning to discover symbolic concept descriptions from a training set in which only positive instances appear with class labels. Experimental results obtained from applying the technique to several real-world datasets are provided. These results suggest that in some problem domains, learning without labeled counter-examples can lead to classification performance comparable to that of conventional learning algorithms, despite the fact that the latter use additional class information. The technique is able to cope with noise in the training set, and is applicable to a broad range of classification and pattern recognition problems.
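Illustration (not from the paper): the abstract does not detail the authors' algorithm, but as a rough, assumption-laden sketch of the general setting it describes, one way to combine an unsupervised and a supervised step when only positive instances carry labels is to cluster all training instances, pseudo-label clusters by their share of known positives, and then induce a decision tree as a symbolic concept description. All names, parameters, and heuristics below are hypothetical.

# Hypothetical sketch only -- NOT the algorithm from the paper; it merely
# illustrates combining clustering with decision-tree induction when the
# training set contains labels for positive instances only.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.tree import DecisionTreeClassifier

def learn_from_positive_only(X, positive_mask, n_clusters=10):
    """X: (n_samples, n_features) array; positive_mask: boolean array that is
    True for the instances known to be positive. Returns a fitted tree."""
    # Unsupervised step: cluster every instance, labelled or not.
    clusters = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

    # Heuristic pseudo-labelling: a cluster whose share of known positives
    # exceeds the overall share is treated as a positive region; the other
    # clusters serve as provisional counter-examples.
    overall_share = positive_mask.mean()
    pseudo_labels = np.zeros(len(X), dtype=int)
    for c in range(n_clusters):
        members = clusters == c
        if members.any() and positive_mask[members].mean() > overall_share:
            pseudo_labels[members] = 1

    # Supervised step: induce a symbolic (tree-structured) concept description
    # from the pseudo-labelled data.
    tree = DecisionTreeClassifier(max_depth=5, random_state=0)
    tree.fit(X, pseudo_labels)
    return tree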
Keywords
learning by example; pattern recognition; class labels; classification performance; datasets; domain learning; inductive concept learning; labeled counter-examples; supervised machine learning; symbolic concept descriptions; training set; unsupervised learning; Australia; Counting circuits; Databases; Decision trees; Law; Legal factors; Read only memory; Supervised learning; Systems engineering and theory; Training data
fLanguage
English
Publisher
ieee
Conference_Titel
Computer Science Conference, 2000. ACSC 2000. 23rd Australasian
Conference_Location
Canberra, ACT
Print_ISBN
0-7695-0518-X
Type
conf
DOI
10.1109/ACSC.2000.824407
Filename
824407