DocumentCode :
3494995
Title :
PAC learnability versus VC dimension: A footnote to a basic result of statistical learning
Author :
Pestov, Vladimir
Author_Institution :
Dept. de Mat., Univ. Fed. de Santa Catarina, Florianopolis, Brazil
fYear :
2011
fDate :
July 31 2011-Aug. 5 2011
Firstpage :
1141
Lastpage :
1145
Abstract :
A fundamental result of statistical learning theory states that a concept class is PAC learnable if and only if it is a uniform Glivenko-Cantelli class, if and only if the VC dimension of the class is finite. However, the theorem is only valid under special measurability assumptions on the class, in which case PAC learnability is moreover consistent. Otherwise, there is a classical example, constructed under the Continuum Hypothesis by Dudley and Durst and further adapted by Blumer, Ehrenfeucht, Haussler, and Warmuth, of a concept class of VC dimension one which is neither uniform Glivenko-Cantelli nor consistently PAC learnable. We show that, rather surprisingly, under an additional set-theoretic hypothesis which is much milder than the Continuum Hypothesis (Martin's Axiom), PAC learnability is equivalent to finite VC dimension for every concept class.
Keywords :
learning (artificial intelligence); set theory; statistical analysis; PAC learnability; VC dimension; Continuum Hypothesis; Martin's Axiom; set-theoretic hypothesis; statistical learning theory; uniform Glivenko-Cantelli class;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA
ISSN :
2161-4393
Print_ISBN :
978-1-4244-9635-8
Type :
conf
DOI :
10.1109/IJCNN.2011.6033352
Filename :
6033352