DocumentCode :
1205677
Title :
A metric entropy bound is not sufficient for learnability
Author :
Kulkarni, Sanjeev R.; Richardson, Tom; Zeitouni, Ofer
Volume :
40
Issue :
3
fYear :
1994
fDate :
1 May 1994
Firstpage :
883
Lastpage :
885
Abstract :
The authors prove, by means of a counterexample, that a uniform bound on the metric entropy of the class of concepts to be learned is not sufficient for probably approximately correct (PAC) learning under a class of distributions. This settles a conjecture of Benedek and Itai (1991).
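For context, the notions the abstract refers to can be stated as follows. This is a sketch using standard definitions and notation (the symbols C, P, and 𝒫 are assumptions for illustration, not taken from the record itself):

```latex
% Pseudometric induced by a distribution P on concepts c_1, c_2:
%   d_P(c_1, c_2) = P(c_1 \triangle c_2)
%
% \epsilon-covering number: the smallest number of d_P-balls of radius
% \epsilon needed to cover the concept class C, written N(\epsilon, C, d_P);
% the metric entropy is \log N(\epsilon, C, d_P).
%
% Benedek--Itai (1991), fixed-distribution case: C is PAC learnable
% under a single known P if and only if
%   N(\epsilon, C, d_P) < \infty \quad \text{for every } \epsilon > 0.
%
% The conjectured extension, refuted by this paper's counterexample:
% a uniform metric entropy bound over a class of distributions \mathcal{P},
%   \sup_{P \in \mathcal{P}} N(\epsilon, C, d_P) < \infty
%   \quad \text{for every } \epsilon > 0,
% would suffice for PAC learnability under \mathcal{P}.
```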
Keywords :
entropy; information theory; learning (artificial intelligence); probability; learnability; metric entropy bound; probably approximately correct learning; uniform bound; Algebra; Entropy; Intelligent control; Mathematics; Random variables; Sufficient conditions
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.335898
Filename :
335898