Title :
A metric entropy bound is not sufficient for learnability
Author :
Kulkarni, Sanjeev R. ; Richardson, Tom ; Zeitouni, O.
Date :
5/1/1994
Abstract :
The authors prove, by means of a counterexample, that a uniform bound on the metric entropy of the class of concepts to be learned is not sufficient for probably approximately correct (PAC) learning under a class of distributions. This settles a conjecture of Benedek and Itai (1991).
Keywords :
entropy; information theory; learning (artificial intelligence); probability; learnability; metric entropy bound; probably approximately correct learning; uniform bound; Algebra; Entropy; Extraterrestrial measurements; Intelligent control; Mathematics; Notice of Violation; Random variables; Sufficient conditions;
Journal_Title :
IEEE Transactions on Information Theory