Title :
An Inequality for Nearly Log-Concave Distributions With Applications to Learning
Author :
Caramanis, Constantine ; Mannor, Shie
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Texas, Austin, TX
Date :
March 2007
Abstract :
We prove that, given a nearly log-concave distribution, in any partition of the space into two well-separated sets, the measure of the points that do not belong to either set is large. We apply this isoperimetric inequality to derive lower bounds on the generalization error in learning. We further consider regression problems and show that if the inputs and outputs are sampled from a nearly log-concave distribution, the measure of points for which the prediction is wrong by more than ε0 and less than ε1 is (roughly) linear in ε1 - ε0, as long as ε0 is not too small and ε1 not too large. We also show that when the data are sampled from a nearly log-concave distribution, the margin cannot be large in a strong probabilistic sense.
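A minimal sketch of the type of isoperimetric inequality described above, written in the generic Lovász-Simonovits form for a partition Ω = K1 ∪ K2 ∪ K3 with K1 and K2 well separated; the constant c and the diameter normalization are illustrative assumptions, not the paper's precise statement for nearly log-concave measures:
% Illustrative Lovász-Simonovits-type bound; c and the diam(Ω)
% normalization are assumptions, not the paper's exact theorem.
\[
  \mu(K_3) \;\ge\; c \,\frac{d(K_1, K_2)}{\operatorname{diam}(\Omega)}\,
  \min\{\mu(K_1),\, \mu(K_2)\},
\]
where d(K_1, K_2) is the distance between the two well-separated sets and \mu is the nearly log-concave measure; the abstract's claim is that μ(K3), the measure of points in neither set, cannot be small.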
Keywords :
learning (artificial intelligence); probability; regression analysis; sampling methods; set theory; data sampling; isoperimetric inequality; learning; log-concave distribution; probability; regression problem; Books; Computer science; Councils; Density measurement; Machine learning; Machine learning algorithms; Pattern recognition; Sampling methods; Statistical learning; Testing; Classification; generalization error; margin; statistical learning theory;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2006.890699