Title :
Lower bounds for empirical and leave-one-out estimates of the generalization error

Author :
Gavin, G. ; Teytaud, O.

Author_Institution :
ERIC, Lyon Univ., Mendes, France

Abstract :
Re-sampling estimates are usually considered more efficient than the empirical error for estimating generalization performance. In this paper, we consider the leave-one-out estimate and show that, in this framework, it is no better than the empirical error; moreover, the training-error estimate is sometimes more efficient. The paper summarizes the framework of machine learning, defines the sample complexity, and recalls some standard results.
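For reference, the two estimators compared in the abstract are sketched below in standard notation; the symbols f_S, S^{\setminus i}, and m(epsilon, delta) are conventional assumptions, not taken from the record itself.

% Illustrative standard definitions (assumed notation, not from the record).
% S = {(x_1, y_1), ..., (x_n, y_n)} is the training sample, f_S the
% hypothesis learned from S, and S^{\setminus i} the sample with the
% i-th example removed.
\[
  \hat{e}_{\mathrm{emp}}(f_S)
    = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\bigl[ f_S(x_i) \neq y_i \bigr],
  \qquad
  \hat{e}_{\mathrm{loo}}
    = \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\bigl[ f_{S^{\setminus i}}(x_i) \neq y_i \bigr].
\]
% In this setting, the sample complexity m(\epsilon, \delta) is the
% smallest sample size n such that, with probability at least 1 - \delta,
% the estimate deviates from the true generalization error by at most
% \epsilon.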
Keywords :
computational complexity; generalisation (artificial intelligence); learning (artificial intelligence); learning systems; probability; generalization; learning error; leave-one-out estimates; lower bounds; machine learning; sample complexity; Frequency; Probability distribution; Statistical learning; Sufficient conditions; Upper bound

Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001

Conference_Location :
Washington, DC, USA

Print_ISBN :
0-7803-7044-9

DOI :
10.1109/IJCNN.2001.939538