Title : 
On the Stability of Empirical Risk Minimization in the Presence of Multiple Risk Minimizers
            Author : 
Rubinstein, Benjamin I. P.; Simma, Aleksandr
            Author_Institution : 
Microsoft Res. Silicon Valley, Mountain View, CA, USA
            Date : 
1 July 2012
            Abstract : 
Recently, Kutin and Niyogi investigated several notions of algorithmic stability (a property of a learning map conceptually similar to continuity), showing that training stability is sufficient for consistency of empirical risk minimization (ERM), while distribution-free CV-stability is necessary and sufficient for having finite VC-dimension. This paper concerns a phase transition in the training stability of ERM, conjectured by the same authors. Kutin and Niyogi proved that ERM on finite hypothesis spaces containing a unique risk minimizer has training stability that scales exponentially with sample size, and conjectured that the existence of multiple risk minimizers prevents even super-quadratic convergence. We prove this result for the strictly weaker notion of CV-stability, positively resolving the conjecture.
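As a point of reference for the abstract's terminology, the following LaTeX sketch records the standard ERM rule and an informal version of CV-stability. The symbols $\mathcal{H}$, $\ell$, $\beta$, and $\delta$ are generic placeholders introduced here for illustration; the precise definitions, quantifiers, and constants are those of Kutin and Niyogi and of the paper itself.

% Empirical risk minimization over a hypothesis class H on a sample S = (z_1, ..., z_n):
\[
  \hat{h}_S \in \operatorname*{arg\,min}_{h \in \mathcal{H}} \widehat{R}_S(h),
  \qquad
  \widehat{R}_S(h) \;=\; \frac{1}{n} \sum_{i=1}^{n} \ell(h, z_i).
\]
% Informally, an algorithm is (beta, delta)-CV-stable if replacing a single training
% example changes the loss incurred on the replacement point by at most beta,
% with probability at least 1 - delta over the sample and the replacement:
\[
  \Pr_{S,\, z_i'}\!\left[\, \bigl|\ell(\hat{h}_{S^{i}}, z_i') - \ell(\hat{h}_{S}, z_i')\bigr| \le \beta \,\right] \;\ge\; 1 - \delta,
\]
% where S^i denotes S with its i-th example replaced by z_i'.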
            Keywords : 
learning (artificial intelligence); stability; ERM; algorithmic stability; distribution-free CV-stability; empirical risk minimization stability; finite VC-dimension; finite hypothesis spaces; learning map; multiple risk minimizers; phase transition; superquadratic convergence; training stability; Educational institutions; Learning systems; Risk management; Silicon; Stability criteria; Training; Algorithmic stability; empirical risk minimization (ERM); threshold phenomena
            Journal_Title : 
IEEE Transactions on Information Theory
            DOI : 
10.1109/TIT.2012.2191681