• DocumentCode
    285222
  • Title
    Is the distribution-free sample bound for generalization tight?
  • Author
    Ji, Chuanyi

  • Author_Institution
    Rensselaer Polytechnic Institute, Troy, NY, USA
  • Volume
    3
  • fYear
    1992
  • fDate
    7-11 Jun 1992
  • Firstpage
    954
  • Abstract
    A general relationship is developed between two sharp transition points: the statistical capacity, which represents memorization, and the universal sample bound for generalization, for a network with random samples drawn from a specific class of distributions. This relationship indicates that generalization occurs after memorization. It is shown through one example that the sample complexity needed for generalization can coincide with the capacity point. In the worst case, the sample complexity for generalization can be on the order of the distribution-free bound, whereas for a more structured case it can be smaller than the worst-case bound. The analysis sheds light on why, in practice, the number of samples needed for generalization can be smaller than the bound given in terms of the VC-dimension.
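    For reference, the distribution-free bound mentioned above, in its standard PAC form (constant factors omitted; this is the general result for a hypothesis class of VC-dimension d, not a formula quoted from the paper), is

    \[
      m(\epsilon, \delta) \;=\; O\!\left(\frac{1}{\epsilon}\left(d \log\frac{1}{\epsilon} + \log\frac{1}{\delta}\right)\right),
    \]

    i.e., on the order of d/\epsilon samples (up to logarithmic factors) suffice for any distribution; this worst-case scale is what the abstract's sample complexities are compared against.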
  • Keywords
    computational complexity; generalisation (artificial intelligence); neural nets; VC-dimension; complexity; distribution-free sample bound; generalization; memorization; neural networks; random samples; statistical capacity; universal sample bound; worst case bound; Computer errors; Computer networks; Distributed computing; Error analysis; Neurons; Probability; Random variables; Systems engineering and theory;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    International Joint Conference on Neural Networks (IJCNN), 1992
  • Conference_Location
    Baltimore, MD
  • Print_ISBN
    0-7803-0559-0
  • Type
    conf
  • DOI
    10.1109/IJCNN.1992.227076
  • Filename
    227076