DocumentCode :
1818065
Title :
An estimation of sample complexity of the neural network model in the extended PAC learning framework
Author :
Miyake, Shigeki ; Kanaya, Fumio
Author_Institution :
NTT Transmission Syst. Lab., Kanagawa, Japan
Volume :
1
fYear :
1992
fDate :
7-11 Jun 1992
Firstpage :
607
Abstract :
L.G. Valiant's (1984) PAC (probably approximately correct) learning framework is extended to apply to the statistical decision theoretic problem setting. In this general setting, successful learning means finding a hypothesis or a decision rule whose risk is as close as possible to the Bayes optimal risk. It is then shown that the generalization ability of a learning model in the extended PAC framework can be consistently considered from the Bayes risk consistency viewpoint. Finally, as an application of the extended PAC framework, an upper bound on the sample complexity of a neural network model is obtained. The upper bound matches, in its order in the PAC learning parameters ε and δ, the bound obtained by D. Haussler (1989). However, the lower bound on sample complexity and the tightness of these upper bounds remain open problems.
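For readers unfamiliar with how sample complexity depends on the PAC parameters ε and δ, the following sketch evaluates the classic finite-hypothesis-class bound from the standard PAC setting. This is an illustration of the scaling in ε and δ only, not the bound derived in this paper, which concerns a neural network model in the extended (Bayes risk) framework.

```python
import math

def pac_sample_bound(h_size: int, epsilon: float, delta: float) -> int:
    """Classic finite-hypothesis-class PAC bound, shown only to
    illustrate how sample complexity scales in epsilon and delta.
    NOT the bound of this paper (whose hypothesis class is a neural
    network model in the extended framework).

    A consistent learner over a finite class H is (epsilon, delta)-PAC
    once the sample size m satisfies
        m >= (1/epsilon) * (ln|H| + ln(1/delta)).
    """
    return math.ceil((math.log(h_size) + math.log(1.0 / delta)) / epsilon)

# Halving epsilon roughly doubles the required sample size (order 1/epsilon),
# while delta enters only logarithmically:
m1 = pac_sample_bound(h_size=1000, epsilon=0.10, delta=0.05)  # 100 samples
m2 = pac_sample_bound(h_size=1000, epsilon=0.05, delta=0.05)  # 199 samples
```

The logarithmic dependence on 1/δ is why high-confidence guarantees are cheap relative to high-accuracy ones.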
Keywords :
Bayes methods; computational complexity; decision theory; learning (artificial intelligence); neural nets; Bayes optimal risk; PAC learning framework; neural network; sample complexity; statistical decision theoretic problem; Artificial neural networks; Estimation theory; Function approximation; Intelligent networks; Laboratories; Neural networks; Noise generators; Probability; Upper bound;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location :
Baltimore, MD
Print_ISBN :
0-7803-0559-0
Type :
conf
DOI :
10.1109/IJCNN.1992.287145
Filename :
287145