Title :
A Note on Sample Complexity of Learning Binary Output Neural Networks under Fixed Input Distributions
Author :
Pestov, Vladimir
Author_Institution :
Dept. of Math. & Stat., Univ. of Ottawa, Ottawa, ON, Canada
Abstract :
We show that the learning sample complexity of a sigmoidal neural network constructed by Sontag (1992), required to achieve a given misclassification error under a fixed purely atomic input distribution, can grow arbitrarily fast: for any prescribed rate of growth there is an input distribution having this rate as its sample complexity, and the bound is asymptotically tight. The rate can be superexponential, a non-recursive function, etc. We further observe that Sontag's ANN is not Glivenko-Cantelli under any input distribution having a non-atomic part.
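The phenomena in the abstract hinge on a concept class of infinite VC dimension parametrized by very few real parameters. A standard textbook illustration of this effect (not Sontag's actual network, which is a specific sigmoidal architecture) is the one-parameter family of classifiers x ↦ [sin(wx) > 0], which shatters arbitrarily large finite sets of dyadic points. The sketch below verifies numerically, by sweeping the single weight w, that all 2³ labelings of three such points are realized; the point set and grid are illustrative choices:

```python
import math

# Illustrative one-parameter family h_w(x) = [sin(w*x) > 0].
# The points 2^-1, 2^-2, 2^-3 are a standard choice shattered by this family;
# as w varies, the phases w*x cycle at different rates, producing every
# sign pattern over the three points.
points = [2.0 ** -i for i in range(1, 4)]  # 0.5, 0.25, 0.125

patterns = set()
for k in range(1, 1201):          # sweep w over (0, 60] in steps of 0.05
    w = 0.05 * k
    patterns.add(tuple(math.sin(w * x) > 0 for x in points))

print(len(patterns))  # all 2**3 = 8 labelings are realized
```

The same sweep with n points 2^-1, …, 2^-n realizes all 2^n labelings, so the family has infinite VC dimension despite its single parameter; it is this kind of class for which distribution-free uniform convergence (the Glivenko-Cantelli property) fails, as the paper observes for Sontag's network under distributions with a non-atomic part.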
Keywords :
learning (artificial intelligence); neural nets; atomic distribution; fixed input distributions; learning; misclassification error; sample complexity; sigmoidal neural network; Accuracy; Artificial neural networks; Atomic measurements; Complexity theory; Extraterrestrial measurements; Loss measurement; Probability distribution; PAC learnability; Sontag's ANN; fixed distribution learning; infinite VC dimension; precompactness; sample complexity; witness of irregularity;
Conference_Titel :
2010 Eleventh Brazilian Symposium on Neural Networks (SBRN)
Conference_Location :
São Paulo
Print_ISBN :
978-1-4244-8391-4
Electronic_ISSN :
1522-4899
DOI :
10.1109/SBRN.2010.10