Title :
Perspectives from the informational complexity of learning
Author_Institution :
Lucent Technologies, Bell Labs, Murray Hill, NJ, USA
Abstract :
We discuss two seemingly disparate problems of learning from examples within the framework of statistical learning theory. The first involves learning real-valued functions with neural networks; its analysis has two interesting aspects: (1) it shows how the generalization ability of a learner is bounded both by finite data and by limited representational capacity, and (2) it shifts attention away from asymptotics toward learning with finite resources. The perspective this yields is then brought to bear on the second problem, learning natural language grammars, to articulate some issues that computational linguistics needs to address.
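The bound alluded to in aspect (1) can be sketched with the standard decomposition from statistical learning theory; the notation below (target f, hypothesis class H_n realized by a network with n units, l i.i.d. examples, empirical risk minimization) is generic illustration under those assumptions, not the paper's exact statement.

% Hedged sketch: generalization error split into a capacity term and a finite-data term.
% f        : target function
% H_n      : hypothesis class of a network with n units (limited representational capacity)
% f_n      : best approximation to f within H_n
% \hat f_{n,l} : empirical risk minimizer over H_n computed from l training examples (finite data)
\[
  \underbrace{\mathbb{E}\big[(f - \hat f_{n,l})^2\big]}_{\text{generalization error}}
  \;\le\;
  \underbrace{\mathbb{E}\big[(f - f_n)^2\big]}_{\text{approximation error (capacity)}}
  \;+\;
  \underbrace{2 \sup_{g \in H_n}\Big|\,\mathbb{E}\big[(f - g)^2\big]
    - \tfrac{1}{l}\textstyle\sum_{i=1}^{l}\big(f(x_i) - g(x_i)\big)^2\Big|}_{\text{estimation error (finite data)}}
\]
% The inequality follows by adding and subtracting the empirical risk of \hat f_{n,l}
% and using that the empirical risk minimizer does no worse than f_n on the sample.
Increasing n shrinks the first term but inflates the second for fixed l, which is the finite-resource trade-off the abstract refers to.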
Keywords :
generalisation (artificial intelligence); grammars; learning by example; natural languages; neural nets; asymptotics; computational linguistics; finite data; finite resources; generalization ability; informational complexity; learning from examples; limited representational capacity; natural language grammars; neural networks; real-valued function learning; statistical learning theory; Algorithm design and analysis; Computational complexity; Computational linguistics; Computer science; Convergence; Explosions; Multilayer perceptrons; Natural languages; Neural networks; Statistics;
Conference_Title :
Proceedings of the 2000 IEEE International Symposium on Circuits and Systems (ISCAS 2000), Geneva
Conference_Location :
Geneva
Print_ISBN :
0-7803-5482-6
DOI :
10.1109/ISCAS.2000.856046