DocumentCode
1809486
Title
A new framework for modeling learning dynamics
Author
Tong, Y.W.; Wong, K.Y. Michael; Li, S.
Author_Institution
Dept. of Phys., Hong Kong Univ. of Sci. & Technol., Kowloon, Hong Kong
Volume
2
fYear
1999
fDate
36342
Firstpage
1164
Abstract
An important issue in neural computing concerns the description of learning dynamics with macroscopic dynamical variables. Recent progress on online learning only addresses the often unrealistic case of an infinite training set. We introduce a new framework to model batch learning of restricted sets of examples, widely applicable to any learning cost function and fully taking into account the temporal correlations introduced by the recycling of the examples. Here we illustrate the technique using the Adaline rule learning random teacher-generated examples.
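As a point of reference for the setting the abstract describes, the following is a minimal sketch (not the authors' code or their macroscopic framework) of batch Adaline learning on a fixed, restricted set of teacher-generated random examples, where the same p = alpha*N examples are recycled at every gradient step. The input dimension N, load alpha, learning rate eta, step count, and the linear teacher output are all assumptions chosen for illustration; the order parameters R and Q are the kind of macroscopic variables such frameworks track.

```python
# Hedged sketch, assuming a linear teacher and a quadratic (Adaline/LMS) cost:
# batch gradient descent on a fixed set of p = alpha*N random examples.
# Recycling the same examples at every step is what introduces the temporal
# correlations mentioned in the abstract.
import numpy as np

rng = np.random.default_rng(0)

N = 200          # input dimension (assumed for illustration)
alpha = 2.0      # examples-to-weights ratio, p = alpha * N (assumed)
eta = 0.05       # learning rate (assumed)
steps = 200      # number of batch gradient steps (assumed)

p = int(alpha * N)
B = rng.standard_normal(N)
B *= np.sqrt(N) / np.linalg.norm(B)   # teacher weights, normalized so |B|^2 = N

X = rng.standard_normal((p, N))       # fixed random inputs, recycled every step
y = X @ B / np.sqrt(N)                # teacher-generated outputs (linear teacher assumed)

J = np.zeros(N)                       # student weights start from zero
for t in range(steps):
    err = y - X @ J / np.sqrt(N)      # residuals on the SAME restricted example set
    J += eta * X.T @ err / np.sqrt(N) # batch Adaline (LMS) gradient step

    R = J @ B / N                     # macroscopic overlap with the teacher
    Q = J @ J / N                     # macroscopic student norm
    if t % 50 == 0 or t == steps - 1:
        print(f"step {t:4d}  R = {R:.3f}  Q = {Q:.3f}")
```

Because the training set is finite and reused, successive weight updates are correlated through the same examples, which is the regime this paper's framework is built to describe; an infinite-training-set (online) analysis would instead draw a fresh example at every step.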
Keywords
gradient methods; learning (artificial intelligence); neural nets; real-time systems; Adaline rule; batch learning; cost function; learning dynamics; neural networks; online learning; temporal correlations; Algorithm design and analysis; Cost function; Hebbian theory; Iterative algorithms; Joining processes; Microscopy; Physics; Recycling;
fLanguage
English
Publisher
ieee
Conference_Titel
1999 International Joint Conference on Neural Networks (IJCNN '99)
Conference_Location
Washington, DC
ISSN
1098-7576
Print_ISBN
0-7803-5529-6
Type
conf
DOI
10.1109/IJCNN.1999.831123
Filename
831123