DocumentCode :
275923
Title :
Learning temporal structures by continuous backpropagation
Author :
Urbanczik, R.
Author_Institution :
Basel Univ., Switzerland
fYear :
1991
fDate :
18-20 Nov 1991
Firstpage :
124
Lastpage :
128
Abstract :
The learning of temporal structures, e.g. limit cycles, by recurrent neural networks has recently received considerable attention. Unfortunately, some of the algorithms proposed so far have high storage complexity. Others, being extensions of the Hopfield model, have quite limited storage capacity. Generalizing the work of Doya and Yoshizawa (1989) as well as Urbanczik (1990), the author derives an algorithm for training networks of arbitrary connectivity with hidden units. The algorithm requires O(n) storage in addition to the weight matrix. Numerical simulations, pertaining to the learning of limit cycles and to the modelling of Markov chains, show that quite complex temporal behaviour can be trained by this method.
Keywords :
computational complexity; learning systems; neural nets; continuous backpropagation; recurrent neural networks; storage complexity; temporal structures; training network; weight matrix;
fLanguage :
English
Publisher :
iet
Conference_Titel :
Second International Conference on Artificial Neural Networks, 1991
Conference_Location :
Bournemouth
Print_ISBN :
0-85296-531-1
Type :
conf
Filename :
140300