Abstract:
The learning of temporal structures, e.g. limit cycles, by recurrent neural networks has recently received considerable attention. Unfortunately, some of the algorithms proposed so far are of high storage complexity. Others, being extensions of the Hopfield model, have quite limited storage capacity. Generalizing the work of Doya and Yoshizawa (1989) as well as Urbanczik (1990), the author derives an algorithm for training networks of arbitrary connectivity with hidden units. The algorithm requires O(n) storage in addition to the weight matrix. Numerical simulations, pertaining to the learning of limit cycles and to the modelling of Markov chains, show that quite complex temporal behaviour can be learned by this method.
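To make the task setting concrete, the following is a minimal sketch, not the paper's algorithm: a small fully connected recurrent network trained by teacher forcing with a simple delta rule to reproduce a two-unit limit cycle. Like the method described above, it stores only O(n) vectors (state and error) beyond the n-by-n weight matrix; the network size, learning rate, and target trajectory are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch (not the paper's algorithm): a discrete-time
# recurrent network learns a limit cycle via teacher forcing and a
# delta rule. Besides the n x n weight matrix W, only O(n) vectors
# (the state x and the error err) are kept in memory.

rng = np.random.default_rng(0)
n = 2                                   # two visible units, no hidden units here
W = rng.normal(scale=0.1, size=(n, n))  # recurrent weight matrix
eta = 0.1                               # learning rate (assumed value)

# Target trajectory: points on a circle, amplitude < 1 so tanh can reach them.
T = 200
phases = 2 * np.pi * np.arange(T) / 50
targets = 0.8 * np.column_stack([np.cos(phases), np.sin(phases)])

for epoch in range(500):
    for t in range(T - 1):
        x = targets[t]             # teacher forcing: clamp the state to the target
        y = np.tanh(W @ x)         # one-step prediction of the next state
        err = targets[t + 1] - y   # O(n) error vector
        # Delta rule, with the gradient taken through the tanh nonlinearity.
        W += eta * np.outer(err * (1 - y**2), x)

# Free-running test: iterate the trained map from the first target point.
x = targets[0]
for t in range(5):
    x = np.tanh(W @ x)
    print(x)
```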