DocumentCode
288502
Title
Training recurrent neural networks with temporal input encodings
Author
Omlin, C.W.; Giles, C.L.; Horne, B.G.; Leerink, L.R.; Lin, T.
Author_Institution
NEC Res. Inst., Princeton, NJ, USA
Volume
2
fYear
1994
fDate
27 Jun-2 Jul 1994
Firstpage
1267
Abstract
Investigates the learning of deterministic finite-state automata (DFAs) by recurrent networks with a single input neuron, where each input symbol is represented as a temporal pattern and strings as sequences of temporal patterns. The authors empirically demonstrate that obvious temporal encodings can make learning very difficult or even impossible. Based on preliminary results, they formulate some hypotheses about 'good' temporal encodings, i.e., encodings which do not significantly increase training time compared to training networks with multiple input neurons.
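To make the contrast between the two input representations concrete, the following is a minimal sketch, not taken from the paper: the binary alphabet, the two-time-step pattern length, and the specific temporal patterns are illustrative assumptions. It shows how a string can be presented either to multiple input neurons (one per symbol, one time step per symbol) or to a single input neuron as a concatenation of temporal patterns.

# Illustrative comparison of the two input-encoding schemes described in the
# abstract. The particular patterns below are hypothetical choices, not the
# encodings studied by the authors.

def one_hot_encoding(string, alphabet=("a", "b")):
    """Multiple input neurons: one neuron per symbol, one time step per symbol."""
    index = {sym: i for i, sym in enumerate(alphabet)}
    seq = []
    for sym in string:
        vec = [0.0] * len(alphabet)
        vec[index[sym]] = 1.0
        seq.append(vec)
    return seq  # len(string) time steps, each a vector of length |alphabet|

def temporal_encoding(string, patterns=None):
    """Single input neuron: each symbol becomes a short temporal pattern,
    and a string becomes the concatenation of those patterns."""
    if patterns is None:
        # Assumed encoding: two time steps per symbol.
        patterns = {"a": [1.0, 0.0], "b": [0.0, 1.0]}
    seq = []
    for sym in string:
        seq.extend(patterns[sym])
    return seq  # len(string) * pattern_length time steps, one scalar each

if __name__ == "__main__":
    s = "abba"
    print(one_hot_encoding(s))   # 4 time steps, 2 input values per step
    print(temporal_encoding(s))  # 8 time steps, 1 input value per step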
Keywords
deterministic automata; finite automata; learning (artificial intelligence); recurrent neural nets; deterministic finite-state automata; learning; recurrent neural networks; sequences; strings; temporal input encodings; temporal pattern; Encoding; Learning automata; Neurons; Recurrent neural networks
fLanguage
English
Publisher
ieee
Conference_Title
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location
Orlando, FL
Print_ISBN
0-7803-1901-X
Type
conf
DOI
10.1109/ICNN.1994.374366
Filename
374366