DocumentCode :
1748957
Title :
Dynamical assessment of symbolic processes with backprop nets
Author :
Tabor, Whitney
Author_Institution :
Dept. of Psychol., Connecticut Univ., Storrs, CT, USA
Volume :
4
fYear :
2001
fDate :
2001
Firstpage :
2838
Abstract :
Simple recurrent networks were trained to predict the outputs of various probabilistic symbolic processes. Two of the symbolic processes made critical use of a push-down stack, and two were finite-state Markov processes. The memory-intensive (stack) processes, in contrast to the Markov processes, pushed the largest Lyapunov exponents toward zero, although the exponents never reached zero. The growth in the Lyapunov exponents was conditioned by the memory-intensiveness of the tasks, not by the growth rate of the states. The results indicate a link between the traditional use of stack memories to create complex computation and dynamical treatments of complexity based on trajectory divergence.
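The abstract's central measurement, the largest Lyapunov exponent of a trained simple recurrent network, is conventionally estimated by pushing a tangent vector through the Jacobian of the state map at each time step and averaging the log of its growth. The sketch below illustrates this standard Jacobian-product method for a tanh SRN; the weight values and random input stream are placeholders, not the networks or symbolic processes from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical SRN state map: h_{t+1} = tanh(W h_t + U x_t + b).
# Weights are random stand-ins, not trained parameters from the paper.
n_hidden, n_input = 16, 4
W = rng.normal(0.0, 0.3, (n_hidden, n_hidden))
U = rng.normal(0.0, 0.3, (n_hidden, n_input))
b = np.zeros(n_hidden)

def step(h, x):
    """Advance the hidden state one time step."""
    return np.tanh(W @ h + U @ x + b)

def jacobian(h, x):
    """Jacobian of h_{t+1} with respect to h_t: diag(1 - tanh^2) @ W."""
    pre = W @ h + U @ x + b
    return (1.0 - np.tanh(pre) ** 2)[:, None] * W

def largest_lyapunov(T=2000):
    """Estimate the largest Lyapunov exponent along one trajectory."""
    h = np.zeros(n_hidden)
    v = rng.normal(size=n_hidden)
    v /= np.linalg.norm(v)
    log_growth = 0.0
    for _ in range(T):
        x = rng.normal(size=n_input)   # surrogate input stream
        v = jacobian(h, x) @ v         # push tangent vector forward
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm                      # renormalize to avoid over/underflow
        h = step(h, x)
    return log_growth / T              # time-averaged log expansion rate
```

On this reading, the paper's finding is that training on stack-based processes drives this averaged expansion rate toward zero from below, whereas training on Markov processes does not.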
Keywords :
Lyapunov methods; Markov processes; backpropagation; content-addressable storage; recurrent neural nets; symbol manipulation; Lyapunov exponents; Markov processes; backpropagation; memory-intensive processes; probabilistic symbolic process; recurrent neural networks; stack-memory; symbolic processes; Artificial neural networks; Backpropagation; Computer networks; Eigenvalues and eigenfunctions; Fractals; Jacobian matrices; Markov processes; Neural networks; Nonlinear systems; Psychology;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938826
Filename :
938826