Title of article :
Learning exponential state-growth languages by hill climbing
Author/Authors :
W. Tabor
Issue Information :
Journal with serial number, year 2003
Pages :
444-
From page :
444
Abstract :
Training recurrent neural networks on infinite-state languages has been successful for languages in which the minimal number of machine states grows linearly with sentence length, but has fared poorly for exponential state-growth languages. The new architecture presented here learns several exponential state-growth languages to near-perfect accuracy by hill climbing.
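
To make the hill-climbing idea concrete, the following is a minimal illustrative sketch, not the architecture described in the paper: it perturbs the weights of a small recurrent network at random and keeps a perturbation only if next-symbol prediction error drops. The mirror language {w w^R}, whose minimal automaton state count grows exponentially with string length, stands in for the paper's exponential state-growth languages; the network sizes, step sizes, and function names are all assumptions made for illustration.

import numpy as np

rng = np.random.default_rng(0)

def sample_mirror_string(max_half=4):
    """Draw w uniformly over {a,b}* and return w followed by its reversal."""
    half = rng.integers(1, max_half + 1)
    w = rng.integers(0, 2, size=half)          # 0 = 'a', 1 = 'b'
    return np.concatenate([w, w[::-1]])

def rnn_predict(Wxh, Whh, Why, seq):
    """Run a tiny tanh RNN, returning next-symbol probabilities per step."""
    h = np.zeros(Whh.shape[0])
    onehot = np.eye(2)
    probs = []
    for sym in seq[:-1]:
        h = np.tanh(Wxh @ onehot[sym] + Whh @ h)
        logits = Why @ h
        e = np.exp(logits - logits.max())      # stable softmax
        probs.append(e / e.sum())
    return probs

def loss(params, strings):
    """Mean cross-entropy of next-symbol prediction over a batch."""
    Wxh, Whh, Why = params
    total, count = 0.0, 0
    for seq in strings:
        for p, target in zip(rnn_predict(Wxh, Whh, Why, seq), seq[1:]):
            total -= np.log(p[target] + 1e-12)
            count += 1
    return total / count

# Hill climbing: propose a random weight perturbation, keep it only if it helps.
n_hidden = 8
params = [rng.normal(0, 0.5, (n_hidden, 2)),   # input-to-hidden weights
          rng.normal(0, 0.5, (n_hidden, n_hidden)),  # recurrent weights
          rng.normal(0, 0.5, (2, n_hidden))]   # hidden-to-output weights
train = [sample_mirror_string() for _ in range(50)]
best = loss(params, train)
for step in range(2000):
    proposal = [W + rng.normal(0, 0.05, W.shape) for W in params]
    trial = loss(proposal, train)
    if trial < best:                           # accept only improvements
        params, best = proposal, trial
print(f"final mean cross-entropy: {best:.3f}")

A plain RNN trained this way will typically plateau on the mirror language, which is the difficulty the paper's architecture is designed to overcome.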
Keywords :
Learning capability, neural-network modularity, storage capacity, two-hidden-layer feedforward networks (TLFNs)
Journal title :
IEEE TRANSACTIONS ON NEURAL NETWORKS
Serial Year :
2003
Record number :
62825