DocumentCode :
1748920
Title :
Learning context-free grammars with recurrent neural networks
Author :
Harada, Tetsuji ; Araki, Osamu ; Sakurai, Akito
Author_Institution :
Graduate Sch. of Knowledge Sci., Japan Adv. Inst. of Sci. & Technol., Ishikawa, Japan
Volume :
4
fYear :
2001
fDate :
2001
Firstpage :
2602
Abstract :
The primary purpose of this work is to construct a recurrent neural network (RNN) architecture that learns context-free grammars (CFGs) with recursive rules, with the aim of gaining insight into human language acquisition. Specifically, we are interested in how an RNN can learn recursive rules. The models proposed here are constructed from two promising connectionist techniques: recursive auto-associative memory (RAAM) and the simple recurrent network (SRN). The RAAM learns to represent parse trees as real-valued vectors, and the SRN learns to parse sentences. We investigated whether the RAAM/SRN model can learn to parse the language {aⁿbⁿ | n ⩾ 1} and two other languages generated by simple CFGs with recursively embedded phrases
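The abstract's SRN component can be illustrated with a minimal sketch: an Elman-style simple recurrent network trained by backpropagation through time to predict the next symbol of strings from {aⁿbⁿ | n ⩾ 1}. This is not the authors' RAAM/SRN model; the hidden size, learning rate, end-of-string marker `#`, and training corpus are illustrative assumptions.

```python
import numpy as np

class SRN:
    """Elman-style simple recurrent network for next-symbol prediction."""

    def __init__(self, n_sym, n_hidden, seed=0):
        rng = np.random.default_rng(seed)
        self.H = n_hidden
        self.Wxh = rng.normal(0, 0.5, (n_sym, n_hidden))     # input -> hidden
        self.Whh = rng.normal(0, 0.5, (n_hidden, n_hidden))  # recurrent
        self.Why = rng.normal(0, 0.5, (n_hidden, n_sym))     # hidden -> output

    def forward(self, xs):
        """Return hidden states (with initial zero state) and softmax outputs."""
        hs, ys = [np.zeros(self.H)], []
        for x in xs:
            h = np.tanh(x @ self.Wxh + hs[-1] @ self.Whh)
            z = h @ self.Why
            y = np.exp(z - z.max()); y /= y.sum()
            hs.append(h); ys.append(y)
        return hs, ys

    def step(self, xs, lr):
        """One BPTT update on one sequence; returns mean cross-entropy."""
        hs, ys = self.forward(xs)
        dWxh = np.zeros_like(self.Wxh)
        dWhh = np.zeros_like(self.Whh)
        dWhy = np.zeros_like(self.Why)
        dh_next = np.zeros(self.H)
        loss = 0.0
        for t in reversed(range(len(xs) - 1)):   # at step t, predict symbol t+1
            loss -= np.log(ys[t] @ xs[t + 1] + 1e-12)
            dy = ys[t] - xs[t + 1]               # softmax + cross-entropy gradient
            dWhy += np.outer(hs[t + 1], dy)
            dh = self.Why @ dy + dh_next         # gradient into h_t (= hs[t+1])
            dz = (1.0 - hs[t + 1] ** 2) * dh     # through tanh
            dWxh += np.outer(xs[t], dz)
            dWhh += np.outer(hs[t], dz)
            dh_next = self.Whh @ dz              # pass back to previous step
        self.Wxh -= lr * dWxh
        self.Whh -= lr * dWhh
        self.Why -= lr * dWhy
        return loss / (len(xs) - 1)

SYMBOLS = "ab#"                                  # '#' ends each string (assumed)
EYE = np.eye(len(SYMBOLS))

def encode(s):
    return [EYE[SYMBOLS.index(c)] for c in s]

# Tiny corpus: a^n b^n for n = 1..3, each terminated by '#'.
corpus = [encode("a" * n + "b" * n + "#") for n in (1, 2, 3)]
net = SRN(n_sym=3, n_hidden=8)
first = sum(net.step(xs, lr=0.05) for xs in corpus) / len(corpus)
for _ in range(199):
    last = sum(net.step(xs, lr=0.05) for xs in corpus) / len(corpus)
print(f"mean cross-entropy: {first:.3f} -> {last:.3f}")
```

After training, the per-symbol cross-entropy drops well below its random-initialization value of about log 3; testing generalization to unseen n (as the paper does for its model) would require a held-out set of longer strings.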
Keywords :
content-addressable storage; context-free grammars; learning (artificial intelligence); recurrent neural nets; CFG; RAAM; RNN; SRN; connectionist techniques; context-free grammar learning; human language acquisition; parse trees; real-valued vectors; recurrent neural network architecture construction; recursive auto-associative memory; recursive rules; recursively embedded phrases; simple recurrent network; Artificial neural networks; Backpropagation; Computer networks; Embedded computing; Humans; Mars; Natural languages; Neural networks; Recurrent neural networks; Robustness;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2001. Proceedings. IJCNN '01. International Joint Conference on
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938780
Filename :
938780