DocumentCode :
2288731
Title :
A neural network model for acquisition of semantic structures
Author :
Chan, Samuel W K ; Franklin, James
Author_Institution :
Sch. of Comput. Sci. & Eng., New South Wales Univ., NSW, Australia
fYear :
1994
fDate :
13-16 Apr 1994
Firstpage :
221
Abstract :
Research suggests that natural language processing (NLP) can be profitably viewed in terms of the spread of activation through a neural network. However, since the critique by Fodor and Pylyshyn (1988) of the style of connectionist representations, one of the biggest challenges facing proponents of connectionist models of NLP has been capturing the rich structure of language. As models of NLP, neural network systems must exhibit the properties of compositionality and structure sensitivity. This paper describes a neural network model in which a simple recurrent network and a recursive auto-associative memory are combined to acquire semantic structures from sentence constituents. The approach imposes no prior limit on sentence structure. The model can be viewed as a tool for conceptual acquisition and generalization extraction in language understanding.
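The recursive auto-associative memory (RAAM) mentioned in the abstract compresses a binary parse tree into a single fixed-width vector by repeatedly autoencoding pairs of child representations; decoding unpacks the tree again. The following is only an illustrative sketch of that idea, not the paper's implementation: the dimensionality, the random untrained weights, and the toy parse are all assumptions for demonstration (a real RAAM is trained to minimize reconstruction error over a corpus).

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8  # width of every constituent vector (illustrative choice, not from the paper)

# Untrained encoder/decoder weights for a binary RAAM: two children -> one parent.
W_enc = rng.normal(scale=0.1, size=(DIM, 2 * DIM))
W_dec = rng.normal(scale=0.1, size=(2 * DIM, DIM))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def encode(left, right):
    """Compress two child representations into one parent vector."""
    return sigmoid(W_enc @ np.concatenate([left, right]))

def decode(parent):
    """Reconstruct approximations of the two children from a parent vector."""
    out = sigmoid(W_dec @ parent)
    return out[:DIM], out[DIM:]

# Encode the toy parse ((the cat) sat) bottom-up, then unpack it again.
the, cat, sat = (rng.random(DIM) for _ in range(3))
noun_phrase = encode(the, cat)            # "the cat" as one vector
sentence = encode(noun_phrase, sat)       # whole sentence, still DIM-wide
left, right = decode(sentence)            # approximate children come back out
```

Because every parent has the same width as its children, the encoder can be applied recursively to trees of any depth, which is what lets the model avoid a prior limit on sentence structure.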
Keywords :
grammars; learning (artificial intelligence); natural languages; recurrent neural nets; speech analysis and processing; speech recognition; connectionist models; connectionist representations; language understanding; natural language processing; neural network model; neural network systems; recurrent network; recursive auto-association memory; semantic structures acquisition; sentence constituents; Artificial neural networks; Australia; Biological neural networks; Computer science; Hardware; Humans; Mathematics; Natural language processing; Neural networks; Recurrent neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 1994 International Symposium on Speech, Image Processing and Neural Networks (ISSIPNN '94)
Print_ISBN :
0-7803-1865-X
Type :
conf
DOI :
10.1109/SIPNN.1994.344927
Filename :
344927