DocumentCode :
179033
Title :
Recurrent conditional random field for language understanding
Author :
Kaisheng Yao ; Baolin Peng ; Geoffrey Zweig ; Dong Yu ; Xiaolong Li ; Feng Gao
fYear :
2014
fDate :
4-9 May 2014
Firstpage :
4077
Lastpage :
4081
Abstract :
Recurrent neural networks (RNNs) have recently produced record-setting performance in language modeling and word-labeling tasks. In the word-labeling task, the RNN is used analogously to the more traditional conditional random field (CRF) to assign a label to each word in an input sequence, and has been shown to significantly outperform CRFs. In contrast to CRFs, RNNs operate in an online fashion, assigning labels as soon as a word is seen rather than after seeing the whole word sequence. In this paper, we show that the performance of an RNN tagger can be significantly improved by incorporating elements of the CRF model; specifically, the explicit modeling of output-label dependencies with transition features, its global sequence-level objective function, and offline decoding. We term the resulting model a “recurrent conditional random field” and demonstrate its effectiveness on the ATIS travel-domain dataset and a variety of web-search language understanding datasets.
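Note (illustrative sketch, not from the paper): the three CRF ingredients the abstract names can be made concrete with a small NumPy example. All dimensions, parameter names, and random values below are invented for illustration; the code combines per-word emission scores from an Elman-style RNN with a label-transition matrix and decodes offline with Viterbi, mirroring the "transition features" and "offline decoding" described above.

import numpy as np

# Hypothetical sizes: sequence length, hidden size, number of labels.
T, H, K = 5, 8, 4
rng = np.random.default_rng(0)

# Stand-ins for trained parameters (assumed, not the authors' values).
W_xh = rng.normal(scale=0.1, size=(H, H))   # input-to-hidden (inputs pre-embedded to size H)
W_hh = rng.normal(scale=0.1, size=(H, H))   # hidden-to-hidden recurrence
W_hy = rng.normal(scale=0.1, size=(K, H))   # hidden-to-label emission scores
A = rng.normal(scale=0.1, size=(K, K))      # CRF transition scores A[i, j]: label i -> label j

x = rng.normal(size=(T, H))                 # dummy embedded input sequence

# RNN forward pass: per-position emission scores f[t, k].
h = np.zeros(H)
f = np.zeros((T, K))
for t in range(T):
    h = np.tanh(W_xh @ x[t] + W_hh @ h)
    f[t] = W_hy @ h

# Offline Viterbi decoding of the joint sequence score
#   score(y) = sum_t f[t, y_t] + sum_t A[y_{t-1}, y_t],
# so the whole sentence is read before any label is committed.
delta = f[0].copy()                         # best score ending in each label at t = 0
back = np.zeros((T, K), dtype=int)
for t in range(1, T):
    cand = delta[:, None] + A + f[t][None, :]   # cand[i, j]: best path ending with i -> j
    back[t] = cand.argmax(axis=0)
    delta = cand.max(axis=0)

y = [int(delta.argmax())]                   # backtrace from the best final label
for t in range(T - 1, 0, -1):
    y.append(int(back[t][y[-1]]))
print("Viterbi label sequence:", y[::-1])

Training with the global sequence-level objective would additionally require the CRF forward algorithm to normalize over all label sequences; the sketch above covers only scoring and decoding.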
Keywords :
Internet; natural language processing; recurrent neural nets; CRF; RNN; RNN tagger; Web search language understanding datasets; input sequence; language modeling; language understanding; output label dependencies; recurrent conditional random field; recurrent neural networks; transition features; word labeling tasks; word sequence; Computational modeling; Linear programming; Motion pictures; Recurrent neural networks; Speech; Training; Conditional random fields; recurrent neural networks
fLanguage :
English
Publisher :
ieee
Conference_Title :
2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Florence, Italy
Type :
conf
DOI :
10.1109/ICASSP.2014.6854368
Filename :
6854368