Title :
Incremental sentence compression using LSTM recurrent networks
Author :
Sakriani Sakti; Faiz Ilham; Graham Neubig; Tomoki Toda; Ayu Purwarianti; Satoshi Nakamura
Author_Institution :
Graduate School of Information Science, Nara Institute of Science and Technology, Japan
Abstract :
Many current sentence compression techniques produce a shortened form of a sentence by relying on syntactic structure such as dependency tree representations. While the performance of sentence compression has been improving, these approaches require a full parse of the sentence before compression can begin, making it difficult to perform compression in real time. In this paper, we examine the possibility of performing incremental sentence compression using long short-term memory (LSTM) recurrent neural networks (RNNs). The decision of whether to remove a word is made at each time step, without waiting for the end of the sentence. Various RNN parameters are investigated, including the number of layers and network connections. Furthermore, we propose a pretraining method in which the network is first trained as an autoencoder. Experimental results reveal that our method obtains compression rates similar to human references and better accuracy than state-of-the-art tree transduction models.
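The abstract describes incremental compression as a per-word keep/delete decision made by an LSTM as each word arrives. The sketch below illustrates that idea only; the model name, vocabulary size, and dimensions are illustrative assumptions and not the authors' implementation. A unidirectional LSTM reads word embeddings left to right and emits a binary keep/drop label at every time step, so no full parse or end-of-sentence lookahead is required.

```python
# Minimal sketch (PyTorch) of incremental sentence compression as per-token
# binary labeling with a unidirectional LSTM. All names and hyperparameters
# are hypothetical, not taken from the paper.
import torch
import torch.nn as nn

class CompressorLSTM(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Unidirectional LSTM: the label for word t depends only on words 1..t,
        # which is what allows compression to run incrementally.
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            num_layers=num_layers, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, 2)  # keep vs. delete

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) word indices
        h, _ = self.lstm(self.embed(token_ids))
        return self.classifier(h)  # (batch, seq_len, 2) logits per word

# Toy usage: label each word of a 6-word sentence as it is read.
model = CompressorLSTM()
sentence = torch.randint(0, 10000, (1, 6))
keep_or_drop = model(sentence).argmax(dim=-1)
print(keep_or_drop)  # e.g. tensor([[1, 0, 1, 1, 0, 1]])
```

The autoencoder pretraining mentioned in the abstract could be approximated by first training the same LSTM to reconstruct its input sequence before fine-tuning the keep/delete classifier; that step is omitted from this sketch.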
Keywords :
"Speech","Recurrent neural networks","Training","Real-time systems","Logic gates","Training data","Speech recognition"
Conference_Title :
2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU)
DOI :
10.1109/ASRU.2015.7404802