Title :
Neural network joint modeling via context-dependent projection
Author :
Yik-Cheung Tam ; Yun Lei
Author_Institution :
Speech Technol. & Res. Lab., SRI Int., Menlo Park, CA, USA
Abstract :
Neural network joint modeling (NNJM) has produced substantial improvements in machine translation performance. As in standard neural network language modeling, a context-independent linear projection maps a sparse input vector into a continuous representation at each word position. Because neighboring words depend on each other, a context-independent projection may not be optimal. We propose a context-dependent linear projection approach that takes neighboring words into account. Experimental results show that the proposed approach further improves NNJM by 0.5 BLEU for English-Iraqi Arabic translation in N-best rescoring. Compared to a baseline using hierarchical phrases and sparse features, NNJM with our proposed approach achieves a 2-BLEU improvement.
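The contrast the abstract draws can be illustrated with a minimal NumPy sketch. A context-independent projection applies one shared embedding matrix at every word position; a position-dependent projection (one form of the context-dependent idea, and a term from the paper's keyword list) gives each context slot its own matrix, so the same word can map to a different continuous vector depending on where it appears. All names, dimensions, and the random initialization below are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, context_size = 1000, 32, 4  # toy sizes, not the paper's

# Context-independent projection: one shared embedding matrix,
# applied identically at every word position.
E_shared = rng.normal(size=(vocab_size, embed_dim))

# Position-dependent projection: a separate matrix per context slot,
# so the projection of a word depends on its position in the window.
E_per_pos = rng.normal(size=(context_size, vocab_size, embed_dim))

def project_context(word_ids, shared=True):
    """Map a window of word ids to one concatenated continuous vector.

    Indexing a row of the embedding matrix is equivalent to multiplying
    the matrix by a one-hot (sparse) input vector.
    """
    if shared:
        vecs = [E_shared[w] for w in word_ids]
    else:
        vecs = [E_per_pos[i][w] for i, w in enumerate(word_ids)]
    return np.concatenate(vecs)

ctx = [3, 17, 42, 7]                       # hypothetical context word ids
h_shared = project_context(ctx, shared=True)
h_posdep = project_context(ctx, shared=False)
# Both produce a vector of length context_size * embed_dim,
# but the position-dependent variant projects each slot differently.
```

The downstream NNJM layers consume the concatenated vector either way; only the projection step changes.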
Keywords :
language translation; neural nets; speech recognition; English-Iraqi Arabic translation; N-best rescoring; context-independent linear projection; machine translation; neural network joint modeling; sparse input vector; Artificial neural networks; History; Pragmatics; Syntactics; Neural network joint modeling; context-dependent linear projection; position-dependent linear projection; statistical machine translation
Conference_Title :
Acoustics, Speech and Signal Processing (ICASSP), 2015 IEEE International Conference on
Conference_Location :
South Brisbane, QLD, Australia
DOI :
10.1109/ICASSP.2015.7178994