DocumentCode :
2558609
Title :
Softmax-margin training for statistical machine translation
Author :
Zhang, Wenwen ; Liu, Lemao ; Cao, Hailong ; Zhao, Tiejun
Author_Institution :
MOE-MS Key Lab. of Natural Language Process. & Speech, Harbin Inst. of Technol., Harbin, China
fYear :
2012
fDate :
29-31 May 2012
Firstpage :
838
Lastpage :
842
Abstract :
The training procedure is crucial in statistical machine translation (SMT), as it strongly influences the final performance of a translation system. The most widely used method in SMT is minimum error rate training (MERT), which is effective for estimating feature-function weights. However, MERT does not use regularization and has been observed to over-fit. In this paper, we describe a method named softmax-margin, a modification of max-margin training. This approach is simple, efficient, and easy to implement. We conduct our experiments on data sets from the WMT shared tasks. On a small-scale French-English translation task, the results are competitive with MERT.
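The softmax-margin objective mentioned in the abstract augments the log-loss so that each incorrect hypothesis is penalized in proportion to its task cost. The sketch below is an illustrative reconstruction, not the paper's implementation: `scores` are assumed model scores of n-best hypotheses and `costs` are hypothetical per-hypothesis task costs (e.g. 1 − sentence BLEU), with zero cost for the reference.

```python
import math

def softmax_margin_loss(scores, costs, gold_index):
    """Softmax-margin loss for one training example.

    Each hypothesis score is boosted by its task cost before the
    log-sum-exp, so high-cost hypotheses contribute more mass and
    are pushed down harder than in plain log-loss.
    """
    # Cost-augmented scores: s(y) + cost(y, y*)
    augmented = [s + c for s, c in zip(scores, costs)]
    # Numerically stable log-sum-exp over the augmented scores
    m = max(augmented)
    log_z = m + math.log(sum(math.exp(a - m) for a in augmented))
    # Loss = log Z_augmented - score of the gold hypothesis
    return log_z - scores[gold_index]

# Example: 3 hypotheses, gold at index 0 with zero cost
loss = softmax_margin_loss([2.0, 1.0, 0.5], [0.0, 0.8, 1.0], 0)
```

With all costs set to zero the expression reduces to the ordinary negative log-likelihood, which is one way to see it as a regularized, probabilistic counterpart of max-margin training.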
Keywords :
language translation; statistical analysis; French-English translation; feature function weights; final performance; minimum error rate training; softmax-margin training; statistical machine translation; training procedure; translation system; Cost function; Error analysis; Gold; NIST; Probabilistic logic; Training; MERT; Softmax-margin; Statistical Machine Translation;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Natural Computation (ICNC), 2012 Eighth International Conference on
Conference_Location :
Chongqing
ISSN :
2157-9555
Print_ISBN :
978-1-4577-2130-4
Type :
conf
DOI :
10.1109/ICNC.2012.6234638
Filename :
6234638