DocumentCode :
3165850
Title :
Creating ensemble of diverse maximum entropy models
Author :
Audhkhasi, Kartik ; Sethy, Abhinav ; Ramabhadran, Bhuvana ; Narayanan, Shrikanth S.
Author_Institution :
Signal Anal. & Interpretation Lab. (SAIL), Univ. of Southern California, Los Angeles, CA, USA
fYear :
2012
fDate :
25-30 March 2012
Firstpage :
4845
Lastpage :
4848
Abstract :
Diversity of a classifier ensemble has been shown to benefit overall classification performance. However, most conventional ensemble training methods offer no control over the extent of diversity and are meta-learners. We present a method for creating an ensemble of diverse maximum entropy (∂MaxEnt) models, which are popular in speech and language processing. We modify the objective function for conventional MaxEnt training so that the model's output posterior distribution is diverse with respect to a reference model. Two diversity scores are explored: KL divergence and posterior cross-correlation. Experiments on the CoNLL-2003 Named Entity Recognition task and the IEMOCAP emotion recognition database show the benefits of a ∂MaxEnt ensemble.
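The abstract describes augmenting the standard MaxEnt training objective with a term that pushes the model's posteriors away from those of a reference model. The sketch below is an illustrative Python reconstruction of that idea using the KL-divergence diversity score, not the authors' exact formulation; the function name, the trade-off weight `alpha`, and the sign convention are assumptions for illustration.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with a max-shift for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def diverse_maxent_loss(W, X, y, W_ref, alpha=0.1, eps=1e-12):
    """Negative log-likelihood minus a KL-divergence diversity bonus.

    Minimizing this trades off fitting the labels (the conventional
    MaxEnt objective) against making the posteriors P(y|x; W) diverge
    from those of a fixed reference model W_ref. `alpha` (assumed)
    controls the extent of diversity.
    """
    P = softmax(X @ W)          # (n, k) posteriors of the new model
    P_ref = softmax(X @ W_ref)  # (n, k) posteriors of the reference model
    n = X.shape[0]
    nll = -np.log(P[np.arange(n), y] + eps).mean()
    kl = (P * np.log((P + eps) / (P_ref + eps))).sum(axis=1).mean()
    return nll - alpha * kl
```

When `W` equals `W_ref` the KL term vanishes and the loss reduces to the plain negative log-likelihood; a gradient-based optimizer started from the reference weights would then be driven toward models that remain accurate but disagree with the reference. The posterior cross-correlation score mentioned in the abstract would replace the `kl` term with a correlation between `P` and `P_ref`.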
Keywords :
maximum entropy methods; speech processing; ∂MaxEnt ensemble; CoNLL-2003 named entity recognition task; IEMOCAP emotion recognition database; KL divergence; MaxEnt model; conventional training; diverse maximum entropy models; language processing; meta-learners; objective function; posterior cross-correlation; speech processing; Bagging; Data models; Databases; Entropy; Linear programming; Optimization; Training; Maximum entropy model; classifier diversity;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Acoustics, Speech and Signal Processing (ICASSP), 2012 IEEE International Conference on
Conference_Location :
Kyoto
ISSN :
1520-6149
Print_ISBN :
978-1-4673-0045-2
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2012.6289004
Filename :
6289004