Title :
Topic model with constrained word burstiness intensities
Author :
Lei, Shaoze ; Zhang, JianWen ; Weng, Shifeng ; Zhang, Changshui
Author_Institution :
Dept. of Autom. Eng., Tsinghua Univ., Beijing, China
Date :
July 31, 2011 - Aug. 5, 2011
Abstract :
The word burstiness phenomenon, which means that if a word occurs once in a document it is likely to occur repeatedly, has recently attracted interest in the text analysis field. Dirichlet Compound Multinomial Latent Dirichlet Allocation (DCMLDA) introduces this word burstiness mechanism into Latent Dirichlet Allocation (LDA). However, DCMLDA places no restriction on the word burstiness intensity of each topic. Consequently, as shown in this paper, the burstiness intensities of words in major topics become extremely low, and the topics' ability to represent different semantic meanings is impaired. In order to obtain topics that represent the semantic meanings of documents well, we introduce constraints on the topics' word burstiness intensities. Experiments demonstrate that DCMLDA with constrained word burstiness intensities achieves better performance than the original unconstrained model. Moreover, these additional constraints help to reveal the relationship between two key properties inherited from DCM and LDA respectively. These two properties have a great influence on the combined model's performance, and their relationship, as revealed in this paper, provides important guidance for further study of topic models.
Keywords :
statistical analysis; text analysis; word processing; DCMLDA; Dirichlet compound multinomial latent Dirichlet allocation; constrained word burstiness intensities; semantic meaning; topic model; Compounds; Educational institutions; Inference algorithms; Monte Carlo methods; Optimization; Resource management; Semantics;
Conference_Titel :
The 2011 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
San Jose, CA, USA
Print_ISBN :
978-1-4244-9635-8
DOI :
10.1109/IJCNN.2011.6033201