• DocumentCode
    2330705
  • Title
    Topic dependent class based language model evaluation on automatic speech recognition
  • Author
    Naptali, Welly; Tsuchiya, Masatoshi; Nakagawa, Seiichi
  • Author_Institution
    Dept. of Comput. Sci. & Eng., Toyohashi Univ. of Technol., Toyohashi, Japan
  • fYear
    2010
  • fDate
    12-15 Dec. 2010
  • Firstpage
    395
  • Lastpage
    400
  • Abstract
    A topic dependent class (TDC) language model (LM) is a topic-based LM that uses a semantic extraction method to reveal latent topic information from noun-document relations. Topics are then defined by clustering a given context. Finally, a fixed window of word history is observed to decide the topic of the current event through voting in an online manner. Previously, we have shown that TDC outperforms several state-of-the-art baselines in terms of perplexity. In this paper we evaluate TDC in automatic speech recognition (ASR) rescoring experiments. Experiments on read speech from the Wall Street Journal (English ASR system) and the Mainichi Shimbun (Japanese ASR system) show that the TDC LM improves both perplexity and word error rate (WER). The proposed model yields relative improvements of 3.0% in perplexity and 15.2% in WER for the English ASR system, and 16.4% in perplexity and 24.3% in WER for the Japanese ASR system.
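    The abstract's online topic-decision step — voting over a fixed window of word history — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the window size, the word-to-topic table, and the function name are all illustrative assumptions.

    ```python
    from collections import Counter

    def decide_topic(history, word_topic, window=5):
        """Decide the topic of the current event by majority vote over the
        last `window` words of the history (an online decision, as in the
        abstract). `word_topic` is an assumed word-to-topic-class mapping;
        out-of-vocabulary words cast no vote."""
        votes = Counter(word_topic[w] for w in history[-window:] if w in word_topic)
        if not votes:
            return None  # no in-vocabulary evidence; a real system would fall back to a default topic
        return votes.most_common(1)[0][0]

    # Toy example with a hypothetical topic table
    word_topic = {"stock": "finance", "market": "finance", "game": "sports", "team": "sports"}
    history = ["the", "stock", "market", "and", "the", "team"]
    print(decide_topic(history, word_topic, window=5))  # "finance" (2 votes vs. 1)
    ```

    In the paper's setting the voted topic would then select the topic-dependent class n-gram used to rescore the ASR hypothesis.
    
    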
  • Keywords
    entropy; feature extraction; natural language processing; pattern classification; speech recognition; Mainichi Shimbun; Wall Street Journal; perplexity; semantic extraction; topic dependent class language model; word error rate; language model; topic dependent
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Spoken Language Technology Workshop (SLT), 2010 IEEE
  • Conference_Location
    Berkeley, CA
  • Print_ISBN
    978-1-4244-7904-7
  • Electronic_ISBN
    978-1-4244-7902-3
  • Type
    conf
  • DOI
    10.1109/SLT.2010.5700885
  • Filename
    5700885