• DocumentCode
    312012
  • Title
    Entropy coded vector quantization with hidden Markov models
  • Author
    Yonezaki, T.; Shikano, Kiyohiro

  • Author_Institution
    Telecom Res. Lab., Matsushita Commun. Ind. Co. Ltd., Yokohama, Japan
  • Volume
    1
  • fYear
    1996
  • fDate
    3-6 Oct 1996
  • Firstpage
    310
  • Abstract
    The authors propose a new vector quantization approach that combines hidden Markov models (HMMs) with an entropy coding scheme. The entropy coding system is selected according to the speech state modeled by the HMMs, so the proposed approach can adaptively allocate a suitable number of bits to each codeword. This approach achieves a coding gain of about 0.3 dB in cepstrum distance (with 8-state HMMs); in other words, an 8-bit codebook is represented with an average code length of about 6.5 bits. The authors also investigate robustness to channel errors: the HMMs and the entropy coding system, which appear vulnerable to channel errors, are augmented to be robust, so that the influence of channel errors is reduced to about one-third.
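    The core idea in the abstract (a separate entropy code per HMM state, shortening the average index length below the fixed codebook width) can be sketched as follows. This is an illustrative reconstruction, not the authors' actual system: the state-conditional probability tables, the 16-entry codebook, and the `skewed` helper are all invented for the example; only the use of per-state Huffman tables over VQ indices reflects the technique described.

    ```python
    import heapq
    import random

    def huffman_lengths(probs):
        """Huffman code lengths for a list of symbol probabilities."""
        # Heap items: (probability, tiebreak id, leaf indices in subtree).
        heap = [(p, i, [i]) for i, p in enumerate(probs)]
        heapq.heapify(heap)
        lengths = [0] * len(probs)
        tie = len(probs)
        while len(heap) > 1:
            p1, _, m1 = heapq.heappop(heap)
            p2, _, m2 = heapq.heappop(heap)
            # Merging two subtrees deepens every leaf inside them by one bit.
            for i in m1 + m2:
                lengths[i] += 1
            heapq.heappush(heap, (p1 + p2, tie, m1 + m2))
            tie += 1
        return lengths

    # Toy state-conditional distributions over a 16-entry codebook:
    # each "speech state" of the HMM favors different codewords.
    def skewed(peak, n=16):
        w = [1.0 / (1 + abs(i - peak)) for i in range(n)]
        s = sum(w)
        return [x / s for x in w]

    random.seed(0)
    state_probs = {s: skewed(peak) for s, peak in enumerate([2, 8, 13])}
    state_tables = {s: huffman_lengths(p) for s, p in state_probs.items()}

    # Average code length per state vs. fixed 4-bit (log2 16) indexing.
    for s, p in state_probs.items():
        avg = sum(pi * li for pi, li in zip(p, state_tables[s]))
        print(f"state {s}: avg {avg:.2f} bits vs fixed 4 bits")
    ```

    Because a fixed 4-bit index is itself a valid prefix code, the optimal Huffman table per state can only match or beat it on average, which is the mechanism behind the 8-bit-to-6.5-bit reduction reported in the paper.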
  • Keywords
    Huffman codes; cepstral analysis; entropy codes; hidden Markov models; speech coding; vector quantisation; adaptive bit allocation; cepstrum distance; channel error; codewords; coding gain; entropy coded vector quantization; entropy coding scheme; entropy coding system; speech status; Books; Cepstrum; Entropy coding; Hidden Markov models; Huffman coding; Probability; Robustness; Speech coding; Telecommunications; Vector quantization
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the Fourth International Conference on Spoken Language Processing, 1996 (ICSLP 96)
  • Conference_Location
    Philadelphia, PA
  • Print_ISBN
    0-7803-3555-4
  • Type
    conf
  • DOI
    10.1109/ICSLP.1996.607115
  • Filename
    607115