
Conclusion

This paper discussed a large-state discrete Ergodic Hidden Markov Model as a stochastic network language model. It proposed new techniques to reduce the memory requirements and computational cost of the Baum-Welch algorithm.
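The techniques summarized above target the Baum-Welch re-estimation loop. As background for what that loop computes, a minimal scaled forward-backward pass for a discrete ergodic HMM can be sketched as follows; the function name, matrix conventions, and scaling scheme are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def forward_backward(A, B, pi, obs):
    """One scaled forward-backward pass for a discrete ergodic HMM.

    A:   (N, N) state-transition matrix (fully connected for an ergodic HMM)
    B:   (N, M) emission matrix over M discrete symbols
    pi:  (N,)   initial state distribution
    obs: sequence of observed symbol indices
    Returns the state posteriors gamma (T, N) and the log-likelihood.
    """
    N, T = A.shape[0], len(obs)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    c = np.zeros(T)  # per-step scaling factors; sum of logs gives log P(obs)

    # Forward pass with scaling to avoid underflow on long sequences.
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    # Backward pass, reusing the same scaling factors.
    beta[T - 1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / c[t + 1]

    log_likelihood = np.log(c).sum()
    gamma = alpha * beta  # state posteriors; each row sums to 1
    return gamma, log_likelihood
```

For a large-state model the (T, N) alpha and beta arrays dominate memory, which is why reducing the storage and computation of this pass matters at scale.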

These techniques were evaluated by computing perplexity on an international conference registration task. In addition, the Ergodic HMM was investigated as a language model for continuous speech recognition. The results compared favorably with those of word bigram models, indicating that the proposed techniques are effective.
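Perplexity, the evaluation measure used above, is the exponentiated average negative log-probability per word. A minimal sketch follows; the function name is illustrative and it assumes natural-log word probabilities from any language model:

```python
import math

def perplexity(log_probs):
    """Per-word perplexity from a list of natural-log word probabilities."""
    n = len(log_probs)
    return math.exp(-sum(log_probs) / n)

# Toy example: a model assigning probability 0.1 to each of four words
# has perplexity 10, i.e. it is as uncertain as a uniform 10-way choice.
pp = perplexity([math.log(0.1)] * 4)  # → 10.0
```

A lower perplexity on held-out text means the model assigns higher probability to the observed word sequence, which is the sense in which the Ergodic HMM is compared against word bigram models here.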



Jin'ichi Murakami, October 2, 2001