
Conclusion

This paper discussed a discrete Ergodic Hidden Markov Model with a large number of states as a stochastic network model for language modeling. It proposed new techniques to reduce the memory requirements and computational costs of the Baum-Welch algorithm.
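As background, the memory problem can be illustrated with a small sketch: for an ergodic HMM with N states, a dense transition matrix grows as O(N^2) and the forward-backward arrays of the Baum-Welch algorithm as O(NT), which becomes prohibitive when N is large. The Python sketch below shows one generic way to keep the forward pass tractable, storing only nonzero transitions and pruning zero-probability states. It is an illustration of the general idea under assumed toy data structures (pi, trans, emit), not the specific techniques proposed in the paper.

import math

def forward_log_likelihood(obs, pi, trans, emit):
    """Scaled forward pass over sparse HMM parameters.
    obs: non-empty list of observation symbols.
    pi: dict state -> initial probability.
    trans: dict state -> list of (next_state, prob) pairs (nonzeros only).
    emit: dict (state, symbol) -> emission probability (nonzeros only).
    Returns log P(obs); per-step scaling avoids floating-point underflow."""
    alpha = {s: p * emit.get((s, obs[0]), 0.0) for s, p in pi.items()}
    log_lik = 0.0
    for t in range(1, len(obs)):
        scale = sum(alpha.values())
        if scale == 0.0:
            return float("-inf")  # sequence impossible under this model
        log_lik += math.log(scale)
        prev = {s: a / scale for s, a in alpha.items() if a > 0.0}
        alpha = {}
        for s, a in prev.items():          # visit only stored (nonzero) arcs
            for s2, p in trans.get(s, []):
                e = emit.get((s2, obs[t]), 0.0)
                if e > 0.0:
                    alpha[s2] = alpha.get(s2, 0.0) + a * p * e
    scale = sum(alpha.values())
    if scale == 0.0:
        return float("-inf")
    return log_lik + math.log(scale)

if __name__ == "__main__":
    # Toy 2-state model; all probabilities are made-up illustration values.
    pi = {0: 0.6, 1: 0.4}
    trans = {0: [(0, 0.7), (1, 0.3)], 1: [(0, 0.4), (1, 0.6)]}
    emit = {(0, "a"): 0.9, (0, "b"): 0.1, (1, "a"): 0.2, (1, "b"): 0.8}
    print(forward_log_likelihood(["a", "b", "a"], pi, trans, emit))

Because each forward vector keeps only states that are actually reachable with nonzero probability, the per-step cost is proportional to the number of stored arcs rather than to N^2.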

These techniques were evaluated on an international conference registration task by computing the perplexity. In addition, the Ergodic HMM was investigated as a language model for continuous speech recognition. The results compared favorably with those of word bigram models, suggesting that the proposed techniques are effective.
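For reference, the test-set perplexity used in such evaluations is conventionally defined as follows; this is the standard word-level definition, and the paper's own formulation (for instance, how the HMM state sequence enters the probability) is not restated here:

\begin{displaymath}
PP \;=\; P(w_1 w_2 \cdots w_N)^{-1/N}
\;=\; \exp\!\left( -\frac{1}{N} \sum_{i=1}^{N} \ln P(w_i \mid w_1 \cdots w_{i-1}) \right)
\end{displaymath}

A lower perplexity indicates that the model assigns higher probability to the test text.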


