
Conclusion

In this paper, we discussed a discrete Ergodic Hidden Markov Model as a language model, because an Ergodic HMM and a stochastic network grammar have the same structure and the same parameters.

We found that after Baum-Welch training of a discrete Ergodic HMM on a large amount of text data, the word groupings in the HMM states show a striking similarity to parts of speech (POS). This means that an Ergodic HMM has the ability to automatically acquire both a stochastic network grammar and the concept of POS simultaneously from (POS-untagged) training data.
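The training procedure referred to above can be illustrated with a minimal sketch (not the paper's implementation): Baum-Welch re-estimation for a fully connected ("ergodic") discrete HMM, where every transition and emission is initially allowed and the word groupings emerge in the learned emission distributions. All function names and the toy corpus are illustrative.

```python
# Minimal Baum-Welch sketch for a discrete ergodic HMM (numpy only).
# This is an illustrative reconstruction, not the paper's code.
import numpy as np

rng = np.random.default_rng(0)

def baum_welch(seqs, n_states, n_symbols, n_iter=30):
    """Train a discrete ergodic HMM on integer-coded word sequences."""
    # Random initialization: every transition/emission has nonzero
    # probability, which is what makes the model "ergodic".
    pi = rng.dirichlet(np.ones(n_states))                 # initial state probs
    A = rng.dirichlet(np.ones(n_states), size=n_states)   # state transitions
    B = rng.dirichlet(np.ones(n_symbols), size=n_states)  # word emissions
    for _ in range(n_iter):
        pi_acc = np.zeros(n_states)
        A_num = np.zeros((n_states, n_states))
        B_num = np.zeros((n_states, n_symbols))
        for obs in seqs:
            T = len(obs)
            # Forward pass with per-step scaling to avoid underflow.
            alpha = np.zeros((T, n_states))
            scale = np.zeros(T)
            alpha[0] = pi * B[:, obs[0]]
            scale[0] = alpha[0].sum(); alpha[0] /= scale[0]
            for t in range(1, T):
                alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
                scale[t] = alpha[t].sum(); alpha[t] /= scale[t]
            # Backward pass, reusing the same scaling factors.
            beta = np.zeros((T, n_states)); beta[-1] = 1.0
            for t in range(T - 2, -1, -1):
                beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]
            # State posteriors (gamma) and transition posteriors (xi).
            gamma = alpha * beta
            gamma /= gamma.sum(axis=1, keepdims=True)
            pi_acc += gamma[0]
            for t in range(T - 1):
                xi = alpha[t][:, None] * A * B[:, obs[t + 1]] * beta[t + 1]
                A_num += xi / xi.sum()
            for t in range(T):
                B_num[:, obs[t]] += gamma[t]
        # M-step: renormalize the accumulated expected counts.
        pi = pi_acc / pi_acc.sum()
        A = A_num / A_num.sum(axis=1, keepdims=True)
        B = B_num / B_num.sum(axis=1, keepdims=True)
    return pi, A, B
```

On a toy corpus whose words fall into two positional classes, inspecting each row of the learned emission matrix `B` shows which words a state has grouped together, which is the POS-like clustering the paper observes at a much larger scale.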

In addition, we investigated an Ergodic Hidden Markov Model as a language model for sentence speech recognition and found that it is effective in that role. Therefore, it may be said that an Ergodic HMM is a general, trainable language model.
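When the HMM is used as a language model in recognition, each candidate word sequence is scored by its likelihood under the model. The standard forward algorithm computes this; the sketch below (a textbook construction, not taken from the paper, with illustrative parameter names) shows the scoring step.

```python
# Scoring a word sequence under a discrete HMM with the forward
# algorithm; this is how an HMM language model assigns P(sentence).
# Illustrative sketch, not the paper's implementation.
import numpy as np

def sentence_log_prob(obs, pi, A, B):
    """Log-likelihood of an integer-coded word sequence under the HMM.

    pi: initial state probabilities, shape (S,)
    A:  state transition matrix, shape (S, S)
    B:  per-state word emission probabilities, shape (S, V)
    """
    alpha = pi * B[:, obs[0]]
    logp = 0.0
    # Rescale at each step so long sentences do not underflow;
    # the discarded scale factors are accumulated in log space.
    for t in range(1, len(obs)):
        s = alpha.sum()
        logp += np.log(s)
        alpha = (alpha / s) @ A * B[:, obs[t]]
    return logp + np.log(alpha.sum())
```

In a recognizer, this score would be combined with the acoustic score to rank competing sentence hypotheses.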



Jin'ichi Murakami, January 19, 2001