
The number of word output probabilities

Figure 1 shows the number of word output probabilities as a function of the number of states for the Ergodic HMM. In this figure, the thin line shows the result for the basic Baum-Welch algorithm, and the thick line shows the result for the improved Baum-Welch algorithm. The figure shows that the improved method greatly reduces the number of word output probabilities, and therefore also the memory requirements and computational cost.
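The text does not specify how the improved algorithm achieves this reduction here; a common device is to store the emission matrix sparsely, keeping only probabilities above a small floor. The sketch below (an illustrative assumption, not the paper's method) shows how the quantity plotted in Figure 1, the number of stored word output probabilities b(i, w), could be counted for a dense matrix versus a floored one; the floor value 1e-4 and the matrix sizes are hypothetical.

```python
import numpy as np

def count_output_probs(B, floor=0.0):
    """Count emission probabilities b(i, w) that survive flooring,
    i.e. the number of parameters that must actually be stored."""
    return int(np.count_nonzero(B >= floor))

rng = np.random.default_rng(0)
n_states, n_words = 10, 1000

# A dense emission matrix, as estimated by the basic Baum-Welch
# algorithm: every state assigns some probability to every word.
B = rng.dirichlet(np.full(n_words, 0.1), size=n_states)

dense = count_output_probs(B)               # every entry is stored
sparse = count_output_probs(B, floor=1e-4)  # only non-negligible entries

print(dense, sparse)
```

With a sparse representation, memory and per-iteration computation scale with the number of surviving entries rather than with (number of states) x (vocabulary size), which matches the trend the paragraph attributes to the improved algorithm.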

Figure 1: Number of word output probabilities versus number of states
\begin{figure}\begin{center}
\fbox{\epsfig{file=figure1.ps,width=70mm}}\end{center}\end{figure}



Jin'ichi Murakami, January 19, 2001