
Reducing Memory Requirements and Computational Costs for The Baum-Welch Algorithm and Application to Automatic Stochastic Network Grammar Acquisition.

Jin'ichi Murakami

Abstract:

This paper describes new techniques for language modeling in speech recognition based on the use of a discrete-density Ergodic Hidden Markov Model (HMM).

A discrete-output Ergodic HMM has a structure similar to that of a stochastic network language model (SNLM), so an SNLM can be acquired automatically from a large amount of text data through the Baum-Welch algorithm. However, when the number of states in this Ergodic HMM is large, a large amount of memory is required and the computational cost is high. Past studies have therefore limited the number of states. Consequently, the perplexity of the resulting Ergodic HMM has been high, and it has not matched the performance of word bigram models.
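To make this cost concrete, the following is a minimal NumPy sketch (an illustration only, not the paper's implementation) of one Baum-Welch re-estimation step for a discrete-output ergodic HMM with N states and M output symbols. The full N x N transition matrix and the O(T N^2) expected-transition-count accumulation are what make training expensive when N is large.

```python
import numpy as np

def baum_welch_step(A, B, pi, obs):
    """One EM re-estimation step for a discrete-output ergodic HMM.

    A  : (N, N) transition matrix -- its N*N entries dominate memory
         when the state count N is large.
    B  : (N, M) emission matrix over M discrete symbols.
    pi : (N,) initial state distribution.
    obs: length-T sequence of symbol indices.
    """
    N, T = A.shape[0], len(obs)

    # Forward pass with per-step scaling to avoid underflow.
    alpha = np.zeros((T, N))
    scale = np.zeros(T)
    alpha[0] = pi * B[:, obs[0]]
    scale[0] = alpha[0].sum()
    alpha[0] /= scale[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        scale[t] = alpha[t].sum()
        alpha[t] /= scale[t]

    # Backward pass, scaled consistently with the forward pass.
    beta = np.zeros((T, N))
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

    # Expected transition counts: an O(T * N^2) accumulation,
    # the computational bottleneck for a large state count.
    xi_sum = np.zeros((N, N))
    for t in range(T - 1):
        xi_sum += (alpha[t][:, None] * A
                   * (B[:, obs[t + 1]] * beta[t + 1])[None, :]) / scale[t + 1]

    gamma = alpha * beta  # state-occupancy posteriors

    # Re-estimate parameters from the expected counts.
    new_pi = gamma[0] / gamma[0].sum()
    new_A = xi_sum / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for t in range(T):
        new_B[:, obs[t]] += gamma[t]
    new_B /= gamma.sum(axis=0)[:, None]
    return new_A, new_B, new_pi
```

Both trellises (alpha, beta) are T x N arrays, so for long training texts and many states the memory and time costs grow quickly, which is why earlier work kept N small.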

This paper proposes new techniques to reduce the memory requirements and computational costs of the Baum-Welch algorithm. These techniques were evaluated on their ability to acquire an SNLM automatically for an international conference registration task. Based on both the perplexity obtained and the results of continuous speech recognition, this Ergodic HMM was found to outperform word bigram and trigram models, which implies that the proposed techniques are effective.
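The abstract does not spell out the proposed techniques, so the sketch below shows only one generic way such savings are commonly obtained, not the paper's method: pruning near-zero transition probabilities and iterating over the surviving arcs instead of all N^2 state pairs. The function names (`sparsify`, `sparse_forward_step`) and the `threshold` parameter are hypothetical.

```python
import numpy as np

def sparsify(A, threshold=1e-4):
    """Illustrative (not the paper's) memory reduction: keep only
    transitions above a probability threshold, stored as per-state
    adjacency lists, so the trellis recursions run over surviving
    arcs rather than the full N x N matrix."""
    succ = []  # succ[i] = (indices of retained successors, their probs)
    for i in range(A.shape[0]):
        keep = np.nonzero(A[i] > threshold)[0]
        probs = A[i, keep] / A[i, keep].sum()  # renormalize the row
        succ.append((keep, probs))
    return succ

def sparse_forward_step(alpha_t, succ, b_next):
    """One forward-recursion step over the sparse arcs only."""
    N = len(succ)
    alpha_next = np.zeros(N)
    for i in range(N):
        keep, probs = succ[i]
        alpha_next[keep] += alpha_t[i] * probs
    return alpha_next * b_next
```

If each state keeps on average k << N successors, the per-step cost falls from O(N^2) to O(N k); this is offered purely as an assumed example of the kind of reduction the abstract refers to.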


