Decoder

We used ``Moses'' [Koehn et al. 2007] as the decoder, together with parameter tuning (MERT) and reordering models. Note that in Japanese-English translation, the position of the verb sometimes moves significantly from its original position. We therefore allowed unlimited word reordering for the standard SMT system by setting the ``distortion-limit'' to ``-1''. Our system, in contrast, consists of two-stage machine translation, and the output of the first stage is already ``English'', so word positions do not change dramatically in the second stage. Accordingly, we set the ``distortion-limit'' to ``6'' for the second-stage SMT of our system.
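The two distortion-limit settings described above can be expressed in the Moses configuration file (``moses.ini''). The fragment below is a minimal sketch; the surrounding sections and file paths are hypothetical and would differ in an actual setup.

```
# moses.ini for the standard (single-stage) SMT baseline:
# -1 means unlimited reordering, needed because verb positions
# can move far in Japanese-English translation.
[distortion-limit]
-1

# moses.ini for the second-stage SMT of the two-stage system:
# the first-stage output is already English-like, so a small
# reordering window of 6 suffices.
[distortion-limit]
6
```

The same setting can also be passed on the command line at decoding time via Moses' `-distortion-limit` option, overriding the value in the configuration file.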



Jin'ichi Murakami 2012-11-06