
Bibliography

1
Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer. "The Mathematics of Statistical Machine Translation: Parameter Estimation". Computational Linguistics, 19(2):263-311, 1993.

2
Philipp Koehn, Franz J. Och, and Daniel Marcu. "Statistical Phrase-Based Translation". In Marti Hearst and Mari Ostendorf, editors, HLT-NAACL 2003: Main Proceedings, pages 127-133, Edmonton, Alberta, Canada, May 27 - June 1. Association for Computational Linguistics, 2003.

3
Pierre Isabelle, Cyril Goutte, and Michel Simard. "Domain Adaptation of MT Systems Through Automatic Post-editing". MT Summit XI, p. 102, 2007.

4
Yushi Xu and Stephanie Seneff. "Two-Stage Translation: A Combined Linguistic and Statistical Machine Translation Framework". Proceedings of the Eighth Conference of the Association for Machine Translation in the Americas (AMTA), 2008.

5
Jason Katz-Brown and Michael Collins. "Syntactic Reordering in Preprocessing for Japanese-English Translation: MIT System Description for NTCIR-7 Patent Translation Task". Proceedings of the 7th NTCIR Workshop Meeting, 2008.

6
GIZA++, http://www.fjoch.com/GIZA++

7
SRILM, The SRI Language Modeling Toolkit, http://www.speech.sri.com/projects/srilm

8
Moses, moses.2007-05-29.tgz,
http://www.statmt.org/moses/

9
NIST Open Machine Translation,
http://www.nist.gov/speech/tests/mt

10
The METEOR Automatic Machine Translation Evaluation System,
http://www.cs.cmu.edu/~alavie/METEOR/

11
training-phrase-model.perl


Table 8: Results

TASK BTEC_CE (case+punc)
system              BLEU    METEOR  F1      Prec    Recall  WER     PER     TER      GTM     NIST
primary             0.3151  0.6169  0.6569  0.6465  0.6676  0.5590  0.4760  48.0710  0.6478  6.3834
contrastive1        0.3311  0.6109  0.6610  0.6758  0.6468  0.5377  0.4567  44.8140  0.6423  6.1511
contrastive2        0.1070  0.4697  0.5619  0.5671  0.5567  0.7017  0.6182  60.0070  0.4863  3.9727

TASK CT_CE (case+punc)
system              BLEU    METEOR  F1      Prec    Recall  WER     PER     TER      GTM     NIST
primary.CRR         0.2797  0.5971  0.6306  0.6092  0.6536  0.6590  0.5099  61.3850  0.6592  5.5309
contrastive1.CRR    0.2706  0.5881  0.6189  0.5945  0.6453  0.6712  0.5113  62.4990  0.6533  5.4633
contrastive2.CRR    0.0642  0.3953  0.4928  0.5051  0.4811  0.8046  0.6823  74.9560  0.4312  3.2979
primary.ASR.1       0.2482  0.5489  0.5910  0.5773  0.6053  0.6943  0.5456  64.8360  0.6136  5.0705
contrastive1.ASR.1  0.2650  0.5610  0.6000  0.5876  0.6128  0.6647  0.5220  62.0140  0.6307  5.2804
contrastive2.ASR.1  0.0602  0.3654  0.4644  0.4822  0.4479  0.8148  0.7018  76.1960  0.4009  2.9995

TASK CT_EC (case+punc)
system              BLEU    METEOR  F1      Prec    Recall  WER     PER     TER      GTM     NIST
primary.CRR         0.2759  0.5328  0.5500  0.5150  0.5900  0.7421  0.5382  68.6970  0.6914  5.3888
contrastive1.CRR    0.3391  0.5744  0.6204  0.6430  0.5994  0.5942  0.4356  52.3780  0.6930  6.1764
contrastive2.CRR    0.2300  0.5063  0.5596  0.5599  0.5594  0.6993  0.4987  63.2230  0.6304  5.4766
primary.ASR.1       0.2214  0.4417  0.4516  0.4100  0.5025  0.8518  0.6447  80.8210  0.6399  4.5091
contrastive1.ASR.1  0.2853  0.5134  0.5604  0.5784  0.5435  0.6609  0.4986  59.2510  0.6331  5.4212
contrastive2.ASR.1  0.1902  0.4483  0.4986  0.4948  0.5025  0.7627  0.5683  70.5120  0.5689  4.6699

primary: proposed method; contrastive1: Moses; contrastive2: Systran


Table 9: Appendix: Results with Parameter Tuning

TASK BTEC_CE (case+punc)
system              BLEU    METEOR  F1      Prec    Recall  WER     PER     TER      GTM     NIST
primary             0.3351  0.6256  0.6522  0.6301  0.6759  0.5704  0.4874  0.5048   0.6613  6.5972
contrastive1        0.3423  0.6135  0.6500  0.6463  0.6538  0.5436  0.4721  0.4674   0.6551  6.5624
contrastive2        0.107   0.4697  0.5619  0.5671  0.5567  0.7017  0.6182  60.007   0.4863  3.9727

primary: proposed method; contrastive1: Moses; contrastive2: Systran
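
Note on the scores above: the BLEU column is a corpus-level modified n-gram precision metric, reported here on a 0-1 scale. As an illustration only (a sketch, not the evaluation tool actually used to produce Tables 8 and 9), the following Python fragment shows how a comparable corpus BLEU score could be computed with the sacrebleu package; the package choice, the helper function name, and the file names are assumptions made for this example.

# Illustrative sketch only: corpus-level BLEU via the sacrebleu package.
# The package, helper name, and file names are assumptions for illustration,
# not part of the evaluation setup described in this paper.
import sacrebleu

def corpus_bleu_from_files(hyp_path, ref_path):
    """Corpus BLEU for one hypothesis file against one reference file,
    both plain text with one segment per line."""
    with open(hyp_path, encoding="utf-8") as f:
        hypotheses = [line.strip() for line in f]
    with open(ref_path, encoding="utf-8") as f:
        references = [line.strip() for line in f]
    # sacrebleu takes the hypothesis list plus a list of reference streams.
    bleu = sacrebleu.corpus_bleu(hypotheses, [references])
    return bleu.score / 100.0  # sacrebleu reports 0-100; the tables use 0-1

if __name__ == "__main__":
    # Placeholder file names, not files distributed with this paper.
    print(corpus_bleu_from_files("primary.hyp.txt", "test.ref.txt"))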


Jin'ichi Murakami, February 26, 2010