
Results of our Machine Translation (IWSLT 2010 Automatic Evaluation Scores)

Table 10 summarizes the results of our machine translation evaluation for the BTEC-FE task. ``IWSLT10'' indicates the IWSLT10 task set, and ``IWSLT09'' indicates the IWSLT09 task set; ``Proposed'' indicates our proposed system (RBMT+SMT), and ``MOSES'' indicates a standard SMT system. Our proposed system obtained a BLEU score of 0.5201 on the BTEC-FE task, whereas the standard SMT system (Moses) obtained a BLEU score of 0.5077. This indicates that our proposed system is effective for the BTEC-FE task. Our proposed system also achieved an above-average BLEU score, although it ranked 7th out of the 9 participating systems.


Table 10: Experimental Results
IWSLT10    BLEU     METEOR   WER      NIST
Proposed   0.5201   0.7916   0.3305   8.5812
MOSES      0.5077   0.7808   0.3365   8.4804

IWSLT09    BLEU     METEOR   WER      NIST
Proposed   0.5670   0.7844   0.3360   9.6467
MOSES      0.5504   0.7748   0.3541   9.4419
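
As a rough illustration, the corpus-level BLEU figures in Table 10 can be recomputed from plain-text system outputs. The sketch below assumes the sacrebleu package and hypothetical files with one sentence per line; the official IWSLT evaluation used its own scoring tools, so tokenization and reference handling may differ slightly.

    # Minimal sketch (assumption): recompute corpus-level BLEU with sacrebleu.
    import sacrebleu

    def read_lines(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f]

    hypotheses = read_lines("proposed.en")    # hypothetical file of system outputs
    references = read_lines("reference.en")   # hypothetical file of reference translations

    # sacrebleu takes a list of hypothesis strings and a list of reference streams.
    bleu = sacrebleu.corpus_bleu(hypotheses, [references])
    print("BLEU = %.4f" % (bleu.score / 100.0))  # sacrebleu reports 0-100; Table 10 uses 0-1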

There were 464 test sentences in the IWSLT2010 task. Of these 464 sentences, 151 matched the French-English patterns. In the ``English''-English translation results, 80 of these 151 sentences differed from the output of the standard SMT system (Moses). The remaining 313 sentences did not match any French-English pattern, and their outputs were identical to those of the standard SMT system (Moses).

For the IWSLT2009 task, there were 469 test sentences. Of these 469 sentences, 147 matched the French-English patterns. In the ``English''-English translation results, 77 of these 147 sentences differed from the output of the standard SMT system (Moses). The remaining 322 sentences did not match any French-English pattern, and their outputs were identical to those of the standard SMT system (Moses).
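
The sentence counts above follow from a line-by-line comparison of the two system outputs. A minimal sketch of such a comparison is given below; the file names are assumptions, and both files are expected to hold one sentence per line in the same order.

    # Minimal sketch (assumption): count outputs identical to the Moses baseline.
    def read_lines(path):
        with open(path, encoding="utf-8") as f:
            return [line.strip() for line in f]

    proposed = read_lines("proposed.en")  # hypothetical file: proposed system output
    moses = read_lines("moses.en")        # hypothetical file: Moses baseline output
    assert len(proposed) == len(moses)

    same = sum(p == m for p, m in zip(proposed, moses))
    print("%d sentences: %d identical to Moses, %d different"
          % (len(proposed), same, len(proposed) - same))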


Table 11: Example Outputs for BTEC-FE
02 Input J'ai un rhume .
Proposed I have a cold .
PBMT I a your name cold .
MOSES I have a cold .
09 Input Puis-je voir votre billet d'avion ?
Proposed Can I see your airline ticket ?
PBMT Can I see your ticket airline ?
MOSES Can I see your airline ticket ?
11 Input Veuillez me donner votre adresse .
Proposed Please me give your address .
PBMT Please me give your address .
MOSES Please give me your address .
12 Input Je ne comprends pas .
Proposed I don't understand .
PBMT I don't understand don't .
MOSES I don't understand .
21 Input Nous nous intéressons à la peinture .
Proposed We're interested in painting .
PBMT We're interested in paint .
MOSES We're interested in at the paint .
24 Input C'est merveilleux .
Proposed That's wonderful .
PBMT It's wonderful much .
MOSES That's wonderful .
26 Input Combien de temps allez-vous rester ?
Proposed How long will you be staying ?
PBMT How many long stay is it ?
MOSES How long will you be staying ?
28 Input Avez-vous des pulls en cachemire ?
Proposed Do you have any sweater cashmere in ?
PBMT Do you have any sweater in cashmere ?
MOSES Do you have any pulls in of cashmere ?
30 Input C'est notre limite .
Proposed It's our your limit .
PBMT It's our your name limit .
MOSES It's our latest .
32 Input Je prends le vol dix pour Tokyo .
Proposed I'm taking flight for ten Tokyo .
PBMT I take to flight it ten for Tokyo .
MOSES I'm taking flight ten to Tokyo .
71 Input Combien en tout ?
Proposed How much is in all ?
PBMT How much is in all ?
MOSES How much altogether ?
75 Input Quel est le numéro de l'ambassade japonaise ?
Proposed What's Embassy Japanese number ?
PBMT What's Embassy Japanese number ?
MOSES What's the number of the Japanese embassy ?
356 Input Je voudrais louer ce type de voiture pour une semaine .
Proposed I'd like to rent this type of car for a week .
PBMT I'd like to rent this type of car for a week .
MOSES I'd like to rent this kind of car for a week .

Table 11 lists example outputs of our proposed system for the BTEC-FE task. These examples are taken from the IWSLT2010 task, and all of them matched the French-English patterns. In this table, ``Input'' indicates the input French sentence, ``Proposed'' indicates the output of our proposed system (RBMT+SMT), ``PBMT'' indicates the output of the automatically created PBMT, and ``MOSES'' indicates the output of the standard SMT system.

