A Joint Sequence Translation Model with Integrated Reordering


1 A Joint Sequence Translation Model with Integrated Reordering Nadir Durrani Advisors: Alexander Fraser and Helmut Schmid Institute for Natural Language Processing University of Stuttgart

2 Machine Translation Problem: Automatic translation of the foreign text From Koehn

3 Machine Translation Problem: Automatic translation of the foreign text Approach: The text is actually written in English, encoded with strange symbols; the problem is to decode these symbols Warren Weaver (1949) 3

4 The Rosetta Stone Egyptian language was a mystery for centuries The Rosetta stone is written in three scripts Hieroglyphic (used for religious documents) Demotic (common script of Egypt) Greek (language of rulers of Egypt at that time) From Koehn

5 Parallel Data 5

6 Parallel Data Parallel data is available for several language pairs Several million sentences for Arabic-English and Chinese-English 11 European languages (Europarl) Monolingual data is available in even larger quantities 6

7 Statistical Machine Translation From Koehn

8 Statistical Machine Translation Main Modules Language Model N-gram models are commonly used Hierarchical models as language model Translation Model Modeling of translation units Words, Phrases, Trees Reordering Decoding Future cost estimation Search and pruning strategies 8
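
These modules are typically tied together by the standard noisy-channel decoding objective; the following is a generic sketch rather than a formula taken from the deck:

```latex
\hat{e} = \arg\max_{e} \; p(e \mid f)
        = \arg\max_{e} \; p_{\mathrm{TM}}(f \mid e)\, p_{\mathrm{LM}}(e)
```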

9 Contributions of this Work A new SMT model that Handles local and long-distance reorderings uniformly Enables source-side discontinuities Enables source word deletion Overcomes the phrasal segmentation problem Considers all possible orderings during decoding Removes the distortion limit Integrates reordering inside the translation model Study of joint models for translation Comparing with: Moses: a phrase-based system Noisy Channel Model N-gram Joint Source Channel Model 9

10 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 10

11 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 11

12 Phrase-based SMT State-of-the-art for many language pairs Morgen fliege ich nach Kanada zur Konferenz Tomorrow I will fly to the conference in Canada From Koehn

13 Phrase-based SMT Local and Non-local Reorderings State-of-the-art for many language pairs Morgen fliege ich nach Kanada zur Konferenz Tomorrow I will fly to the conference in Canada From Koehn

14 Problems Local and non-local dependencies are handled differently Gappy units are not allowed outside phrases Deletions and insertions are handled only inside phrases Weak reordering model Hard reordering limit necessary during decoding Spurious phrasal segmentation 14

15 Inconsistent Handling Local and Non-local Dependencies Er hat eine Pizza gegessen He has eaten a pizza 15

16 Inconsistent Handling Local and Non-local Dependencies Phrases p1, p2: Er hat eine Pizza gegessen – He has eaten a pizza Phrase Table: Er hat – He has; eine Pizza gegessen – eaten a pizza; eine Pizza – a pizza; gegessen – eaten 16

17 Inconsistent Handling Local and Non-local Dependencies Sie hat eine Pizza gegessen Phrase Table: Er hat – He has; eine Pizza gegessen – eaten a pizza; eine Pizza – a pizza; gegessen – eaten Reordering is internal to the phrase and is therefore handled by the translation model, independently of the reordering model!!! 17

18 Inconsistent Handling Local and Non-local Dependencies Sie hat eine Pizza gegessen Er hat schon sehr viel Schoko gegessen Phrase Table: Er hat – He has; eine Pizza gegessen – eaten a pizza; eine Pizza – a pizza; gegessen – eaten; sehr viel – so much; schon – already; Schoko – chocolate Non-local reorderings have to be handled by the reordering model!!! 18

19 Discontinuities Handled Only Inside Phrases dann hat er ein Buch gelesen then he read a book Discontinuities can be handled inside a phrase! 19

20 Discontinuities Handled Only Inside Phrases dann hat er ein Buch gelesen then he read a book Discontinuities can be handled inside a phrase! dann hat er eine Zeitung gelesen Then he has read a newspaper Cannot learn discontinuous phrases! 20

21 Deletions and Insertions Handled Only Inside Phrases kommen sie mit come with me Delete sie when it follows kommen or precedes mit lesen sie bitte mit Can't delete sie in this context please read you with me 21

22 PBSMT Reordering Model (Galley et al. 2008) Phrases p1, p2, p3: Er hat eine Pizza gegessen – He has eaten a pizza Orientation(gegessen, eaten) = Discontinuous looking backward (a_i, a_{i-1}) Orientation(gegessen, eaten) = Swap looking forward (a_i, a_{i+1}) 22

23 Orientation(gegessen, eaten): Discontinuous looking backward (a_i, a_{i-1}); Swap looking forward (a_i, a_{i+1}) 23
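
To make the orientation classes concrete, here is a small Python sketch of the basic (non-hierarchical) orientation classification used by lexicalized reordering models; Galley et al. (2008) generalize this with hierarchical spans, so treat it as a simplified illustration with made-up names:

```python
def orientation(prev_span, cur_span):
    """Classify the orientation of the current phrase w.r.t. the previously
    translated one. Spans are (start, end) source-word indices, inclusive."""
    prev_start, prev_end = prev_span
    cur_start, cur_end = cur_span
    if prev_end + 1 == cur_start:   # previous phrase ends right before current
        return "Monotone"
    if cur_end + 1 == prev_start:   # previous phrase starts right after current
        return "Swap"
    return "Discontinuous"

# Er hat | eine Pizza | gegessen  ->  He has | eaten | a pizza
# Backward: (gegessen, eaten) vs. previously translated (Er hat, He has)
print(orientation((0, 1), (4, 4)))   # Discontinuous
# Forward: (gegessen, eaten) vs. next translated (eine Pizza, a pizza)
print(orientation((4, 4), (2, 3)))   # Swap
```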

24 PBSMT Reordering Model (Galley et al. 2008) Er hat schon sehr viel Schoko gegessen Phrase Table: Er hat – He has; gegessen – eaten; sehr viel – so much; schon – already Hypotheses: He has already eaten so much chocolate / He has eaten so much chocolate already / He has so much eaten chocolate already / He has already so much eaten chocolate For every hypothesis O(gegessen, eaten) is Discontinuous backward and Swap forward, so the language model is required to break the tie!!! 24

25 Distortion Limit A hard distortion limit is needed during decoding, or performance drops Reordering distance of <= 6 words Example: DE: ich_1 möchte_2 am_3 Montag_4 abend_5 für_6 unsere_7 Gäste_8 Truthahn_9 machen_10 EN: I [would like]_2 [to make]_10 Turkey for our guests on Monday 25

26 Spurious Phrasal Segmentation Training 26

27 Spurious Phrasal Segmentation Decoding Phrase-based SMT Stack 1 Stack 2 Stack 3 Stack 4 Stack 5 He has eaten He has eaten He has eaten a pizza 27

28 Spurious Phrasal Segmentation Decoding Phrase-based SMT Stack 1 Stack 2 Stack 3 Stack 4 Stack 5 He has eaten He has eaten He has eaten a pizza Minimal Units He has eaten a pizza a pizza eaten 28

29 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 29

30 Joint Probability Models for SMT Noisy Channel Model Captures how source string can be mapped to the target string Joint Model How source and target strings can be generated simultaneously 30

31 Joint Probability Models for SMT Noisy Channel Model Captures how source string can be mapped to the target string Joint Model How source and target strings can be generated simultaneously Direct Phrase Extraction through EM Marcu and Wong (2002) Unigram Orientation Model Tillmann and Xia (2003) N-gram Based SMT Marino et al. (2006) Many papers to date 31

32 Joint Probability Models for SMT System | Translation Units | Context | Reordering: Marcu and Wong | Phrases | Unigram | IBM Model 3; Tillmann and Xia | Phrases | Unigram | Orientation Model; N-gram based SMT | Tuples | Trigram | Source Linearization 32

33 Why Minimal Units? Modeling local and long-distance reorderings in a unified manner Avoiding the phrasal segmentation problem 33

34 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 34

35 N-gram based SMT Based on bilingual units called tuples, modeled with an n-gram model A tuple is extracted such that A monotonic segmentation of each bilingual sentence pair is produced No word inside a tuple is aligned to words outside of the tuple No smaller tuples can be extracted without violating the previous conditions 35
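
A minimal Python sketch of these extraction conditions, assuming the alignment has already been post-edited so that every source and target word is aligned (as described later in the talk); the function name and data representation are illustrative. On the running example it reproduces the tuples shown on slide 37.

```python
def extract_tuples(src, tgt, links):
    """Minimal monotone bilingual segmentation into tuples (a sketch).

    src, tgt : lists of tokens
    links    : set of (src_index, tgt_index) alignment links
    Assumes every source and target word is aligned, so the whole
    sentence pair decomposes into tuples."""
    tuples = []
    s_start, t_start = 0, 0
    for t_end in range(len(tgt)):                 # candidate target-side cut
        # rightmost source word aligned to any target word up to t_end
        s_end = max((s for s, t in links if t <= t_end), default=-1)
        # the cut is monotone iff no source word inside the segment
        # is aligned to a target word beyond the cut
        crossing = any(s <= s_end and t > t_end for s, t in links)
        if not crossing and s_end >= s_start:     # earliest valid cut -> minimal tuple
            tuples.append((src[s_start:s_end + 1], tgt[t_start:t_end + 1]))
            s_start, t_start = s_end + 1, t_end + 1
    return tuples

src = "Er hat eine Pizza gegessen".split()
tgt = "He has eaten a pizza".split()
links = {(0, 0), (1, 1), (4, 2), (2, 3), (3, 4)}
print(extract_tuples(src, tgt, links))
# [(['Er'], ['He']), (['hat'], ['has']),
#  (['eine', 'Pizza', 'gegessen'], ['eaten', 'a', 'pizza'])]
```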

36 Example Er hat eine Pizza gegessen He has eaten a pizza 36

37 Example Er hat eine Pizza gegessen He has eaten a pizza Tuples: t_1 = Er – He, t_2 = hat – has, t_3 = eine Pizza gegessen – eaten a pizza Banchs et al.

38 Translation Model 38
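
In N-gram based SMT the translation model is conventionally an n-gram model over the tuple sequence (s,t)_1 ... (s,t)_K (cf. Mariño et al. 2006); a sketch of that form:

```latex
p(S, T) \approx \prod_{k=1}^{K} p\big((s,t)_k \mid (s,t)_{k-1}, \ldots, (s,t)_{k-n+1}\big)
```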

39 Tuple Unfolding Source Linearization Original tuples t_1 t_2 t_3: Er hat eine Pizza gegessen – He has eaten a pizza After unfolding into t_1 t_2 t_3 t_4 t_5: Er hat gegessen eine Pizza – He has eaten a pizza 39

40 Reordering Re-write Rules eine Pizza gegessen → gegessen eine Pizza Crego et al. (2005) POS tags to alleviate sparsity in rules: DT NN VB → VB DT NN Crego et al. (2006) 40
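
A toy Python sketch of applying such a POS rewrite rule to propose one reordered source variant for the input lattice; the tag set follows the deck's examples and everything else (function name, data layout) is an illustrative assumption:

```python
def apply_rule(words, tags, lhs, rhs):
    """Apply a POS rewrite rule (e.g. DT NN VB -> VB DT NN) to every matching
    window and return the reordered source variants."""
    variants = []
    n = len(lhs)
    for i in range(len(tags) - n + 1):
        if tags[i:i + n] == lhs:
            # map each rhs tag to the position of a matching lhs tag
            order, used = [], set()
            for t in rhs:
                for k in range(n):
                    if lhs[k] == t and k not in used:
                        order.append(k)
                        used.add(k)
                        break
            reordered = [words[i + k] for k in order]
            variants.append(words[:i] + reordered + words[i + n:])
    return variants

words = ["Er", "hat", "einen", "Erdbeerkuchen", "gegessen"]
tags  = ["PN", "AUX", "DT", "NN", "VB"]
print(apply_rule(words, tags, ["DT", "NN", "VB"], ["VB", "DT", "NN"]))
# [['Er', 'hat', 'gegessen', 'einen', 'Erdbeerkuchen']]
```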

41 Lattice Based Decoding The input sentence is handled as a word graph A monotonic search graph consists of a single path Er hat einen Erdbeerkuchen gegessen The graph is extended with new arcs for reordering Rule used: DT NN VB → VB DT NN 41

42 Lattice Based Decoding The input sentence is handled as a word graph A monotonic search graph consists of a single path Er hat einen Erdbeerkuchen gegessen The graph is extended with a new arc for reordering: gegessen einen Erdbeerkuchen, added alongside the original path Er hat einen Erdbeerkuchen gegessen 42

43 Lattice Based Decoding Search graph is traversed to find the best path Two paths for the discussed example Er hat einen Erdbeerkuchen gegessen Er hat gegessen einen Erdbeerkuchen 43

44 Problems Weak reordering model Search is performed only on pre-calculated reorderings The target side is ignored in pre-calculated reorderings Heavily relies on POS tags for reordering No ability to use lexical triggers The extracted rule DT NN VB → VB DT NN fails for the example: Er hat schon sehr viel Schoko gegessen 44

45 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 45

46 This Work: Introduction Generation of a bilingual sentence pair through a sequence of operations Operation: Translate or Reorder P(E,F,A) = probability of the operation sequence required to generate the bilingual sentence pair 46

47 Joint Probability Models for SMT Extension of N-gram based SMT Sequence of operations rather than tuples Integrated reordering rather than source linearization + rule extraction System | Translation Units | Context | Reordering: Marcu and Wong | Phrases | Unigram | IBM Model 3; Tillmann and Xia | Phrases | Unigram | Orientation Model; N-gram based SMT | Tuples | Trigram | Source Linearization; Our Model | Operations Encapsulating Tuples | 9-gram | Integrated Reordering 47

48 List of Operations 4 Translation Operations Generate (X, Y) Continue Source Cept Generate Identical Generate Source Only (X) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward 48

49 Example Er hat eine Pizza gegessen He has eaten a pizza 49

50 Example Er hat eine Pizza gegessen He has eaten a pizza Simultaneous generation of source and target Generation is done in order of the target sentence Reorder when the source words are not in the same order 50

51 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Er He 51

52 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Generate hat has Er hat He has 52

53 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Generate hat has Er hat Insert gap He has 53

54 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Generate hat has Insert gap Generate gegessen eaten Generated so far: Er hat … gegessen / He has eaten 54

55 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Generate hat has Er hat gegessen Insert gap Generate gegessen eaten He has eaten Jump back (1) 55

56 Example Er hat eine Pizza gegessen He has eaten a pizza Operations Generate Er He Generate hat has Insert gap Generate gegessen eaten Jump back (1) Generate eine a Generated so far: Er hat eine … gegessen / He has eaten a 56

57 Example Operations Generate Er He Generate hat has Insert gap Generate gegessen eaten Jump back (1) Generate eine a Generate Pizza pizza Er hat eine Pizza gegessen He has eaten a pizza 57
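
To make the generative story concrete, here is a toy Python interpreter (not the actual decoder) that replays this operation sequence and reconstructs the sentence pair; Continue Source Cept, Generate Identical and Generate Source Only are omitted for brevity, and the data layout is purely illustrative:

```python
GAP = "_"   # placeholder for an open gap in the partially generated source

def simulate(ops):
    """Replay a sequence of translation/reordering operations (toy sketch)."""
    src, tgt = [], []
    pos = 0            # current write position in the source
    gaps = []          # positions of currently open gaps, kept sorted
    for op in ops:
        name, args = op[0], op[1:]
        if name == "generate":                 # Generate (X, Y)
            x, y = args
            for w in x.split():
                src.insert(pos, w)
                gaps = [g + 1 if g >= pos else g for g in gaps]
                pos += 1
            tgt.extend(y.split())
        elif name == "insert_gap":             # Insert Gap
            src.insert(pos, GAP)
            gaps = [g + 1 if g >= pos else g for g in gaps] + [pos]
            gaps.sort()
            pos += 1
        elif name == "jump_back":              # Jump Back (n): n-th open gap from the right
            n = args[0]
            pos = gaps.pop(-n)
            src.pop(pos)                       # this gap is now being filled
            gaps = [g - 1 if g > pos else g for g in gaps]
        elif name == "jump_forward":           # Jump Forward: after the last source word
            pos = len(src)
    return " ".join(src), " ".join(tgt)

ops = [("generate", "Er", "He"), ("generate", "hat", "has"), ("insert_gap",),
       ("generate", "gegessen", "eaten"), ("jump_back", 1),
       ("generate", "eine", "a"), ("generate", "Pizza", "pizza")]
print(simulate(ops))   # ('Er hat eine Pizza gegessen', 'He has eaten a pizza')
```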

58 Lexical Trigger Er hat gegessen He has eaten Generate Er He Generate hat has Insert Gap Generate gegessen eaten Jump Back (1) 58

59 Generalizing to Unseen Context Er hat einen Erdbeerkuchen gegessen He has eaten a strawberry cake Generate Er-He Generate hat-has Insert Gap Generate gegessen-eaten Jump Back (1) Generate einen-a Generate Erdbeerkuchen strawberry cake 59

60 Handling Non-local Reordering Er hat schon sehr viel Schoko gegessen He has eaten so much chocolate already Generate Er He Generate hat has Insert Gap Generate gegessen eaten Jump Back (1) Insert Gap Generate sehr viel so much Continue Source Cept Generate Schoko, chocolate Jump Back Generate schon already 60

61 Contributions common to N-gram based SMT Using minimal units to avoid spurious phrasal segmentation Using source and target information with context Handling local and long distance reordering in a unified manner 61

62 Contributions Reordering decisions depend upon translation decisions and vice versa Example: Gap insertion is probable after Generate hat has in order to move gegessen to the right position Ability to learn lexical triggers, unlike N-gram based SMT which uses POS-based re-write rules 62

63 Source Side Discontinuities er hat ein Buch gelesen he read a book 63

64 Source Side Discontinuities er hat ein Buch gelesen he read a book Operations er Generate er he he 64

65 Source Side Discontinuities er hat ein Buch gelesen he read a book Operations er hat Generate er he Generate hat [gelesen] read he read 65

66 Source Side Discontinuities er hat ein Buch gelesen he read a book Operations er hat gelesen Generate er he Generate hat [gelesen] read Insert Gap Continue Source Cept he read 66

67 Source Side Discontinuities er hat ein Buch gelesen he read a book Operations Generate er he Generate hat [gelesen] read Insert Gap Continue Source Cept Jump Back (1) Generate ein a Generate Buch book er hat ein Buch gelesen he read a book 67

68 Contribution Handling Source Side Discontinuities hat gelesen Generate hat [gelesen] read Insert Gap read Continue Source Cept 68

69 Contribution Handling Source Side Discontinuities Generate hat gelesen read hat gelesen Insert Gap Continue Source Cept read Generalizing to unseen sentences sie hat eine Zeitung gelesen Phrases with gaps (Galley and Manning 2010) 69

70 Source Word Deletion Model kommen sie mit come with me 70

71 Source Word Deletion Model kommen sie mit come with me Operations Generate kommen come Generate Source Only sie Generate mit with me 71

72 Contribution Ability to delete source words during decoding lesen sie bitte mit please read with me Models for deleting source word in Moses (Li et al. 2008) 72

73 Learning Phrases through Operation Sequences über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures Phrase Pair : nicht verhandeln wollen ~ do not want to negotiate Generate (nicht, do not) Insert Gap Generate (wollen, want to) Jump Back(1) Generate (verhandeln, negotiate) 73

74 Model Joint-probability model over operation sequences 74
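
As a sketch (an assumed form, consistent with the 9-gram operation model mentioned on the next slide), the joint model can be written as an n-gram model over the operation sequence o_1 ... o_J that generates the aligned sentence pair:

```latex
p(F, E, A) = \prod_{j=1}^{J} p\big(o_j \mid o_{j-n+1}, \ldots, o_{j-1}\big)
```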

75 Search Search combines the operation model with the language model (a sketch follows below) Language model (p_LM): 5-gram Operation model and prior probability (p_pr): 9-gram Implemented in a stack-based beam decoder 75
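
A sketch of the kind of objective this amounts to (an assumed form; the full system also weights the additional features listed on the next slides):

```latex
\hat{E} = \arg\max_{E} \; p_{pr}(o_1, \ldots, o_J)\; p_{LM}(E)
```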

76 Other Features Length Penalty : Counts the number of target words produced Deletion Penalty : Counts the number of source words deleted 76

77 Other Features Length Penalty : Counts the number of target words produced Deletion Penalty : Counts the number of source words deleted Gap Penalty : Counts the number of gaps inserted Open Gap Penalty : Number of open gaps, paid once per each translation operation 77

78 Other Features Length Penalty : Counts the number of target words produced Deletion Penalty : Counts the number of source words deleted Gap Penalty : Counts the number of gaps inserted Open Gap Penalty : Number of open gaps, paid once per each translation operation Reordering Distance : Distance from the last translated tuple Gap Width : Distance from the first open gap 78

79 Other Features Length Penalty : Counts the number of target words produced Deletion Penalty : Counts the number of source words deleted Gap Penalty : Counts the number of gaps inserted Open Gap Penalty : Number of open gaps, paid once per each translation operation Reordering Distance : Distance from the last translated tuple Gap Width : Distance from the first open gap Lexical Probabilities : Source-to-Target and Target-to-Source lexical translation probabilities 79
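
A toy Python illustration (not the actual decoder) of how the gap-related penalties above could be accumulated as operations are applied; the state layout and function names are illustrative assumptions:

```python
from dataclasses import dataclass, field

@dataclass
class FeatureState:
    open_gaps: list = field(default_factory=list)  # source positions of open gaps
    last_pos: int = -1                             # source position of last translated tuple
    gap_penalty: int = 0                           # number of gaps inserted
    open_gap_penalty: int = 0                      # open gaps, paid once per translation op
    reordering_distance: int = 0                   # distance from the last translated tuple
    gap_width: int = 0                             # distance from the first open gap

def on_insert_gap(st: FeatureState, gap_pos: int) -> None:
    st.gap_penalty += 1
    st.open_gaps.append(gap_pos)

def on_translate(st: FeatureState, src_pos: int) -> None:
    """Update the features when a tuple covering src_pos is generated."""
    st.reordering_distance += abs(src_pos - (st.last_pos + 1))
    if st.open_gaps:
        st.gap_width += abs(src_pos - min(st.open_gaps))
    st.open_gap_penalty += len(st.open_gaps)
    st.last_pos = src_pos
```

The length and deletion penalties would simply count generated target words and deleted source words in the same way.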

80 Experimental Setup Language Pairs: German, Spanish and French to English Data: 4th version of the Europarl corpus Bilingual data: 200K parallel sentences (reduced version of WMT 09), ~74K News Commentary + ~126K Europarl Monolingual data: 500K sentences = 300K from the monolingual corpus (News Commentary) + 200K from the English side of the bilingual corpus Standard WMT 2009 sets for tuning and testing 80

81 Training & Tuning GIZA++ for word alignment Heuristic modification of alignments to remove target-side gaps and unaligned target words (see the paper for details) Convert the word-aligned bilingual corpus into an operation corpus (see paper for details) SRILM toolkit to train n-gram language models with Kneser-Ney smoothing Parameter tuning with Z-MERT 81

82 Results Baseline: Moses (with lexicalized reordering) with default settings A 5-gram language model (same as ours) Two baselines: with no distortion limit and with a reordering limit of 6 Two variants of our system: using no reordering limit, and using a gap-width of 6 as a reordering limit 82

83 Using Non-Gappy Source Cepts Results (BLEU) on German, Spanish and French for: Baseline - no reordering limit; Baseline - reordering limit = 6; This Work - no reordering limit; This Work - reordering limit = 6 Moses' score without a reordering limit drops by more than a BLEU point Our best system, This Work - no reordering limit, gives statistically significant improvements over Baseline - reordering limit = 6 for German and Spanish, and comparable results for French 83

84 Contribution Removing the hard reordering limit Consider all possible permutations Example: DE: 74_1 %_2 würden_3 gegen_4 die_5 studiengebühren_6 ,_7 79_8 %_9 gegen_10 die_11 praxisgebühr_12 ,_13 und_14 [84_15 %_16 gegen_17 das_18 krankenhaus-taggeld_19 stimmen_20] EN: 74% would vote_20 against the tuition fees, 79% against the clinical practice, and 84% against the hospital daily allowance. Moses: 74% would against the tuition fees, 79% against the clinical practice, and 84% vote_20 against the hospital daily allowance. Increasing the distortion limit to 15 in Phrasal (Green et al. 2010) 84

85 Gappy + Non-Gappy Source Cepts Results (BLEU) on German, Spanish and French for: This Work - no reordering limit; This Work - reordering limit = 6; This Work - no reordering limit + gappy units; This Work - reordering limit = 6 + gappy units

86 Sample Output DE: letzte Woche kehrten die Kinder zu Ihren Eltern zurück EN: Last week, the children returned to their biological parents 86

87 Why didn't Gappy-Cepts improve performance? Using all source gaps explodes the search space Number of tuples using 10-best translations (German / Spanish / French): Gaps: 965,515 / 1,705,156 / 1,473,798; No Gaps: 256,… / …,690 / …,220 87

88 Why didn't Gappy-Cepts improve performance? Using all source gaps explodes the search space Number of tuples using 10-best translations (German / Spanish / French): Gaps: 965,515 / 1,705,156 / 1,473,798; No Gaps: 256,… / …,690 / …,220 Future cost is incorrectly estimated in the case of gappy cepts 88

89 Heuristic Use only the gappy cepts with scores better than the sum of their parts: log p(habe gemacht → made) > log p(habe → have) + log p(gemacht → made) Number of tuples (German / Spanish / French): Gaps: 965,515 / 1,705,156 / 1,473,798; No Gaps: 256,… / …,690 / …,220; Heuristic: 281,… / … / …,869 89
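
A small Python sketch of this filtering heuristic; the probability values are assumed to come from the translation model, and the function name is illustrative:

```python
import math

def keep_gappy_cept(p_gappy, p_parts):
    """Keep a gappy cept only if it scores better than the sum of its parts,
    e.g. log p(habe gemacht -> made) > log p(habe -> have) + log p(gemacht -> made)."""
    return math.log(p_gappy) > sum(math.log(p) for p in p_parts)

# keep_gappy_cept(p_of("habe gemacht", "made"),
#                 [p_of("habe", "have"), p_of("gemacht", "made")])
```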

90 With Gappy Source Cepts + Heuristic Results (BLEU) on German, Spanish and French for: This Work - no reordering limit + gappy units; This Work - reordering limit = 6 + gappy units; This Work - no reordering limit + gappy units + heuristic; This Work - reordering limit = 6 + gappy units + heuristic

91 Road Map Literature Review Phrase-based SMT Overview Problems Joint Models N-gram-based Overview Problems This Work Model Contributions Shortcomings and Future Work Summary 91

92 Shortcomings of this Model More difficult search problem Decoding Future cost estimation Target unaligned words are not allowed Target side gaps are not allowed 92

93 More difficult Search Decoding Stack 1 Stack 2 Stack 3 Nach meine meinung In my opinion 93

94 More difficult Search Decoding Stack 1 Stack 2 Stack 3 Translation options: nach → after / to / by / in; meine → my; nach meine meinung → in my opinion 94

95 More difficult Search Future Cost Estimation Future cost estimate available to phrase-based SMT Translation Model: p(in my opinion | nach meine meinung) Language Model: p(in) * p(my | in) * p(opinion | in my) Future cost estimate available to N-gram SMT Translation Model: p(after, nach) * p(my, meine) * p(opinion, meinung) Language Model: p(after) * p(my) * p(opinion) 95

96 Shortcomings of this Model More difficult search problem Decoding Future cost estimation Target unaligned words are not allowed Model: Introduce operation Generate Target Only (X) Decoding: Spurious generation of target words? Target side gaps are not allowed 96

97 Shortcomings of this Model More difficult search problem Decoding Future cost estimation Target unaligned words are not allowed Model: Introduce operation Generate Target Only (X) Decoding: Spurious generation of target words? Target side gaps are not allowed 97

98 Future Work Handling of Target Gaps German to English: hat ein Buch gelesen → read a book: Generate (hat gelesen, read), Insert Gap, Continue Source Cept English to German: read a book → hat ein Buch gelesen: Generate (read, hat) Problem: the target is generated left to right, so when should gelesen be generated? 98

99 Future Work Handling Target Gaps Solution 1: Generate (read, hat (gelesen)), then Continue Target Cept (Galley and Manning 2010) Solution 2: Use split rules (Crego and Yvon 2009): Generate (read_1, hat), Generate (read_2, gelesen) Problem: how to split read into read_1 and read_2 during decoding? read a book → read_1 a book read_2 Solution 3: Introduce an Insert Gap operation on the target side: read hat gelesen (parsing-based decoders) 99

100 Future Work Factored Machine Translation (Koehn and Hoang 2007) Using POS tags and morphological information er hat gegessen: PN AUX VB he has eaten: PN AUX VB Helpful for translating: Ich habe eine Tasse Tee gemacht 100

101 Using Source Side Information Source-side language model Source-side syntax 101

102 Using Source Side Information Source-side language model gestern hat er ein Pizza gegessen he ate a pizza yesterday 102

103 Source Side Language Model Using a source-side LM to guide reordering S: gestern hat er ein Pizza gegessen he ate a pizza yesterday S': er hat gegessen ein Pizza gestern Build a language model on S' and use it as a feature (Feng et al. 2010) 103

104 Source Side Syntax Finding path of jump 104

105 Source Side Syntax Finding path of jump (NN, ADV-MO) : L VP-OC S ADV-MO POS tags of source and target words Direction of jump (Left or Right) Sequence of parse labels encountered on the traversal from start to end 105

106 Summary A new SMT Model that Integrates translation and reordering in a single generative story Uses bilingual context (like N-gram based SMT) Improved reordering mechanism Models local and non-local dependencies uniformly (unlike PBSMT) Takes previously translated words into account (unlike PBSMT) Has ability to use lexical triggers (unlike N-gram SMT) Considers all possible reorderings during search (unlike N-gram SMT) 106

107 Summary A new SMT Model that Enables source-side gaps Enables source word deletion Does not have spurious phrasal segmentation (like N-gram SMT) Does not use a hard distortion limit Compared with the state-of-the-art Moses system: significantly better results for German-to-English and Spanish-to-English, comparable results for French-to-English 107

108 Thank you - Questions? 108

109 Search and Future Cost Estimation The search problem is much harder than in PBSMT A larger beam is needed to produce translations similar to PBSMT Example: zum Beispiel → for example vs. zum → for, Beispiel → example Problem with future cost estimation Language model probability: Phrase-based: p(for) * p(example | for) Our model: p(for) * p(example) Future cost for reordering operations Future cost for the features gap penalty, gap-width and reordering distance

110 Future Cost Estimation with Source-Side Gaps Future cost estimation with source-side gaps is problematic Future cost for bigger spans: cost(I,K) = min over J in [I, K) of ( cost(I,J) + cost(J+1,K) ) cost(1,8) = min( {cost(1,1) + cost(2,8)}, {cost(1,2) + cost(3,8)}, ..., {cost(1,7) + cost(8,8)} )

111 Future Cost Estimation with Source-Side Gaps FC estimation with source-side gaps is problematic Future cost for bigger spans: cost(I,K) = min over J in [I, K) of ( cost(I,J) + cost(J+1,K) ) cost(1,8) = min( {cost(1,1) + cost(2,8)}, {cost(1,2) + cost(3,8)}, ..., {cost(1,7) + cost(8,8)} )

112 Future Cost Estimation with Source-Side Gaps FC estimation with source-side gaps is problematic Future cost for bigger spans: cost(I,K) = min over J in [I, K) of ( cost(I,J) + cost(J+1,K) ) cost(1,8) = min( {cost(1,1) + cost(2,8)}, {cost(1,2) + cost(3,8)}, ..., {cost(1,7) + cost(8,8)} )
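
A Python sketch of the standard span-combination recursion above (contiguous spans only, which is exactly what breaks down for gappy cepts on the next slides); `cept_cost` is an assumed lookup of the best single-cept cost for a contiguous span:

```python
def future_cost(n, cept_cost):
    """cost[(i, k)] = min over j of cost[(i, j)] + cost[(j+1, k)], seeded with
    the best single-cept cost for each contiguous span (inf if none exists)."""
    INF = float("inf")
    cost = {}
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            k = i + length - 1
            best = cept_cost.get((i, k), INF)
            for j in range(i, k):
                best = min(best, cost[(i, j)] + cost[(j + 1, k)])
            cost[(i, k)] = best
    return cost
```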

113 Future Cost Estimation with Source-Side Gaps Does not work for cepts with gaps Best way to cover words 1, 4 and 8 is through a gappy cept cost(1,8) = ? After computing cost(1,8) we do another pass to find min( cost(1,8), cost(2,3) + cost(5,7) + cost_of_cept(1 4 8) )

114 Future Cost Estimation with Source-Side Gaps Does not work for cepts with gaps Best way to cover words 1, 4 and 8 is through a gappy cept cost(1,8) = ? After computing cost(1,8) we do another pass to find min( cost(1,8), cost(2,3) + cost(5,7) + cost_of_cept(1 4 8) )

115 Future Cost Estimation with Source-Side Gaps Does not work for cepts with gaps Best way to cover words 1, 4 and 8 is through a gappy cept cost(1,8) = ? After computing cost(1,8) we do another pass to find min( cost(1,8), cost(2,3) + cost(5,7) + cost_of_cept(1 4 8) )

116 Future Cost Estimation with Source-Side Gaps Does not work for cepts with gaps Best way to cover words 1, 4 and 8 is through a gappy cept cost(1,8) = ? min( cost(1,8), cost(2,3) + cost(5,7) + cost_of_cept(1 4 8), cost(3,7) + cost_of_cept( ) )

117 Future Cost Estimation with Source-Side Gaps Still problematic when gappy cepts interleave Example: consider that the best way to cover words 1 and 5 is through the gappy cept (1 ... 5) The modification cannot capture that the best cost = cost_of_cept(1 ... 5) + cost_of_cept( ) + cost(3,3) + cost(6,7)

118 Future Cost Estimation with Source-Side Gaps Gives an incorrect cost if the coverage vector already covers a word between the words of a gappy cept The decoder has covered word 3 The future cost estimate cost(1,2) + cost(4,8) is wrong The correct estimate is cost_of_cept(1 4 8) + cost(2,2) + cost(5,7) No efficient way to cover all possible permutations

119 Target Side Gaps & Unaligned Words Our model does not allow target-side gaps and target unaligned words Post-editing of alignments: a 3-step process Step-I: Remove all target-side gaps For a gappy alignment, the link to the least frequent target word is identified The group of links that contains this word is retained Example: A B C D U V W X Y Z Target-side discontinuity!!

120 Target Side Gaps & Unaligned Words Our model does not allow target-side gaps and target unaligned words Post-editing of alignments: a 3-step process Step-I: Remove all target-side gaps For a gappy alignment, the link to the least frequent target word is identified The group of links that contains this word is retained Example: A B C D U V W X Y Z

121 Target Side Gaps & Unaligned Words Our model does not allow target-side gaps and target unaligned words Post-editing of alignments: a 3-step process Step-I: Remove all target-side gaps For a gappy alignment, the link to the least frequent target word is identified The group of links that contains this word is retained Example: A B C D U V W X Y Z No target-side gaps remain, but there are now target unaligned words!!!

122 Continued After Step-I A B C D U V W X Y Z Step-II: Counting over the training corpus to find the attachment preference of a word Count (U,V) = 1 Count (W,X) = 1 Count (W,X) = 1 Count (X,Y) = 0.5 Count (Y,Z) = 0.5

123 Continued Step-III: Attach target-unaligned words to the right or left based on the collected counts After Step-III: A B C D U V W X Y Z
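
A rough Python sketch of Step-III under the simplifying assumption that Step-II produced counts over adjacent target word pairs; the function name and data layout are illustrative:

```python
def attach_unaligned(tgt_words, aligned, pair_counts):
    """Attach each unaligned target word to its left or right neighbour,
    whichever adjacent pair was counted more often in Step-II.
    pair_counts[(a, b)] is the corpus count for the adjacent word pair (a, b);
    left attachment wins ties and at sentence edges."""
    attachment = {}
    for i, w in enumerate(tgt_words):
        if aligned[i]:
            continue
        left = pair_counts.get((tgt_words[i - 1], w), 0) if i > 0 else -1
        right = pair_counts.get((w, tgt_words[i + 1]), 0) if i + 1 < len(tgt_words) else -1
        attachment[i] = i - 1 if left >= right else i + 1
    return attachment
```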

124 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example Generate (gegessen, eaten) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward

125 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example Generate (Inflationsraten, inflation rate) Inflationsraten Inflation rate 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward

126 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward Example kehrten zurück returned Generate (kehrten zurück, returned) Insert Gap Continue Source Cept

127 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example Generate Identical instead of Generate (Portland, Portland) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward If count (Portland) = 1

128 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example kommen Sie mit come with me Generate Source Only (Sie) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward

129 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward Gap # 1 do not nicht

130 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward Gap # 2 Gap # 1 nicht wollen do not want to Jump Back (1)!!!

131 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward Gap # 1 do not want to negotiate nicht verhandeln wollen

132 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Example über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward Gap # 1 nicht verhandeln wollen do not want to negotiate Jump Back (1)!!!

133 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward über konkrete Zahlen nicht verhandeln wollen do not want to negotiate on specific figures

134 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) Jump Forward!!! 3 Reordering Operations Insert Gap über konkrete Zahlen nicht verhandeln wollen Jump Back (N) Jump Forward do not want to negotiate on specific figures

135 List of Operations 4 Translation Operations Generate (X,Y) Continue Source Cept Generate Identical Generate Source Only (X) 3 Reordering Operations Insert Gap Jump Back (N) Jump Forward über konkrete Zahlen nicht verhandeln wollen. do not want to negotiate on specific figures.


More information

NLP Programming Tutorial 5 - Part of Speech Tagging with Hidden Markov Models

NLP Programming Tutorial 5 - Part of Speech Tagging with Hidden Markov Models NLP Programming Tutorial 5 - Part of Speech Tagging with Hidden Markov Models Graham Neubig Nara Institute of Science and Technology (NAIST) 1 Part of Speech (POS) Tagging Given a sentence X, predict its

More information

Machine Translation for Human Translators

Machine Translation for Human Translators Machine Translation for Human Translators Michael Denkowski CMU-LTI-15-004 Language Technologies Institute School of Computer Science Carnegie Mellon University 5000 Forbes Ave., Pittsburgh, PA 15213 www.lti.cs.cmu.edu

More information

International Guest Students APPLICATION FORM

International Guest Students APPLICATION FORM International Guest Students APPLICATION FORM Applying as an international guest student at Hamburg University of Applied Sciences The academic year is split into two semesters. You can apply to come for

More information

Word Completion and Prediction in Hebrew

Word Completion and Prediction in Hebrew Experiments with Language Models for בס"ד Word Completion and Prediction in Hebrew 1 Yaakov HaCohen-Kerner, Asaf Applebaum, Jacob Bitterman Department of Computer Science Jerusalem College of Technology

More information

Language Model of Parsing and Decoding

Language Model of Parsing and Decoding Syntax-based Language Models for Statistical Machine Translation Eugene Charniak ½, Kevin Knight ¾ and Kenji Yamada ¾ ½ Department of Computer Science, Brown University ¾ Information Sciences Institute,

More information

Statistical Pattern-Based Machine Translation with Statistical French-English Machine Translation

Statistical Pattern-Based Machine Translation with Statistical French-English Machine Translation Statistical Pattern-Based Machine Translation with Statistical French-English Machine Translation Jin'ichi Murakami, Takuya Nishimura, Masato Tokuhisa Tottori University, Japan Problems of Phrase-Based

More information

Elena Chiocchetti & Natascia Ralli (EURAC) Tanja Wissik & Vesna Lušicky (University of Vienna)

Elena Chiocchetti & Natascia Ralli (EURAC) Tanja Wissik & Vesna Lušicky (University of Vienna) Elena Chiocchetti & Natascia Ralli (EURAC) Tanja Wissik & Vesna Lušicky (University of Vienna) VII Conference on Legal Translation, Court Interpreting and Comparative Legilinguistics Poznań, 28-30.06.2013

More information

Student Booklet. Name.. Form..

Student Booklet. Name.. Form.. Student Booklet Name.. Form.. 2012 Contents Page Introduction 3 Teaching Staff 3 Expectations 3 Speaking German 3 Organisation 3 Self Study 4 Course Details and Contents 5/6 Bridging the gap and quiz 7/8

More information

Domain-specific terminology extraction for Machine Translation. Mihael Arcan

Domain-specific terminology extraction for Machine Translation. Mihael Arcan Domain-specific terminology extraction for Machine Translation Mihael Arcan Outline Phd topic Introduction Resources Tools Multi Word Extraction (MWE) extraction Projection of MWE Evaluation Future Work

More information

IRIS - English-Irish Translation System

IRIS - English-Irish Translation System IRIS - English-Irish Translation System Mihael Arcan, Unit for Natural Language Processing of the Insight Centre for Data Analytics at the National University of Ireland, Galway Introduction about me,

More information

Scrambling in German - Extraction into the Mittelfeld

Scrambling in German - Extraction into the Mittelfeld Scrambling in German - Extraction into the Mittelfeld Stefan Mailer* Humboldt Universitat zu Berlin August, 1995 Abstract German is a language with a relatively free word order. During the last few years

More information

The Transition of Phrase based to Factored based Translation for Tamil language in SMT Systems

The Transition of Phrase based to Factored based Translation for Tamil language in SMT Systems The Transition of Phrase based to Factored based Translation for Tamil language in SMT Systems Dr. Ananthi Sheshasaayee 1, Angela Deepa. V.R 2 1 Research Supervisior, Department of Computer Science & Application,

More information

Scaling Shrinkage-Based Language Models

Scaling Shrinkage-Based Language Models Scaling Shrinkage-Based Language Models Stanley F. Chen, Lidia Mangu, Bhuvana Ramabhadran, Ruhi Sarikaya, Abhinav Sethy IBM T.J. Watson Research Center P.O. Box 218, Yorktown Heights, NY 10598 USA {stanchen,mangu,bhuvana,sarikaya,asethy}@us.ibm.com

More information

Motivation. Korpus-Abfrage: Werkzeuge und Sprachen. Overview. Languages of Corpus Query. SARA Query Possibilities 1

Motivation. Korpus-Abfrage: Werkzeuge und Sprachen. Overview. Languages of Corpus Query. SARA Query Possibilities 1 Korpus-Abfrage: Werkzeuge und Sprachen Gastreferat zur Vorlesung Korpuslinguistik mit und für Computerlinguistik Charlotte Merz 3. Dezember 2002 Motivation Lizentiatsarbeit: A Corpus Query Tool for Automatically

More information

International Guest Students APPLICATION FORM

International Guest Students APPLICATION FORM International Guest Students APPLICATION FORM Applying as an international guest student at Hamburg University of Applied Sciences The academic year is split into two semesters. You can apply to come for

More information

The Prague Bulletin of Mathematical Linguistics NUMBER 93 JANUARY 2010 37 46. Training Phrase-Based Machine Translation Models on the Cloud

The Prague Bulletin of Mathematical Linguistics NUMBER 93 JANUARY 2010 37 46. Training Phrase-Based Machine Translation Models on the Cloud The Prague Bulletin of Mathematical Linguistics NUMBER 93 JANUARY 2010 37 46 Training Phrase-Based Machine Translation Models on the Cloud Open Source Machine Translation Toolkit Chaski Qin Gao, Stephan

More information

An Online Service for SUbtitling by MAchine Translation

An Online Service for SUbtitling by MAchine Translation SUMAT CIP-ICT-PSP-270919 An Online Service for SUbtitling by MAchine Translation Annual Public Report 2011 Editor(s): Contributor(s): Reviewer(s): Status-Version: Volha Petukhova, Arantza del Pozo Mirjam

More information

Statistical NLP Spring 2008. Machine Translation: Examples

Statistical NLP Spring 2008. Machine Translation: Examples Statistical NLP Spring 2008 Lecture 11: Word Alignment Dan Klein UC Berkeley Machine Translation: Examples 1 Machine Translation Madame la présidente, votre présidence de cette institution a été marquante.

More information

Multipurpsoe Business Partner Certificates Guideline for the Business Partner

Multipurpsoe Business Partner Certificates Guideline for the Business Partner Multipurpsoe Business Partner Certificates Guideline for the Business Partner 15.05.2013 Guideline for the Business Partner, V1.3 Document Status Document details Siemens Topic Project name Document type

More information

The United Nations Parallel Corpus v1.0

The United Nations Parallel Corpus v1.0 The United Nations Parallel Corpus v1.0 Michał Ziemski, Marcin Junczys-Dowmunt, Bruno Pouliquen United Nations, DGACM, New York, United States of America Adam Mickiewicz University, Poznań, Poland World

More information

Training and evaluation of POS taggers on the French MULTITAG corpus

Training and evaluation of POS taggers on the French MULTITAG corpus Training and evaluation of POS taggers on the French MULTITAG corpus A. Allauzen, H. Bonneau-Maynard LIMSI/CNRS; Univ Paris-Sud, Orsay, F-91405 {allauzen,maynard}@limsi.fr Abstract The explicit introduction

More information