A Joint Sequence Translation Model with Integrated Reordering
1 A Joint Sequence Translation Model with Integrated Reordering Nadir Durrani, Helmut Schmid and Alexander Fraser Institute for Natural Language Processing University of Stuttgart
2 Introduction
Generation of a bilingual sentence pair through a sequence of operations
Operation: Translate or Reorder
P(E,F,A) = probability of the operation sequence required to generate the bilingual sentence pair
Extension of N-gram based SMT: a sequence of operations rather than tuples; integrated reordering rather than source linearization + rule extraction
3-11 Example
Er hat eine Pizza gegessen → He has eaten a pizza
Source and target are generated simultaneously; generation is done in the order of the target sentence, and reordering operations are used when the source words are not in the same order.
Operations:
Generate (Er, He)
Generate (hat, has)
Insert Gap
Generate (gegessen, eaten)
Jump Back (1)
Generate (eine, a)
Generate (Pizza, pizza)
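The generation story above can be replayed mechanically. Below is a minimal sketch (not the authors' decoder) of an interpreter for the three operations used in the example; the tuple encoding of operations and the function name `replay` are illustrative assumptions.

```python
def replay(source_words, operations):
    """Replay a (simplified) operation sequence; returns the target sentence.

    Generate(src, tgt): translate the leftmost untranslated occurrence of
    `src` at or to the right of the current position.
    Insert Gap: leave the words at the current position untranslated for now.
    Jump Back(n): return to the n-th open gap, counting from the right.
    """
    covered = [False] * len(source_words)
    target = []
    pos = 0
    gaps = []  # start positions of open gaps, left to right

    for op in operations:
        if op[0] == "Generate":
            _, src, tgt = op
            # advance to the word this operation translates
            while source_words[pos] != src or covered[pos]:
                pos += 1
            covered[pos] = True
            target.append(tgt)
            pos += 1
        elif op[0] == "Insert Gap":
            gaps.append(pos)
        elif op[0] == "Jump Back":
            pos = gaps.pop(-op[1])
    return target
```

Running it on the slide's operation sequence for "Er hat eine Pizza gegessen" reproduces "He has eaten a pizza" in target order.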
12 Lexical Trigger
Er hat gegessen → He has eaten
Generate (Er, He), Generate (hat, has), Insert Gap, Generate (gegessen, eaten), Jump Back (1)
The bilingual context hat … gegessen lexically triggers the Insert Gap.
13 Generalizing to Unseen Context
Er hat einen Erdbeerkuchen gegessen → He has eaten a strawberry cake
Generate (Er, He), Generate (hat, has), Insert Gap, Generate (gegessen, eaten), Jump Back (1), Generate (einen, a), Generate (Erdbeerkuchen, strawberry cake)
14 Generalizing to Unseen Context
Er hat einen Erdbeerkuchen und eine Menge Butterkekse gegessen → He has eaten a strawberry cake and a lot of butter cookies
Generate (Er, He), Generate (hat, has), Insert Gap, Generate (gegessen, eaten), Jump Back (1), Generate (einen, a), Generate (Erdbeerkuchen, strawberry cake), Generate (und, and), Generate (eine, a), Generate (Menge, lot of), Generate (Butterkekse, butter cookies)
15 Key Ideas - Contributions
Reordering is integrated into the translation model: translation and reordering decisions influence each other
Handles local and long-distance reordering in a unified manner
An operation model that accounts for: translation, reordering, source-side gaps, source-word deletion
Joint model with bilingual information (like N-gram SMT)
No spurious phrasal segmentation (like N-gram SMT)
No distortion limit
16 List of Operations
4 Translation Operations: Generate (X,Y), Continue Source Cept, Generate Identical, Generate Source Only (X)
3 Reordering Operations: Insert Gap, Jump Back (N), Jump Forward
17 Generate (X,Y): e.g. Generate (gegessen, eaten)
18 A single Generate can produce a multi-word target: Inflationsraten → inflation rate: Generate (Inflationsraten, inflation rate)
19 Continue Source Cept handles discontinuous source cepts: kehrten … zurück → returned: Generate (kehrten zurück, returned), Insert Gap, Continue Source Cept
20 Generate Identical is used instead of Generate (Portland, Portland) if count(Portland) = 1
21 Generate Source Only (X) deletes a source word: kommen Sie mit → come with me: Generate Source Only (Sie)
22-28 Worked example: über konkrete Zahlen nicht verhandeln wollen → do not want to negotiate on specific figures
Insert Gap (Gap #1, over über konkrete Zahlen), Generate (nicht, do not)
Insert Gap (Gap #2, over verhandeln), Generate (wollen, want to)
Jump Back (1) into Gap #2, Generate (verhandeln, negotiate)
Jump Back (1) into Gap #1, translate über konkrete Zahlen → on specific figures
Jump Forward to the end of the sentence
29 Learning Phrases through Operation Sequences
über konkrete Zahlen nicht verhandeln wollen → do not want to negotiate on specific figures
Phrase pair: nicht verhandeln wollen ~ do not want to negotiate
Generate (nicht, do not), Insert Gap, Generate (wollen, want to), Jump Back (1), Generate (verhandeln, negotiate)
30 Model Joint-probability model over operation sequences
31 Search
Search is defined over operation sequences, incorporating the language model: a 5-gram language model (p_LM) and a 9-gram model for the operation model and prior probability (p_pr)
Stack-based beam decoder that works over operations
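The operation model itself is a Markov model over the operation sequence. Below is a sketch of how a 9-gram operation model would score a sequence; `cond_prob`, its `(history, op)` interface, and the function name are hypothetical stand-ins for a trained, smoothed n-gram model.

```python
import math

def operation_logprob(ops, cond_prob, order=9):
    """Score an operation sequence with an n-gram operation model:
    log p(o_1 .. o_J) = sum_j log p(o_j | o_{j-n+1} .. o_{j-1}),
    with n = 9 as on the slide. `cond_prob(history, op)` is assumed to
    return a smoothed conditional probability."""
    total = 0.0
    for j, op in enumerate(ops):
        # history window: up to the 8 preceding operations
        history = tuple(ops[max(0, j - (order - 1)):j])
        total += math.log(cond_prob(history, op))
    return total
```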
32-39 Other Features
Length Penalty: counts the number of target words produced
Deletion Penalty: counts the number of source words deleted
Gap Penalty: counts the number of gaps inserted
Open Gap Penalty: number of open gaps, paid once per translation operation
Reordering Distance: distance from the last translated tuple
Gap Width: distance from the first open gap
Lexical Probabilities: source-to-target and target-to-source lexical translation probabilities
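The count-based features above can be read directly off an operation sequence. A sketch under the simplifying assumption that a Jump Back closes the gap it returns to; reordering distance and gap width require source positions and are omitted here.

```python
def penalty_features(ops):
    """Count-based penalty features of an operation sequence (a sketch of
    the slide's definitions, not the authors' implementation)."""
    feats = {"length": 0, "deletion": 0, "gap": 0, "open_gap": 0}
    open_gaps = 0
    for op in ops:
        name = op[0]
        if name == "Generate":
            feats["length"] += len(op[2].split())  # target words produced
            feats["open_gap"] += open_gaps         # paid once per translation op
        elif name == "Generate Source Only":
            feats["deletion"] += 1                 # source word deleted
            feats["open_gap"] += open_gaps
        elif name == "Insert Gap":
            feats["gap"] += 1
            open_gaps += 1
        elif name == "Jump Back":
            open_gaps -= 1                         # simplification: gap closed
    return feats
```

On the Pizza example the sketch counts 5 target words, one inserted gap, and one translation operation (gegessen → eaten) performed while a gap was open.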
40 Experimental Setup
Language pairs: German, Spanish and French to English
Data: 4th version of the Europarl corpus
Bilingual data: 200K parallel sentences (reduced version of WMT 09): ~74K news commentary + ~126K Europarl
Monolingual data: 500K sentences = 300K from the monolingual corpus (news commentary) + 200K from the English side of the bilingual corpus
Standard WMT 2009 sets for tuning and testing
41 Training & Tuning
GIZA++ for word alignment
Heuristic modification of the alignments to remove target-side gaps and unaligned target words (see the paper for details)
Conversion of the word-aligned bilingual corpus into an operation corpus (see the paper for details)
SRILM toolkit to train n-gram language models with Kneser-Ney smoothing
Parameter tuning with Z-MERT
42 Results
Baseline: Moses (with lexicalized reordering) with default settings and a 5-gram language model (same as ours)
Two baseline runs: one with no distortion limit, one with distortion limit 6
Two variations of our system: no reordering limit, and gap-width 6 as a reordering limit
43 Using Non-Gappy Source Cepts
(BLEU table for German, Spanish and French: systems Bl no-rl, Bl rl-6, Tw no-rl, Tw rl-6; the scores were lost in transcription)
The Moses score without a reordering limit drops by more than a BLEU point. Our best system, Tw no-rl, gives statistically significant improvements over Bl rl-6 for German and Spanish, and comparable results for French.
44 Gappy + Non-Gappy Source Cepts
(BLEU table for German, Spanish and French: systems Tw no-rl, Tw rl-6, Tw asg-no-rl, Tw asg-rl-6; the scores were lost in transcription)
45-46 Why didn't gappy cepts improve performance?
Using all source gaps explodes the search space. Number of tuples using 10-best translations:
Gaps: 965,515 (German), 1,705,156 (Spanish), 1,473,798 (French)
No gaps: 256,… (German), … (Spanish), …,220 (French) [digits lost in transcription]
Future cost is incorrectly estimated for gappy cepts: the dynamic-programming computation of costs for bigger spans no longer applies. A modification helps, but is still problematic when gappy cepts interleave.
47 Heuristic
Use only the gappy cepts whose scores are better than the sum of their parts:
log p(habe gemacht → made) > log p(habe → have) + log p(gemacht → made)
Gaps: 965,515 (German), 1,705,156 (Spanish), 1,473,798 (French)
No gaps: 256,… (German), … (Spanish), …,220 (French) [digits lost in transcription]
Heuristic: 281,… (German), … (Spanish), …,869 (French) [digits lost in transcription]
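The heuristic is a single comparison in log space. A sketch with made-up probabilities (the numbers below are illustrative, not from the paper):

```python
import math

def keep_gappy_cept(logp_cept, logp_parts):
    """Slide's heuristic: keep a gappy cept only if its translation score
    beats the sum of the scores of its parts."""
    return logp_cept > sum(logp_parts)

# Illustrative: p(habe gemacht -> made) = 0.1 vs
# p(habe -> have) = 0.4 and p(gemacht -> made) = 0.2, i.e. 0.1 > 0.08
keep = keep_gappy_cept(math.log(0.1), [math.log(0.4), math.log(0.2)])
```

With these numbers the gappy cept survives, since 0.1 > 0.4 * 0.2.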
48 With Gappy Source Cepts + Heuristic
(BLEU table for German, Spanish and French: systems Tw asg-no-rl, Tw asg-rl-6, Tw hsg-no-rl, Tw hsg-rl-6; the scores were lost in transcription)
49 Summary
Translation and reordering are combined into a single generative story
Handles long- and short-distance reordering identically
Able to learn phrases through operation sequences
All possible reorderings (in contrast with N-gram SMT)
Uses bilingual context (like N-gram SMT)
No spurious phrasal segmentation (like N-gram SMT)
No distortion limit
Compared with the state-of-the-art Moses system: comparable results for French-to-English; significantly better results for German-to-English and Spanish-to-English
50 Thank you - Questions? Decoder and Corpus Conversion Algorithm available at:
51 Future Work
Improving the future cost estimate: using phrases instead of tuples for future cost estimation; N-gram model and phrase-based decoding
Source-side discontinuities: future cost estimation with gappy units; gappy phrases; improving the model to better handle source gaps
Target-side discontinuities: target-unaligned words (a Generate Target Only (Y) operation)
Generalizing the operation model using a combination of POS tags and lexical items
52 Search and Future Cost Estimation
The search problem is much harder than in PBSMT: a larger beam is needed to produce translations similar to PBSMT
Example: zum Beispiel → for example vs. zum → for, Beispiel → example
Problem with future cost estimation of the language model probability: phrase-based: p(for) * p(example | for); our model: p(for) * p(example)
Future cost for reordering operations
Future cost for the features gap penalty, gap width and reordering distance
53 Future Cost Estimation with Source-Side Gaps
Future cost estimation with source-side gaps is problematic.
Future cost for bigger spans: cost(I,K) = min over J in I..K-1 of ( cost(I,J) + cost(J+1,K) )
cost(1,8) = min( cost(1,1) + cost(2,8), cost(1,2) + cost(3,8), …, cost(1,7) + cost(8,8) )
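The recurrence on this slide is the standard span-based future-cost table. A sketch that seeds each single word with a per-word cost; a real estimator would also seed spans with tuple or phrase costs, which is exactly where gappy cepts break the scheme. Indices are 0-based here.

```python
def future_costs(word_costs):
    """Span-based future-cost table (the slide's recurrence):
    cost(i, k) = min over j in i..k-1 of cost(i, j) + cost(j+1, k),
    seeded with the cheapest way to translate each single word."""
    n = len(word_costs)
    cost = [[float("inf")] * n for _ in range(n)]
    for i in range(n):
        cost[i][i] = word_costs[i]
    for span in range(2, n + 1):          # larger spans from smaller ones
        for i in range(n - span + 1):
            k = i + span - 1
            for j in range(i, k):
                cost[i][k] = min(cost[i][k], cost[i][j] + cost[j + 1][k])
    return cost
```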
56 Future Cost Estimation with Source-Side Gaps
The computation does not work for cepts with gaps. Suppose the best way to cover words 1, 4 and 8 is through the gappy cept 1 … 4 … 8: then cost(1,8) = ?
Modification: after computing cost(1,8), do another pass to find
min( cost(1,8), cost(2,3) + cost(5,7) + cost_of_cept(1 4 8), cost(3,7) + cost_of_cept(…) )
60 Future Cost Estimation with Source-Side Gaps
Still problematic when gappy cepts interleave. Example: suppose the best way to cover words 1 and 5 is through the cept 1 … 5. The modification cannot capture that the best cost = cost_of_cept(1..5) + cost_of_cept(…) + cost(3,3) + cost(6,7)
61 Future Cost Estimation with Source-Side Gaps
Gives an incorrect cost if the coverage vector already covers a word between the parts of the gappy cept. Suppose the decoder has covered word 3: the future cost estimate cost(1,2) + cost(4,8) is wrong; the correct estimate is cost_of_cept(1 4 8) + cost(2,2) + cost(5,8). There is no efficient way to cover all possible permutations.
62 Target-Side Gaps & Unaligned Words
Our model does not allow target-side gaps or target-unaligned words, so the alignments are post-edited in a 3-step process.
Step I: remove all target-side gaps. For a gappy alignment, the link to the least frequent target word is identified, and the group of links that contains this word is retained.
Example: A B C D → U V W X Y Z (a target-side discontinuity)
64 After Step I there are no target-side gaps, but target-unaligned words remain.
65 Continued
After Step I: A B C D → U V W X Y Z
Step II: counting over the training corpus to find the attachment preference of each word:
Count (U,V) = 1
Count (W,X) = 1
Count (X,Y) = 0.5
Count (Y,Z) = 0.5
66 Continued
Step III: attach target-unaligned words to the right or to the left based on the collected counts.
After Step III: A B C D → U V W X Y Z
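Step III can be sketched as follows. The grouping representation, the `counts` dictionary of adjacent-pair attachment preferences, and the fallback for a trailing unaligned word are illustrative assumptions, not the authors' exact procedure.

```python
def attach_unaligned(target_words, aligned, counts):
    """Attach each unaligned target word to its left or right neighbour,
    whichever pairing was counted more often in Step II.

    `aligned[i]` says whether target word i has an alignment link;
    `counts` maps adjacent word pairs to attachment counts."""
    groups = []    # each group: one aligned word plus attached unaligned ones
    pending = []   # unaligned words waiting to attach to the right
    for i, w in enumerate(target_words):
        if aligned[i]:
            groups.append(pending + [w])
            pending = []
        else:
            left = counts.get((target_words[i - 1], w), 0) if i > 0 else 0
            right = (counts.get((w, target_words[i + 1]), 0)
                     if i + 1 < len(target_words) else 0)
            if groups and left >= right:
                groups[-1].append(w)   # attach to the left neighbour
            else:
                pending.append(w)      # attach to the next aligned word
    if pending and groups:
        groups[-1].extend(pending)     # nothing to the right: fall back left
    return groups
```

For example, with V unaligned between U and W, a higher Count(U,V) attaches V leftward, while a higher Count(V,W) attaches it rightward.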
A Mixed Trigrams Approach for Context Sensitive Spell Checking Davide Fossati and Barbara Di Eugenio Department of Computer Science University of Illinois at Chicago Chicago, IL, USA [email protected], [email protected]
Collecting Polish German Parallel Corpora in the Internet
Proceedings of the International Multiconference on ISSN 1896 7094 Computer Science and Information Technology, pp. 285 292 2007 PIPS Collecting Polish German Parallel Corpora in the Internet Monika Rosińska
Getting Off to a Good Start: Best Practices for Terminology
Getting Off to a Good Start: Best Practices for Terminology Technologies for term bases, term extraction and term checks Angelika Zerfass, [email protected] Tools in the Terminology Life Cycle Extraction
Deciphering Foreign Language
Deciphering Foreign Language NLP 1! Sujith Ravi and Kevin Knight [email protected], [email protected] Information Sciences Institute University of Southern California! 2 Statistical Machine Translation (MT) Current
Factored Language Models for Statistical Machine Translation
Factored Language Models for Statistical Machine Translation Amittai E. Axelrod E H U N I V E R S I T Y T O H F G R E D I N B U Master of Science by Research Institute for Communicating and Collaborative
Predicting the Stock Market with News Articles
Predicting the Stock Market with News Articles Kari Lee and Ryan Timmons CS224N Final Project Introduction Stock market prediction is an area of extreme importance to an entire industry. Stock price is
Segmentation and Punctuation Prediction in Speech Language Translation Using a Monolingual Translation System
Segmentation and Punctuation Prediction in Speech Language Translation Using a Monolingual Translation System Eunah Cho, Jan Niehues and Alex Waibel International Center for Advanced Communication Technologies
Automatic Generation of Bid Phrases for Online Advertising
Automatic Generation of Bid Phrases for Online Advertising Sujith Ravi, Andrei Broder, Evgeniy Gabrilovich, Vanja Josifovski, Sandeep Pandey, Bo Pang ISI/USC, 4676 Admiralty Way, Suite 1001, Marina Del
Search and Information Retrieval
Search and Information Retrieval Search on the Web 1 is a daily activity for many people throughout the world Search and communication are most popular uses of the computer Applications involving search
Pre-processing of Bilingual Corpora for Mandarin-English EBMT
Pre-processing of Bilingual Corpora for Mandarin-English EBMT Ying Zhang, Ralf Brown, Robert Frederking, Alon Lavie Language Technologies Institute, Carnegie Mellon University NSH, 5000 Forbes Ave. Pittsburgh,
Translation Solution for
Translation Solution for Case Study Contents PROMT Translation Solution for PayPal Case Study 1 Contents 1 Summary 1 Background for Using MT at PayPal 1 PayPal s Initial Requirements for MT Vendor 2 Business
Dutch Parallel Corpus
Dutch Parallel Corpus Lieve Macken [email protected] LT 3, Language and Translation Technology Team Faculty of Applied Language Studies University College Ghent November 29th 2011 Lieve Macken (LT
