The DeepMind Chinese–English Document Translation System at WMT2020

Lei Yu, Laurent Sartran, Po-Sen Huang, Wojciech Stokowiec, Domenic Donato, Srivatsan Srinivasan, Alek Andreev, Wang Ling, Sona Mokra, Agustin Dal Lago, Yotam Doron, Susannah Young, Phil Blunsom, Chris Dyer

Abstract

This paper describes the DeepMind submission to the Chinese→English constrained data track of the WMT2020 Shared Task on News Translation. The submission employs a noisy channel factorization as the backbone of a document translation system. This approach allows the flexible combination of a number of independent component models which are further augmented with back-translation, distillation, fine-tuning with in-domain data, Monte-Carlo Tree Search decoding, and improved uncertainty estimation. In order to address persistent issues with the premature truncation of long sequences we included specialized length models and sentence segmentation techniques. Our final system provides a 9.9 BLEU points improvement over a baseline Transformer on our test set (newstest 2019).
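As background for the abstract: a noisy channel factorization rescores candidate translations y of a source x using Bayes' rule, p(y|x) ∝ p(x|y)·p(y), typically combined with the direct translation model in a weighted log-linear score. The Python sketch below illustrates such a reranker in the abstract's sense; the function names, candidate schema, and default weights are illustrative assumptions, not the paper's exact formulation.

def noisy_channel_score(log_p_direct, log_p_channel, log_p_lm, tgt_len,
                        w_channel=1.0, w_lm=1.0, w_len=0.0):
    """Weighted log-linear combination of component models (illustrative sketch).

    log_p_direct  -- log p(y|x) from the direct source->target model
    log_p_channel -- log p(x|y) from the channel (target->source) model
    log_p_lm      -- log p(y) from a target-side language model
    tgt_len       -- candidate length, used as an optional length bonus
    Weights would be tuned on a development set; the defaults here are placeholders.
    """
    return log_p_direct + w_channel * log_p_channel + w_lm * log_p_lm + w_len * tgt_len

def rerank(candidates):
    """Pick the highest-scoring candidate from a beam produced by the direct model.

    Each candidate is a dict holding the four quantities above (hypothetical schema).
    """
    return max(candidates, key=lambda c: noisy_channel_score(
        c["log_p_direct"], c["log_p_channel"], c["log_p_lm"], c["tgt_len"]))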

Anthology ID:
2020.wmt-1.36
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
326–337
URL:
https://aclanthology.org/2020.wmt-1.36
Bibkey:
yu-etal-2020-deepmind
Cite (ACL):
Lei Yu, Laurent Sartran, Po-Sen Huang, Wojciech Stokowiec, Domenic Donato, Srivatsan Srinivasan, Alek Andreev, Wang Ling, Sona Mokra, Agustin Dal Lago, Yotam Doron, Susannah Young, Phil Blunsom, and Chris Dyer. 2020. The DeepMind Chinese–English Document Translation System at WMT2020. In Proceedings of the Fifth Conference on Machine Translation, pages 326–337, Online. Association for Computational Linguistics.
Cite (Informal):
The DeepMind Chinese–English Document Translation System at WMT2020 (Yu et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.36.pdf
Video:
https://slideslive.com/38939586
Export citation
  • BibTeX
  • MODS XML
  • Endnote
  • Preformatted
@inproceedings{yu-etal-2020-deepmind,
    title = "The {D}eep{M}ind {C}hinese{--}{E}nglish Document Translation System at {WMT}2020",
    author = "Yu, Lei  and
      Sartran, Laurent  and
      Huang, Po-Sen  and
      Stokowiec, Wojciech  and
      Donato, Domenic  and
      Srinivasan, Srivatsan  and
      Andreev, Alek  and
      Ling, Wang  and
      Mokra, Sona  and
      Dal Lago, Agustin  and
      Doron, Yotam  and
      Young, Susannah  and
      Blunsom, Phil  and
      Dyer, Chris",
    editor = {Barrault, Lo{\"\i}c  and
      Bojar, Ond{\v{r}}ej  and
      Bougares, Fethi  and
      Chatterjee, Rajen  and
      Costa-juss{\`a}, Marta R.  and
      Federmann, Christian  and
      Fishel, Mark  and
      Fraser, Alexander  and
      Graham, Yvette  and
      Guzman, Paco  and
      Haddow, Barry  and
      Huck, Matthias  and
      Yepes, Antonio Jimeno  and
      Koehn, Philipp  and
      Martins, Andr{\'e}  and
      Morishita, Makoto  and
      Monz, Christof  and
      Nagata, Masaaki  and
      Nakazawa, Toshiaki  and
      Negri, Matteo},
    booktitle = "Proceedings of the Fifth Conference on Machine Translation",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2020.wmt-1.36",
    pages = "326--337",
    abstract = "This paper describes the DeepMind submission to the Chinese$\rightarrow$English constrained data track of the WMT2020 Shared Task on News Translation. The submission employs a noisy channel factorization as the backbone of a document translation system. This approach allows the flexible combination of a number of independent component models which are further augmented with back-translation, distillation, fine-tuning with in-domain data, Monte-Carlo Tree Search decoding, and improved uncertainty estimation. In order to address persistent issues with the premature truncation of long sequences we included specialized length models and sentence segmentation techniques. Our final system provides a 9.9 BLEU points improvement over a baseline Transformer on our test set (newstest 2019).",
}


<?xml version="1.0" encoding="UTF-8"?><modsCollection xmlns="http://www.loc.gov/mods/v3"><mods ID="yu-etal-2020-deepmind"> <titleInfo> <title>The DeepMind Chinese–English Document Translation System at WMT2020</title> </titleInfo> <name type="personal"> <namePart type="given">Lei</namePart> <namePart type="family">Yu</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Laurent</namePart> <namePart type="family">Sartran</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Po-Sen</namePart> <namePart type="family">Huang</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Wojciech</namePart> <namePart type="family">Stokowiec</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Domenic</namePart> <namePart type="family">Donato</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Srivatsan</namePart> <namePart type="family">Srinivasan</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Alek</namePart> <namePart type="family">Andreev</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Wang</namePart> <namePart type="family">Ling</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Sona</namePart> <namePart type="family">Mokra</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Agustin</namePart> <namePart type="family">Dal Lago</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Yotam</namePart> <namePart type="family">Doron</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Susannah</namePart> <namePart type="family">Young</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Phil</namePart> <namePart type="family">Blunsom</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Chris</namePart> <namePart type="family">Dyer</namePart> <role> <roleTerm authority="marcrelator" type="text">author</roleTerm> </role> </name> <originInfo> <dateIssued>2020-11</dateIssued> </originInfo> <typeOfResource>text</typeOfResource> <relatedItem type="host"> <titleInfo> <title>Proceedings of the Fifth Conference on Machine Translation</title> </titleInfo> <name type="personal"> <namePart type="given">Loïc</namePart> <namePart type="family">Barrault</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Ondřej</namePart> <namePart type="family">Bojar</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name 
type="personal"> <namePart type="given">Fethi</namePart> <namePart type="family">Bougares</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Rajen</namePart> <namePart type="family">Chatterjee</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Marta</namePart> <namePart type="given">R</namePart> <namePart type="family">Costa-jussà</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Christian</namePart> <namePart type="family">Federmann</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Mark</namePart> <namePart type="family">Fishel</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Alexander</namePart> <namePart type="family">Fraser</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Yvette</namePart> <namePart type="family">Graham</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Paco</namePart> <namePart type="family">Guzman</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Barry</namePart> <namePart type="family">Haddow</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Matthias</namePart> <namePart type="family">Huck</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Antonio</namePart> <namePart type="given">Jimeno</namePart> <namePart type="family">Yepes</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Philipp</namePart> <namePart type="family">Koehn</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">André</namePart> <namePart type="family">Martins</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Makoto</namePart> <namePart type="family">Morishita</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Christof</namePart> <namePart type="family">Monz</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Masaaki</namePart> <namePart type="family">Nagata</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Toshiaki</namePart> <namePart type="family">Nakazawa</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> </role> </name> <name type="personal"> <namePart type="given">Matteo</namePart> <namePart type="family">Negri</namePart> <role> <roleTerm authority="marcrelator" type="text">editor</roleTerm> 
</role> </name> <originInfo> <publisher>Association for Computational Linguistics</publisher> <place> <placeTerm type="text">Online</placeTerm> </place> </originInfo> <genre authority="marcgt">conference publication</genre> </relatedItem> <abstract>This paper describes the DeepMind submission to the Chinese\rightarrowEnglish constrained data track of the WMT2020 Shared Task on News Translation. The submission employs a noisy channel factorization as the backbone of a document translation system. This approach allows the flexible combination of a number of independent component models which are further augmented with back-translation, distillation, fine-tuning with in-domain data, Monte-Carlo Tree Search decoding, and improved uncertainty estimation. In order to address persistent issues with the premature truncation of long sequences we included specialized length models and sentence segmentation techniques. Our final system provides a 9.9 BLEU points improvement over a baseline Transformer on our test set (newstest 2019).</abstract> <identifier type="citekey">yu-etal-2020-deepmind</identifier> <location> <url>https://aclanthology.org/2020.wmt-1.36</url> </location> <part> <date>2020-11</date> <extent unit="page"> <start>326</start> <end>337</end> </extent> </part></mods></modsCollection>


%0 Conference Proceedings
%T The DeepMind Chinese–English Document Translation System at WMT2020
%A Yu, Lei
%A Sartran, Laurent
%A Huang, Po-Sen
%A Stokowiec, Wojciech
%A Donato, Domenic
%A Srinivasan, Srivatsan
%A Andreev, Alek
%A Ling, Wang
%A Mokra, Sona
%A Dal Lago, Agustin
%A Doron, Yotam
%A Young, Susannah
%A Blunsom, Phil
%A Dyer, Chris
%Y Barrault, Loïc
%Y Bojar, Ondřej
%Y Bougares, Fethi
%Y Chatterjee, Rajen
%Y Costa-jussà, Marta R.
%Y Federmann, Christian
%Y Fishel, Mark
%Y Fraser, Alexander
%Y Graham, Yvette
%Y Guzman, Paco
%Y Haddow, Barry
%Y Huck, Matthias
%Y Yepes, Antonio Jimeno
%Y Koehn, Philipp
%Y Martins, André
%Y Morishita, Makoto
%Y Monz, Christof
%Y Nagata, Masaaki
%Y Nakazawa, Toshiaki
%Y Negri, Matteo
%S Proceedings of the Fifth Conference on Machine Translation
%D 2020
%8 November
%I Association for Computational Linguistics
%C Online
%F yu-etal-2020-deepmind
%X This paper describes the DeepMind submission to the Chinese\rightarrowEnglish constrained data track of the WMT2020 Shared Task on News Translation. The submission employs a noisy channel factorization as the backbone of a document translation system. This approach allows the flexible combination of a number of independent component models which are further augmented with back-translation, distillation, fine-tuning with in-domain data, Monte-Carlo Tree Search decoding, and improved uncertainty estimation. In order to address persistent issues with the premature truncation of long sequences we included specialized length models and sentence segmentation techniques. Our final system provides a 9.9 BLEU points improvement over a baseline Transformer on our test set (newstest 2019).
%U https://aclanthology.org/2020.wmt-1.36
%P 326-337


Markdown (Informal)

[The DeepMind Chinese–English Document Translation System at WMT2020](https://aclanthology.org/2020.wmt-1.36) (Yu et al., WMT 2020)
