WMT 2018: Belgium, Brussels
- Ondrej Bojar, Rajen Chatterjee, Christian Federmann, Mark Fishel, Yvette Graham, Barry Haddow, Matthias Huck, Antonio Jimeno-Yepes, Philipp Koehn, Christof Monz, Matteo Negri, Aurélie Névéol, Mariana L. Neves, Matt Post, Lucia Specia, Marco Turchi, Karin Verspoor: Proceedings of the Third Conference on Machine Translation: Research Papers, WMT 2018, Belgium, Brussels, October 31 - November 1, 2018. Association for Computational Linguistics 2018, ISBN 978-1-948087-81-0
- Myle Ott, Sergey Edunov, David Grangier, Michael Auli: Scaling Neural Machine Translation. 1-9
- Nikola I. Nikolov, Yuhuang Hu, Mi Xue Tan, Richard H. R. Hahnloser: Character-level Chinese-English Translation through ASCII Encoding. 10-16
- Longtu Zhang, Mamoru Komachi: Neural Machine Translation of Logographic Language Using Sub-character Level Information. 17-25
- Gongbo Tang, Rico Sennrich, Joakim Nivre: An Analysis of Attention Mechanisms: The Case of Word Sense Disambiguation in Neural Machine Translation. 26-35
- Margita Sostaric, Christian Hardmeier, Sara Stymne: Discourse-Related Language Contrasts in English-Croatian Human and Machine Translation. 36-48
- Dario Stojanovski, Alexander M. Fraser: Coreference and Coherence in Neural Machine Translation: A Study Using Oracle Experiments. 49-60
- Mathias Müller, Annette Rios, Elena Voita, Rico Sennrich: A Large-Scale Test Set for the Evaluation of Context-Aware Pronoun Translation in Neural Machine Translation. 61-72
- Nikolaos Pappas, Lesly Miculicich Werlen, James Henderson: Beyond Weight Tying: Learning Joint Input-Output Embeddings for Neural Machine Translation. 73-83
- Yichao Lu, Phillip Keung, Faisal Ladhak, Vikas Bhardwaj, Shaonan Zhang, Jason Sun: A neural interlingua for multilingual machine translation. 84-92
- Christian Herold, Yingbo Gao, Hermann Ney: Improving Neural Language Models with Weight Norm Initialization and Regularization. 93-100
- Sameen Maruf, André F. T. Martins, Gholamreza Haffari: Contextual Neural Model for Translating Bilingual Multi-Speaker Conversations. 101-112
- Antonio Toral, Sheila Castilho, Ke Hu, Andy Way: Attaining the Unattainable? Reassessing Claims of Human Parity in Neural Machine Translation. 113-123
- Brian Thompson, Huda Khayrallah, Antonios Anastasopoulos, Arya D. McCarthy, Kevin Duh, Rebecca Marvin, Paul McNamee, Jeremy Gwinnup, Tim Anderson, Philipp Koehn: Freezing Subnetworks to Analyze Domain Adaptation in Neural Machine Translation. 124-132
- Wei Wang, Taro Watanabe, Macduff Hughes, Tetsuji Nakagawa, Ciprian Chelba: Denoising Neural Machine Translation Training with Trusted Data and Online Data Selection. 133-143
- Franck Burlot, François Yvon: Using Monolingual Data in Neural Machine Translation: a Systematic Study. 144-155
- Surafel Melaku Lakew, Aliia Erofeeva, Marcello Federico: Neural Machine Translation into Language Varieties. 156-164
- Mandy Guo, Qinlan Shen, Yinfei Yang, Heming Ge, Daniel Cer, Gustavo Hernández Ábrego, Keith Stevens, Noah Constant, Yun-Hsuan Sung, Brian Strope, Ray Kurzweil: Effective Parallel Corpus Mining using Bilingual Sentence Embeddings. 165-176
- Tamer Alkhouli, Gabriel Bretschner, Hermann Ney: On The Alignment Problem In Multi-Head Attention-Based Neural Machine Translation. 177-185
- Matt Post: A Call for Clarity in Reporting BLEU Scores. 186-191
- Mikel L. Forcada, Carolina Scarton, Lucia Specia, Barry Haddow, Alexandra Birch: Exploring gap filling as a cheaper alternative to reading comprehension questionnaires when evaluating machine translation for gisting. 192-203
- Felix Stahlberg, James Cross, Veselin Stoyanov: Simple Fusion: Return of the Language Model. 204-211
- Kenton Murray, David Chiang: Correcting Length Bias in Neural Machine Translation. 212-223
- Catarina Cruz Silva, Chao-Hong Liu, Alberto Poncelas, Andy Way: Extracting In-domain Training Corpora for Neural Machine Translation Using Data Selection Methods. 224-231
- Zhong Zhou, Matthias Sperber, Alexander Waibel: Massively Parallel Cross-Lingual Learning in Low-Resource Target Language Translation. 232-243
- Tom Kocmi, Ondrej Bojar: Trivial Transfer Learning for Low-Resource Neural Machine Translation. 244-252
- Jindrich Libovický, Jindrich Helcl, David Marecek: Input Combination Strategies for Multi-Source Transformer Decoder. 253-260
- Devendra Singh Sachan, Graham Neubig: Parameter Sharing Methods for Multilingual Self-Attentional Translation Models. 261-271