7th WAT@AACL/IJCNLP 2020: Suzhou, China
- Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Win Pa Pa, Ondrej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino, Hiroshi Manabe, Katsuhito Sudoh, Sadao Kurohashi, Pushpak Bhattacharyya (eds.):
  Proceedings of the 7th Workshop on Asian Translation, WAT@AACL/IJCNLP 2020, Suzhou, China, December 4, 2020. Association for Computational Linguistics 2020, ISBN 978-1-952148-95-8
- Toshiaki Nakazawa, Hideki Nakayama, Chenchen Ding, Raj Dabre, Shohei Higashiyama, Hideya Mino, Isao Goto, Win Pa Pa, Anoop Kunchukuttan, Shantipriya Parida, Ondrej Bojar, Sadao Kurohashi:
  Overview of the 7th Workshop on Asian Translation. 1-44
- Benyamin Ahmadnia, Bonnie J. Dorr:
  An Effective Optimization Method for Neural Machine Translation: The Case of English-Persian Bilingually Low-Resource Scenario. 45-49
- Kenji Imamura, Eiichiro Sumita:
  Transformer-based Double-token Bidirectional Autoregressive Decoding in Neural Machine Translation. 50-57
- Zizheng Zhang, Tosho Hirasawa, Wei Houjing, Masahiro Kaneko, Mamoru Komachi:
  Translation of New Named Entities from English to Chinese. 58-63
- Zhuoyuan Mao, Yibin Shen, Chenhui Chu, Sadao Kurohashi, Cheqing Jin:
  Meta Ensemble for Japanese-Chinese Neural Machine Translation: Kyoto-U+ECNU Participation to WAT 2020. 64-71
- Isao Goto, Hideya Mino, Hitoshi Ito, Kazutaka Kinugawa, Ichiro Yamada, Hideki Tanaka:
  Neural Machine Translation Using Extracted Context Based on Deep Analysis for the Japanese-English Newswire Task at WAT 2020. 72-79
- Hiroto Tamura, Tosho Hirasawa, Masahiro Kaneko, Mamoru Komachi:
  TMU Japanese-English Multimodal Machine Translation System for WAT 2020. 80-91
- Zhengzhe Yu, Zhanglin Wu, Xiaoyu Chen, Daimeng Wei, Hengchao Shang, Jiaxin Guo, Zongyao Li, Minghan Wang, Liangyou Li, Lizhi Lei, Hao Yang, Ying Qin:
  HW-TSC's Participation in the WAT 2020 Indic Languages Multilingual Task. 92-97
- Raj Dabre, Abhisek Chakrabarty:
  NICT's Submission To WAT 2020: How Effective Are Simple Many-To-Many Neural Machine Translation Models? 98-102
- Shantipriya Parida, Petr Motlícek, Amulya Ratna Dash, Satya Ranjan Dash, Debasish Kumar Mallick, Satya Prakash Biswal, Priyanka Pattnaik, Biranchi Narayan Nayak, Ondrej Bojar:
  ODIANLP's Participation in WAT2020. 103-108
- Sahinur Rahman Laskar, Abdullah Faiz Ur Rahman Khilji, Partha Pakray, Sivaji Bandyopadhyay:
  Multimodal Neural Machine Translation for English to Hindi. 109-113
- Prashanth Nayak, Rejwanul Haque, Andy Way:
  The ADAPT Centre's Participation in WAT 2020 English-to-Odia Translation Task. 114-117
- Rupjyoti Baruah, Rajesh Kumar Mundotiya:
  NLPRL Odia-English: Indic Language Neural Machine Translation System. 118-121
- Santanu Pal:
  WT: Wipro AI Submissions to the WAT 2020. 122-126
- Hwichan Kim, Tosho Hirasawa, Mamoru Komachi:
  Korean-to-Japanese Neural Machine Translation System using Hanja Information. 127-134
- Dongzhe Wang, Ohnmar Htun:
  Goku's Participation in WAT 2020. 135-141
- Wandri Jooste, Rejwanul Haque, Andy Way:
  The ADAPT Centre's Neural MT Systems for the WAT 2020 Document-Level Translation Task. 142-146
- Matiss Rikters, Toshiaki Nakazawa, Ryokan Ri:
  The University of Tokyo's Submissions to the WAT 2020 Shared Task. 147-153
- Nikhil Jaiswal, Mayur Patidar, Surabhi Kumari, Manasi Patwardhan, Shirish Karande, Puneet Agarwal, Lovekesh Vig:
  Improving NMT via Filtered Back Translation. 154-159
- Bianka Buschbeck, Miriam Exel:
  A Parallel Evaluation Data Set of Software Documentation with Document Structure Annotation. 160-169
- Danielle Saunders, Weston Feely, Bill Byrne:
  Inference-only sub-character decomposition improves translation of unseen logographic characters. 170-177
- Akshai Ramesh, Venkatesh Balavadhani Parthasarathy, Rejwanul Haque, Andy Way:
  An Error-based Investigation of Statistical and Neural Machine Translation Performance on Hindi-to-Tamil and English-to-Tamil. 178-188