Optimizing Transformer for Low-Resource Neural Machine Translation

Auto-Sizing the Transformer Network: Improving Speed, Efficiency, and Performance for Low-Resource Machine Translation

Kenton Murray

Proceedings of the 3rd Workshop on Neural Generation and Translation

Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation

Abdelouahab Moussaoui

2021

Analyzing Architectures for Neural Machine Translation using Low Computational Resources

Onkar Litake

International Journal on Natural Language Computing

An in-depth Study of Neural Machine Translation Performance

Oğuz Ergin

2019

Efficient Neural Machine Translation for Low-Resource Languages via Exploiting Related Languages

Sourav Kumar

Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, 2020

Three-phase training to address data sparsity in Neural Machine Translation

Mihir Shekhar

2017

Investigation of Transformer-based Latent Attention Models for Neural Machine Translation

Nikita Makarov

2020

Attentive fine-tuning of Transformers for Translation of low-resourced languages @LoResMT 2021

Karthik Puranik

arXiv, 2021

A Comparison of Transformer and Recurrent Neural Networks on Multilingual Neural Machine Translation

Mauro Cettolo

2018

Neural Machine Translation for Low-Resource Languages (NTU-AI6127 Project Final Report)

Goyle Vakul, Utsa Chattopadhyay

Multilingual Neural Machine Translation for Low-Resource Languages

Mesay Gemeda

Italian Journal of Computational Linguistics, 2018

Neural machine translation for low-resource languages without parallel corpora

Josef Van Genabith

Machine Translation, 2017

Akan-English: Transformer for Low Resource Translation

Emmanuel Agyei

International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), 2021

Transformers for Low-Resource Languages: Is Féidir Linn!

Seamus Lankford

2021

Finding the Right Recipe for Low Resource Domain Adaptation in Neural Machine Translation

Virginia Adams

arXiv (Cornell University), 2022

Hypoformer: Hybrid Decomposition Transformer for Edge-friendly Neural Machine Translation

Junqiu Wei

Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing

A More Comprehensive Method for Using Target-side Monolingual Data to Improve Low-Resource Neural Machine Translation

Abubakar Isa

2020

XLM-T: Scaling up Multilingual Machine Translation with Pretrained Cross-lingual Transformer Encoders

Arul Menezes

arXiv, 2020

Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation

Indra Winata

Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021

Data Cartography for Low-Resource Neural Machine Translation

Aquia Richburg

Findings of the Association for Computational Linguistics: EMNLP 2022

Using Self-Training to Improve Back-Translation in Low Resource Neural Machine Translation

Abubakar Inuwa Isa

arXiv, 2020

Low-Resource Translation as Language Modeling

Berkan Hiziroglu

2020

Adapting Transformer to End-to-End Spoken Language Translation

Mattia Di Gangi

Interspeech 2019

Joint Dropout: Improving Generalizability in Low-Resource Neural Machine Translation through Phrase Pair Variables

Ali Araabi

arXiv (Cornell University), 2023

Rethinking Data Augmentation for Low-Resource Neural Machine Translation: A Multi-Task Learning Approach

Víctor M. Sánchez-Cartagena

Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data

Bashir Galadanci

2020
