Optimizing Transformer for Low-Resource Neural Machine Translation
Related papers
Deep Neural Transformer Model for Mono and Multi Lingual Machine Translation. 2021.
Analyzing Architectures for Neural Machine Translation using Low Computational Resources. International Journal on Natural Language Computing.
An in-depth Study of Neural Machine Translation Performance. 2019.
Efficient Neural Machine Translation for Low-Resource Languages via Exploiting Related Languages. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, 2020.
Three-phase training to address data sparsity in Neural Machine Translation. 2017.
Investigation of Transformer-based Latent Attention Models for Neural Machine Translation. 2020.
Attentive fine-tuning of Transformers for Translation of low-resourced languages @LoResMT 2021. arXiv, 2021.
A Comparison of Transformer and Recurrent Neural Networks on Multilingual Neural Machine Translation. 2018.
Neural Machine Translation For Low Resource Languages: NTU-AI6127 Project Final Report. Vakul Goyle, Utsa Chattopadhyay.
Multilingual Neural Machine Translation for Low-Resource Languages. Italian Journal of Computational Linguistics, 2018.
Neural machine translation for low-resource languages without parallel corpora. Machine Translation, 2017.
Akan-English: Transformer for Low Resource Translation. International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), 2021.
Transformers for Low-Resource Languages: Is Féidir Linn! 2021.
Finding the Right Recipe for Low Resource Domain Adaptation in Neural Machine Translation. arXiv, 2022.
Hypoformer: Hybrid Decomposition Transformer for Edge-friendly Neural Machine Translation. Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing.
Continual Mixed-Language Pre-Training for Extremely Low-Resource Neural Machine Translation. Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021.
Data Cartography for Low-Resource Neural Machine Translation. Findings of the Association for Computational Linguistics: EMNLP 2022.
Using Self-Training to Improve Back-Translation in Low Resource Neural Machine Translation. arXiv, 2020.
Low-Resource Translation as Language Modeling. 2020.
Adapting Transformer to End-to-End Spoken Language Translation. Interspeech 2019.
A Hybrid Approach for Improved Low Resource Neural Machine Translation using Monolingual Data. 2020.