On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning

LT@Helsinki at SemEval-2020 Task 12: Multilingual or language-specific BERT?

Emily Öhman

2020

Exploring Linguistic Properties of Monolingual BERTs with Typological Classification among Languages

Federico Ranaldi

arXiv, 2023

On the Prunability of Attention Heads in Multilingual BERT

Madhura Pande

arXiv, 2021

Morphosyntactic probing of multilingual BERT models

Andras Kornai

Natural Language Engineering

Cross-Linguistic Syntactic Difference in Multilingual BERT: How Good is It and How Does It Affect Transfer?

Jingting Ye

arXiv, 2022

Multilingual BERT Post-Pretraining Alignment

Abhishek Shah

Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 2021

On the ability of monolingual models to learn language-agnostic representations

Leandro Rodrigues de Souza

arXiv, 2021

Distilling the Knowledge of Romanian BERTs Using Multiple Teachers

Dan Ioan Tufis

arXiv, 2021

Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank

Ethan Chau

Findings of the Association for Computational Linguistics: EMNLP 2020

Investigating Post-pretraining Representation Alignment for Cross-Lingual Question Answering

Fahim Faisal

Proceedings of the 3rd Workshop on Machine Reading for Question Answering, 2021

Maximal Multiverse Learning for Promoting Cross-Task Generalization of Fine-Tuned Language Models

Itzik Malkiel

Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, 2021

mGPT: Few-Shot Learners Go Multilingual

Tatiana Shavrina

arXiv, 2022

Exploring Fine-tuning Techniques for Pre-trained Cross-lingual Models via Continual Learning

Genta Indra Winata

arXiv, 2020

Is BERT a Cross-Disciplinary Knowledge Learner? A Surprising Finding of Pre-trained Models’ Transferability

Wei-Tsung Kao

Findings of the Association for Computational Linguistics: EMNLP 2021, 2021

Domain Effect Investigation for Bert Models Fine-Tuned on Different Text Categorization Tasks

Ferhat Bozkurt

Arabian Journal for Science and Engineering, 2023

XTREME: A Massively Multilingual Multi-task Benchmark for Evaluating Cross-lingual Generalization

Junjie Hu

arXiv, 2020

BERT Based Multilingual Machine Comprehension in English and Hindi

Somil Gupta

arXiv, 2020

On the Effect of Pretraining Corpora on In-context Learning by a Large-scale Language Model

Jung-Woo Ha

Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies

BanglaBERT: Combating Embedding Barrier in Multilingual Models for Low-Resource Language Understanding

Md. Saiful Islam

2021

Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models

Laura Pérez-Mayos

2021

Adversarial Neural Networks for Cross-lingual Sequence Tagging

Aliaksei Severyn

arXiv, 2018

What Does BERT Learn about the Structure of Language?

Benoît Sagot

Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019

Does BERT really agree? Fine-grained Analysis of Lexical Dependence on a Syntactic Task

Karim Lasri

Findings of the Association for Computational Linguistics: ACL 2022

An Interpretability Illusion for BERT

Fernanda Viégas

2021

Domain adaptation challenges of BERT in tokenization and sub-word representations of Out-of-Vocabulary words

Vijendran Venkoparao

Proceedings of the First Workshop on Insights from Negative Results in NLP, 2020

HUBERT Untangles BERT to Improve Transfer across NLP Tasks

Paul Smolensky

arXiv, 2019

M-BERT: Injecting Multimodal Information in the BERT Structure

Kamrul Hasan

arXiv, 2019

RoBERTa: A Robustly Optimized BERT Pretraining Approach

Naman Goyal

arXiv, 2019

An Analysis of Social Biases Present in BERT Variants Across Multiple Languages

Parishad BehnamGhader

arXiv, 2022

Augmenting BERT Carefully with Underrepresented Linguistic Features

Jekaterina Novikova

arXiv, 2020
