Morphosyntactic probing of multilingual BERT models
Related papers
Morph Call: Probing Morphosyntactic Content of Multilingual Transformers
Proceedings of the Third Workshop on Computational Typology and Multilingual NLP
Transformers on Multilingual Clause-Level Morphology
arXiv, 2022
Assessing the Syntactic Capabilities of Transformer-based Multilingual Language Models
2021
CMU-01 at the SIGMORPHON 2019 Shared Task on Crosslinguality and Context in Morphology
Proceedings of the 16th Workshop on Computational Research in Phonetics, Phonology, and Morphology, 2019
SIGMORPHON 2020 Shared Task 0: Typologically Diverse Morphological Inflection
2020
The MRL 2022 Shared Task on Multilingual Clause-level Morphology
HAL (Inria), 2022
Exploring Linguistic Properties of Monolingual BERTs with Typological Classification among Languages
arXiv, 2023
MorphPiece: Moving away from Statistical Language Representation
arXiv, 2023
Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank
Findings of the Association for Computational Linguistics: EMNLP 2020
SIGMORPHON 2021 Shared Task on Morphological Reinflection: Generalization Across Languages
Proceedings of the 18th SIGMORPHON Workshop on Computational Research in Phonetics, Phonology, and Morphology, 2021
Parallel reverse treebanks for the discovery of morpho-syntactic markings
Proceedings of Treebanks …, 2006
Shaking Syntactic Trees on the Sesame Street: Multilingual Probing with Controllable Perturbations
arXiv, 2021
On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning
Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, 2021
Joint learning of morphology and syntax with cross-level contextual information flow
Natural Language Engineering, 2022
Stem-driven Language Models for Morphologically Rich Languages
arXiv, 2019
Does BERT really agree? Fine-grained Analysis of Lexical Dependence on a Syntactic Task
Findings of the Association for Computational Linguistics: ACL 2022
Modeling Morphologically Rich Languages Using Split Words and Unstructured Dependencies
2009
Investigating the effect of sub-word segmentation on the performance of transformer language models
arXiv, 2023
Do Attention Heads in BERT Track Syntactic Dependencies?
arXiv, 2019
On the Prunability of Attention Heads in Multilingual BERT
arXiv, 2021
CoNLL-SIGMORPHON 2017 Shared Task: Universal Morphological Reinflection in 52 Languages
Proceedings of the CoNLL SIGMORPHON 2017 Shared Task: Universal Morphological Reinflection
Probing for Multilingual Numerical Understanding in Transformer-Based Language Models
Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP