On the evolution of syntactic information encoded by BERT’s contextualized representations
Related papers
What Does BERT Learn about the Structure of Language?
Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics
CxGBERT: BERT meets Construction Grammar
2020
A Primer in BERTology: What We Know About How BERT Works
Transactions of the Association for Computational Linguistics, 2020
Does BERT really agree? Fine-grained Analysis of Lexical Dependence on a Syntactic Task
Findings of the Association for Computational Linguistics: ACL 2022
A Structural Probe for Finding Syntax in Word Representations
2019
Which Sentence Embeddings and Which Layers Encode Syntactic Structure?
2020
Do Attention Heads in BERT Track Syntactic Dependencies?
arXiv, 2019
The argument-adjunct distinction in BERT: A FrameNet-based investigation
IWCS, 2023
The Limitations of Limited Context for Constituency Parsing
arXiv, 2021
How much pretraining data do language models need to learn syntax?
2021
How Can BERT Help Lexical Semantics Tasks?
arXiv, 2019
Automatic acquisition and efficient representation of syntactic
On the status of deep syntactic structure
2003
Do Neural Language Models Show Preferences for Syntactic Formalisms?
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, 2020
On the Language-specificity of Multilingual BERT and the Impact of Fine-tuning
Proceedings of the Fourth BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, 2021
Morphosyntactic probing of multilingual BERT models
Natural Language Engineering
A Review on BERT and Its Implementation in Various NLP Tasks
Advances in computer science research, 2023
The Universe of Utterances According to BERT
IWCS, 2023
Distilling Task-Specific Knowledge from BERT into Simple Neural Networks
MemBERT: Injecting Unstructured Knowledge into BERT
arXiv, 2021
GiBERT: Enhancing BERT with Linguistic Information using a Lightweight Gated Injection Method
Findings of the Association for Computational Linguistics: EMNLP 2021
Rich Syntax from a Raw Corpus: Unsupervised Does It
What Does BERT Look at? An Analysis of BERT’s Attention
Proceedings of the 2019 ACL Workshop BlackboxNLP: Analyzing and Interpreting Neural Networks for NLP, 2019
Exploring Linguistic Properties of Monolingual BERTs with Typological Classification among Languages
arXiv, 2023
Decomposing and regenerating syntactic trees
Towards a dynamic constituency model of syntax
2008
On Losses for Modern Language Models
Stéphane Aroca-Ouellette, Frank Rudzicz
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, 2020
Parsing with Multilingual BERT, a Small Corpus, and a Small Treebank
Findings of the Association for Computational Linguistics: EMNLP 2020
Lessons Learned from Applying off-the-shelf BERT: There is no Silver Bullet
2020
Does Chinese BERT Encode Word Structure?
Proceedings of the 28th International Conference on Computational Linguistics
Semantics boosts syntax in artificial grammar learning tasks with recursion.
2012
Abduction, induction and memorizing in corpus-based parsing
ESSLLI-2002 Workshop on Machine Learning …, 2002