Paragraph-based Transformer Pre-training for Multi-Sentence Inference

Related papers:

Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection. Luca Di Liello. arXiv, 2022.

Context-Aware Transformer Pre-Training for Answer Sentence Selection. Luca Di Liello. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), 2023.

Contextualized Embeddings based Transformer Encoder for Sentence Similarity Modeling in Answer Selection Task. Enamul Hoque. 2020.

Semantic Linking in Convolutional Neural Networks for Answer Sentence Selection. Alessandro Moschitti. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018.

Cross-Pair Text Representations for Answer Sentence Selection. Alessandro Moschitti. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, 2018.

Inter-Sentence Features and Thresholded Minimum Error Rate Training: NAIST at CLEF 2013 QA4MRE. Philip Arthur.

UoR at SemEval-2020 Task 4: Pre-trained Sentence Transformer Models for Commonsense Validation and Explanation. Bhuvana Dhruva. Proceedings of the Fourteenth Workshop on Semantic Evaluation, 2020.

Block-Skim: Efficient Question Answering for Transformer. Jingwen Leng. arXiv, 2021.

The RepEval 2017 Shared Task: Multi-Genre Natural Language Inference with Sentence Representations. Adina Williams. Proceedings of the 2nd Workshop on Evaluating Vector Space Representations for NLP, 2017.

Commonsense Statements Identification and Explanation with Transformer-based Encoders. Sonia-Teodora Cibu. Proceedings of Deep Learning Inside Out (DeeLIO): The First Workshop on Knowledge Extraction and Integration for Deep Learning Architectures, 2020.

Representation biases in sentence transformers. Dmitry Nikolaev. Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, 2023.

End-to-End Transformer-Based Models in Textual-Based NLP. Abir Rahali. AI.

Pentagon at MEDIQA 2019: Multi-task Learning for Filtering and Re-ranking Answers using Language Inference and Question Entailment. Sheetal Shalini. Proceedings of the 18th BioNLP Workshop and Shared Task, 2019.

Universal Sentence Encoder. Sheng-yi Kong. 2018.

Ensemble Transformer for Efficient and Accurate Ranking Tasks: an Application to Question Answering Systems. Eric Lind. 2022.

A Contrastive Framework for Learning Sentence Representations from Pairwise and Triple-wise Perspective in Angular Space. Binqiang Zhao. Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), 2022.

BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Sench Galiedon.

A Broad-Coverage Challenge Corpus for Sentence Understanding through Inference. Adina Williams. Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), 2018.

Span Selection Pre-training for Question Answering. Alfio Gliozzo. arXiv, 2019.

Predicting and Integrating Expected Answer Types into a Simple Recurrent Neural Network Model for Answer Sentence Selection. Yue Ma. Computación y Sistemas.

Learning General Purpose Distributed Sentence Representations via Large Scale Multi-task Learning. Joseph Pal. arXiv, 2018.

Few-Shot Question Answering by Pretraining Span Selection. Yuval Kirstain. arXiv, 2021.

Evaluating Deep Learning Techniques for Natural Language Inference. Petros Eleftheriadis. Applied Sciences.

Investigating semantic subspaces of Transformer sentence embeddings through linear structural probing. Dmitry Nikolaev. BlackboxNLP, 2023.

Scalable Attentive Sentence Pair Modeling via Distilled Sentence Embedding. Itzik Malkiel. Proceedings of the AAAI Conference on Artificial Intelligence, 2020.

Stress Test Evaluation of Transformer-based Models in Natural Language Understanding Tasks. Andres Carvallo. arXiv, 2020.

Predicting scalar inferences from “or” to “not both” using neural sentence encoders. Elissa Yiyi Li. 2021.

Universal Sentence Encoder for English. Sheng-yi Kong. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, 2018.

Attention-based Pairwise Multi-Perspective Convolutional Neural Network for Answer Selection in Question Answering. Mohammad Nematbakhsh. arXiv, 2019.

An attention-gated convolutional neural network for sentence classification. Asman Sadino. Intelligent Data Analysis, 2019.

On Robustness of Finetuned Transformer-based NLP Models. Venkateswara Rao Kagita. arXiv, 2023.

NLQuAD: A Non-Factoid Long Question Answering Data Set. Marcel Worring. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, 2021.