Query Focused Abstractive Summarization via Incorporating Query Relevance and Transfer Learning with Transformer Models
In the Query Focused Abstractive Summarization (QFAS) task, the goal is to generate abstractive summaries from a source document that are relevant to a given query. In this paper, we propose a new transfer learning technique that utilizes a pre-trained transformer architecture for the QFAS task on the Debatepedia dataset. We find that the Diversity Driven Attention (DDA) model, the first model applied to this dataset, only performs well when the dataset is augmented with additional training instances. In contrast, without requiring any in-domain data augmentation, our proposed approach outperforms the DDA model and sets a new state-of-the-art result.
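To make the setup concrete, below is a minimal sketch of query-focused generation with a pre-trained seq2seq transformer: the query is concatenated with the source document so that the decoder conditions on query relevance. This is an illustration under stated assumptions, not the authors' exact pipeline; the `facebook/bart-base` checkpoint, the HuggingFace `transformers` API, and the example query/document strings are all assumptions introduced here.

```python
# Sketch: query-focused abstractive summarization by prepending the query
# to the source document and feeding both to a pre-trained seq2seq model.
# NOTE: checkpoint, library, and example strings are illustrative, not the
# paper's exact implementation.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-base"  # assumed pre-trained checkpoint
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

query = "Should the government fund universal health care?"      # hypothetical
document = "Proponents argue that universal coverage reduces ..." # hypothetical

# Incorporate query relevance: encode "query <sep> document" as one input,
# so attention over the document is conditioned on the query tokens.
inputs = tokenizer(
    query + " " + tokenizer.sep_token + " " + document,
    return_tensors="pt",
    truncation=True,
    max_length=512,
)

# Generate a short abstractive summary with beam search.
summary_ids = model.generate(inputs["input_ids"], max_length=64, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```

In a transfer learning setting, the same query-prepended inputs would be paired with reference summaries to fine-tune the pre-trained model on the target dataset before generation.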