Mansour Saffar Mehrjardi | University of Tehran

Address: Tehran, Iran

Papers by Mansour Saffar Mehrjardi


Research paper thumbnail of Self-Attentional Models Application in Task-Oriented Dialogue Generation Systems

Proceedings - Natural Language Processing in a Deep Learning World

Self-attentional models are a new paradigm for sequence modelling tasks. They differ from common sequence modelling methods, such as recurrence-based and convolution-based sequence learning, in that their architecture is based only on the attention mechanism. Self-attentional models have been used to create state-of-the-art models for many NLP tasks, such as neural machine translation, but their use has not yet been explored for training end-to-end task-oriented dialogue generation systems. In this study, we apply these models to three different datasets for training task-oriented chatbots. Our findings show that self-attentional models can be exploited to create end-to-end task-oriented chatbots which not only achieve higher evaluation scores than recurrence-based models, but also do so more efficiently.
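The attention-only architecture the abstract refers to is built on scaled dot-product self-attention. A minimal NumPy sketch of that core operation (illustrative only; the dimensions, weight matrices, and function names here are assumptions, not the paper's implementation):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence.

    X: (seq_len, d_model) token representations.
    Wq, Wk, Wv: (d_model, d_k) projection matrices (hypothetical shapes).
    Returns: (seq_len, d_k) attended representations.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # pairwise token similarities
    weights = softmax(scores, axis=-1)   # each row is a distribution over tokens
    return weights @ V

# Toy example: 4 tokens, model width 8
rng = np.random.default_rng(0)
seq_len, d_model, d_k = 4, 8, 8
X = rng.standard_normal((seq_len, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

Because every token attends to every other token in a single matrix product, the computation parallelizes across the sequence, which is the efficiency advantage over step-by-step recurrence that the abstract highlights.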
