Ghaith Dekhili | Université du Québec à Montréal
Papers by Ghaith Dekhili
Knowledge Management and Acquisition for Intelligent Systems, 2019
Commonsense can be vital in some applications like Natural Language Understanding, where it is often required to resolve ambiguity arising from implicit knowledge and under-specification. In spite of the remarkable success of neural network approaches on a variety of Natural Language Processing tasks, many of them struggle to react effectively in cases that require commonsense knowledge.
Neural network-based models have proved their efficiency on Named Entity Recognition, one of the well-known NLP tasks. Besides, the attention mechanism has become an integral part of compelling sequence modeling and transduction models on various tasks. This technique allows context representation in a sequence by taking neighboring words into consideration. In this study, we propose an architecture that involves BiLSTM layers combined with a CRF layer, with an attention layer in between. This was augmented with pre-trained contextualized word embeddings and dropout layers. Moreover, apart from using word representations, we use character-based representations, extracted by CNN layers, to capture morphological and orthographic information. Our experiments show an improvement in the overall performance. We notice that our attentive neural model augmented with contextualized word embeddings gives higher scores compared to our baselines. To the best of our knowledge, there is no study whic…
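The attention step described in this abstract can be sketched in plain Python. This is a minimal illustration only, assuming a simple dot-product scoring function over the BiLSTM hidden states; the paper's actual learned scoring function, dimensions, and parameters are not specified here:

```python
import math

def attention(hidden_states, query):
    """Compute attention weights and a context vector over hidden states.

    hidden_states: list of per-token vectors (lists of floats).
    query: a vector of the same dimension used to score each state
           (a hand-made stand-in for a learned scoring function).
    Returns (weights, context): weights sum to 1, and context is the
    attention-weighted average of the hidden states.
    """
    # Score each hidden state against the query with a dot product.
    scores = [sum(h_i * q_i for h_i, q_i in zip(h, query))
              for h in hidden_states]
    # Softmax normalization turns raw scores into attention weights.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Context vector: weighted sum of the hidden states.
    dim = len(hidden_states[0])
    context = [sum(w * h[d] for w, h in zip(weights, hidden_states))
               for d in range(dim)]
    return weights, context

# Toy example: three token states; the query is closest to token 2.
states = [[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]]
w, ctx = attention(states, query=[0.0, 1.0])
print(w, ctx)
```

In the described architecture, such context vectors would sit between the BiLSTM outputs and the CRF layer, letting each token's representation incorporate its neighbors before sequence-level decoding.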
Commonsense can be vital in some applications like Natural Language Understanding (NLU), where it is often required to resolve ambiguity arising from implicit knowledge and under-specification. In spite of the remarkable success of neural network approaches on a variety of Natural Language Processing tasks, many of them struggle to react effectively in cases that require commonsense knowledge. In the present research, we take advantage of the availability of the open multilingual knowledge graph ConceptNet, using it as an additional external resource in Named Entity Recognition (NER). Our proposed architecture involves BiLSTM layers combined with a CRF layer, augmented with features such as pre-trained word embedding layers and dropout layers. Moreover, apart from using word representations, we also used character-based representations to capture morphological and orthographic information. Our experiments and evaluations showed an improvement in the overall per…
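One common way to use a knowledge graph like ConceptNet as an external NER resource is to attach per-token knowledge features that the tagger concatenates to its word and character embeddings. The sketch below illustrates that idea with a tiny hand-made term set standing in for an actual ConceptNet lookup; it is an assumption for illustration, not the paper's exact feature scheme:

```python
def add_knowledge_feature(tokens, concept_terms):
    """Append a binary 'known concept' indicator to each token.

    tokens: list of word strings.
    concept_terms: set of lowercase surface forms present in the
        knowledge graph (here a toy stand-in for ConceptNet).
    Returns a list of (token, indicator) pairs; a downstream tagger
    could concatenate the indicator to each token's embedding.
    """
    return [(tok, 1 if tok.lower() in concept_terms else 0)
            for tok in tokens]

# Toy vocabulary; a real system would query the ConceptNet graph.
known = {"paris", "river", "city"}
features = add_knowledge_feature(["Paris", "is", "a", "city"], known)
print(features)  # [('Paris', 1), ('is', 0), ('a', 0), ('city', 1)]
```

Signals of this kind give the BiLSTM-CRF tagger evidence beyond the training corpus, which is the role the abstract describes for ConceptNet as an external resource.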