Diksha Chugh - Academia.edu

Diksha Chugh

Papers by Diksha Chugh

Research paper thumbnail of Automated News Summarization Using Transformers

Lecture notes in electrical engineering, 2022

The amount of text data available online is increasing at a very fast pace, so text summarization has become essential. Most modern recommender and text classification systems must process huge amounts of data, and manually generating precise and fluent summaries of lengthy articles is a tiresome, time-consuming task. Generating automated summaries of the data and using them to train machine learning models therefore makes those models space- and time-efficient. Extractive and abstractive summarization are two distinct methods of generating summaries. The extractive technique identifies the relevant sentences in the original document and extracts only those from the text, whereas abstractive techniques generate the summary after interpreting the original text, which makes them more complicated. In this paper, we present a comprehensive comparison of several transformer-based pre-trained models for text summarization. For analysis and comparison, we use the BBC News dataset, which contains text data suitable for summarization along with human-generated summaries for evaluating and comparing the summaries produced by the machine learning models.
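The extractive approach described in the abstract can be illustrated with a minimal frequency-based sketch: score each sentence by how often its words appear in the document, then keep the top-scoring sentences in their original order. This is a toy illustration of the extractive idea only, not the transformer-based models the paper compares.

```python
import re
from collections import Counter

def extractive_summary(text, num_sentences=2):
    """Toy extractive summarizer: keep the sentences whose words are
    most frequent in the document, preserving original order."""
    sentences = re.split(r'(?<=[.!?])\s+', text.strip())
    freq = Counter(re.findall(r'\w+', text.lower()))
    # Score each sentence by the summed document frequency of its words.
    scored = [(sum(freq[w] for w in re.findall(r'\w+', s.lower())), i, s)
              for i, s in enumerate(sentences)]
    top = sorted(scored, reverse=True)[:num_sentences]
    # Restore original sentence order for readability.
    return ' '.join(s for _, _, s in sorted(top, key=lambda t: t[1]))
```

Because the output is copied verbatim from the input, the summary stays fluent at the sentence level; an abstractive model, by contrast, would paraphrase and can introduce wording not present in the source.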

Research paper thumbnail of Context-Aware Emoji Prediction Using Deep Learning

Artificial Intelligence and Speech Technology, 2022

Research paper thumbnail of iCAST: Impact of Climate on Assistive Scene Text detection for autonomous vehicles

2022 8th International Conference on Advanced Computing and Communication Systems (ICACCS)
