Shreyansh Singh - Academia.edu


Papers by Shreyansh Singh

MeTGAN: Memory Efficient Tabular GAN for High Cardinality Categorical Datasets

Communications in Computer and Information Science

Generative Adversarial Networks (GANs) have seen their use for generating synthetic data expand from unstructured data, such as images, to structured tabular data. One of the recently proposed models for tabular data generation, CTGAN, demonstrated state-of-the-art performance on this task even in the presence of high class imbalance in categorical columns or multiple modes in continuous columns, and many subsequent methods have derived ideas from it. However, training CTGAN has a large memory footprint when the dataset contains high-cardinality categorical columns. In this paper, we propose MeTGAN, a memory-efficient version of CTGAN that reduces memory usage by roughly 80% with minimal effect on performance. MeTGAN uses sparse linear layers to overcome the memory bottlenecks of CTGAN. We compare the performance of MeTGAN with other models on publicly available datasets, taking quality of data generation, memory requirements, and privacy guarantees as the evaluation criteria. A further goal of this paper is to draw the research community's attention to the computational footprint of tabular data generation methods, so that they can be applied to larger datasets, especially ones with high-cardinality categorical variables.
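To make the abstract's key mechanism concrete, the sketch below illustrates one way a sparse linear layer can replace a very wide dense layer over one-hot categorical inputs, so that parameter storage scales with a chosen density rather than with the full cardinality. This is an assumption-laden illustration, not MeTGAN's published implementation: the class name, density value, random initialization, and the choice of COO storage are all illustrative.

# Minimal sketch (assumptions, not MeTGAN's code): a linear layer whose weight
# matrix is stored in sparse COO form, keeping only a fraction of the entries.
import torch
import torch.nn as nn

class SparseLinear(nn.Module):
    def __init__(self, in_features, out_features, density=0.05):
        super().__init__()
        nnz = int(in_features * out_features * density)           # non-zeros kept
        idx = torch.stack([torch.randint(out_features, (nnz,)),
                           torch.randint(in_features, (nnz,))])   # (row, col) indices
        val = torch.randn(nnz) * 0.01
        self.weight = nn.Parameter(
            torch.sparse_coo_tensor(idx, val, (out_features, in_features)).coalesce())
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):                         # x: (batch, in_features), dense
        # sparse (out, in) @ dense (in, batch) -> (out, batch); transpose back
        return torch.sparse.mm(self.weight, x.t()).t() + self.bias

# A 50,000-category one-hot column mapped to a 256-dim hidden vector stores only
# ~5% of the weight entries a dense nn.Linear(50_000, 256) would hold.
layer = SparseLinear(50_000, 256)
x = torch.zeros(8, 50_000)
x[torch.arange(8), torch.randint(50_000, (8,))] = 1.0             # one-hot batch
print(layer(x).shape)                                              # torch.Size([8, 256])

In practice the saving comes from the widest generator/discriminator layers, which in CTGAN-style models grow with the total number of categories; the density knob trades capacity for memory.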

IIT (BHU) Varanasi at MSR-SRST 2018: A Language Model Based Approach for Natural Language Generation

Proceedings of the First Workshop on Multilingual Surface Realisation

This paper describes our submission system for the Shallow Track of the Surface Realization Shared Task 2018 (SRST'18). The task was to convert genuine UD structures, from which word order information had been removed and the tokens had been lemmatized, into their correct sentential form. We divide the problem into two parts: word reinflection and word order prediction. For the first sub-problem, we use a Long Short-Term Memory (LSTM) based encoder-decoder approach. For the second sub-problem, we present a language model (LM) based approach. We apply two different sub-approaches within the LM-based approach, and their combined result is taken as the final output of the system.
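As a rough illustration of how a language model can drive word order prediction, the sketch below greedily orders a bag of tokens by LM score. The bigram table, the penalty for unseen bigrams, and the greedy strategy are illustrative assumptions, not the actual sub-approaches used in the submission.

# Minimal sketch (assumption, not the submitted system): order a bag of tokens by
# greedily picking, at each step, the word the LM scores highest given the prefix
# built so far. A toy bigram table stands in for a real language model.
bigram_logprob = {("<s>", "the"): -0.5, ("the", "cat"): -0.7, ("cat", "sleeps"): -0.9,
                  ("the", "sleeps"): -3.0, ("sleeps", "cat"): -2.5}

def score(prev, word):
    return bigram_logprob.get((prev, word), -5.0)     # unseen bigrams get a flat penalty

def greedy_order(bag_of_tokens):
    order, prev, remaining = [], "<s>", list(bag_of_tokens)
    while remaining:
        best = max(remaining, key=lambda w: score(prev, w))   # most probable next word
        order.append(best)
        remaining.remove(best)
        prev = best
    return order

print(greedy_order(["sleeps", "the", "cat"]))          # ['the', 'cat', 'sleeps']

A real system would replace the greedy step with beam search over a trained LM, but the scoring-and-ordering loop is the same shape.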

CuRL: Coupled Representation Learning of Cards and Merchants to Detect Transaction Frauds

Lecture Notes in Computer Science
