Text Generation using Neural Models
Khushboo Lathia1, Mahesh Maurya2

1Khushboo Lathia, Department of Computer Engineering, MPSTME, NMIMS University, Mumbai (Maharashtra), India.

2Mahesh Maurya, Department of Computer Engineering, MPSTME, NMIMS University, Mumbai (Maharashtra), India. 

Manuscript received on 03 December 2019 | Revised Manuscript received on 11 December 2019 | Manuscript Published on 31 December 2019 | PP: 19-23 | Volume-9 Issue-2S December 2019 | Retrieval Number: B10061292S19/2019©BEIESP | DOI: 10.35940/ijitee.B1006.1292S19

© The Authors. Blue Eyes Intelligence Engineering and Sciences Publication (BEIESP). This is an open-access article under the CC-BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract: Automatically generated summaries of long and short texts are widely used in digital services. In this paper, we study a successful approach to text generation based on generative adversarial networks (GANs), along with several other neural models for text generation. Our main focus is on generating text with a Recurrent Neural Network (RNN) and its variants and analyzing the results. We generated and translated text while varying the number of training epochs and the sampling temperature to improve the model's confidence, and also while varying the size of the input file. We were struck by how the Long Short-Term Memory (LSTM) model responded to these varying parameters; its performance improved when a dataset of appropriate size was provided for training. The resulting model was tested on datasets of varying sizes. The evaluations show that the output generated by the model does not correlate with the corresponding datasets, meaning that the generated output differs from the training data.
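For readers unfamiliar with temperature-controlled sampling, the sketch below illustrates the idea behind the abstract's temperature experiments in a minimal character-level LSTM generator. The framework (Keras), layer sizes, vocabulary size, and sequence length are illustrative assumptions, not details taken from the paper.

# Minimal character-level LSTM text-generation sketch.
# All hyperparameters below are assumed for illustration only.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

vocab_size = 64   # assumed number of distinct characters
seq_len = 40      # assumed input window length

# One-hot character windows in, next-character distribution out.
model = keras.Sequential([
    layers.LSTM(128, input_shape=(seq_len, vocab_size)),
    layers.Dense(vocab_size, activation="softmax"),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")

def sample(preds, temperature=1.0):
    """Draw the next character index from the softmax output.
    Lower temperature sharpens the distribution, making the
    model more confident; higher temperature adds diversity."""
    preds = np.log(np.asarray(preds, dtype="float64") + 1e-8) / temperature
    probs = np.exp(preds) / np.sum(np.exp(preds))
    return np.random.choice(len(probs), p=probs)

After training, generation repeatedly feeds the last seq_len characters through the model and calls sample() on the output; sweeping temperature (and the number of training epochs) as described in the abstract changes how conservative or varied the generated text is.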

Keywords: Text Generation, Recurrent Neural Networks, LSTM, GRU, Adversarial Training, Machine Translation.
Scope of the Article: Text Mining