Abstract
- Language modelling, a key task in Natural Language Processing (NLP), involves predicting the next word in a sequence and has a wide array of applications. We used Nietzsche's text corpus to build a model that predicts the next words once a user has typed n letters. The model, developed as a Recurrent Neural Network (RNN) in TensorFlow, reads the n letters of context and forecasts the most likely candidate words. Our objective was to predict 10 or more words in the shortest possible time. Thanks to the network's Long Short-Term Memory, it retains earlier content while predicting, which aids users in forming sentences and makes everyday writing easier. The model is also trained to understand and predict words in Hinglish. Specifically, we introduce a Bi-Directional Long Short-Term Memory (LSTM) network, a variant of the RNN, and use it to predict the next word for a given sequence of words.
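As a rough illustration of the architecture described above, the sketch below assembles a bidirectional LSTM next-word predictor in TensorFlow/Keras. It is a minimal sketch, not the authors' implementation: `vocab_size`, `embed_dim`, the layer sizes, and the `top_k_next_words` helper are hypothetical placeholders chosen for illustration, and corpus tokenization and training are omitted.

```python
# Minimal sketch of a bidirectional LSTM next-word predictor (assumed
# hyperparameters; not the paper's actual configuration).
import numpy as np
from tensorflow.keras import layers, models

vocab_size = 10000   # assumed vocabulary size for the corpus
embed_dim = 128      # assumed embedding dimension

model = models.Sequential([
    # Map each token id in the context window to a dense vector
    layers.Embedding(vocab_size, embed_dim),
    # Bidirectional LSTM reads the context left-to-right and right-to-left
    layers.Bidirectional(layers.LSTM(256)),
    # Softmax over the vocabulary scores every candidate next word
    layers.Dense(vocab_size, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

def top_k_next_words(encoded_context, k=10):
    """Return the token ids of the k most probable next words."""
    probs = model.predict(np.array([encoded_context]), verbose=0)[0]
    return np.argsort(probs)[-k:][::-1]
```

In this setup the model would be trained with `model.fit` on (context, next-word) pairs extracted from the corpus, after which `top_k_next_words` ranks the top candidates, mirroring the paper's goal of suggesting 10 or more words per prediction.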
Copyright
Copyright © 2023 Atharv Patil. This is an open access article distributed under the Creative Commons Attribution License.