Next Word Suggestion using Graph Neural Network
By: Abisha Thapa Magar, Anup Shakya
Potential Business Impact:
Helps computers guess the next word better.
Language modeling is a prevalent task in Natural Language Processing. The most recent and most successful language models are typically massive, with billions of parameters, trained on tremendous amounts of text using computation resources that cost millions of dollars. In this project, we address an important sub-task in language modeling: context embedding. We propose an approach that exploits the graph convolution operation in GNNs to encode the context, and use it in combination with LSTMs to predict the next word given a local context of preceding words. We test this on a custom Wikipedia text corpus using a very limited amount of resources and show that this approach works fairly well for next-word prediction.
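The core idea above can be sketched with a single graph-convolution step over a local context window: the preceding words form a small graph, one convolution (H' = ReLU(Â H W)) mixes each word's embedding with its neighbors', and the result is pooled into a context embedding that would feed the LSTM/output stage. This is a minimal NumPy illustration under assumed shapes and a chain-graph structure, not the authors' actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

n_words, d = 4, 8                    # 4-word context window, 8-dim embeddings (assumed)
H = rng.normal(size=(n_words, d))    # word embeddings, one row per context word

# Chain adjacency: link each word to its window neighbors, plus self-loops
A = np.eye(n_words)
for i in range(n_words - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0

# Symmetric normalization: A_hat = D^{-1/2} A D^{-1/2}
deg = A.sum(axis=1)
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt

# One graph-convolution layer: H' = ReLU(A_hat @ H @ W)
W = rng.normal(size=(d, d)) * 0.1
H_conv = np.maximum(0.0, A_hat @ H @ W)

# Mean-pool the convolved node features into one context embedding;
# downstream, this vector would condition an LSTM that scores the next word.
context_embedding = H_conv.mean(axis=0)
print(context_embedding.shape)  # (8,)
```

In practice the adjacency, the number of convolution layers, and the pooling choice are all modeling decisions; the sketch only shows how graph convolution turns a window of word embeddings into a single context vector.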
Similar Papers
From Text to Graph: Leveraging Graph Neural Networks for Enhanced Explainability in NLP
Computation and Language
Turns sentences into pictures to explain AI.
Research on Personalized Financial Product Recommendation by Integrating Large Language Models and Graph Neural Networks
Information Retrieval
Suggests best money products for you.
Context-level Language Modeling by Learning Predictive Context Embeddings
Computation and Language
Makes AI understand stories better, not just words.