NRGPT: An Energy-based Alternative for GPT
By: Nima Dehmamy, Benjamin Hoover, Bishwajit Saha, and more
Generative Pre-trained Transformer (GPT) architectures are the most popular design for language modeling. Energy-based modeling (EBM) is a different paradigm that views inference as a dynamical process on an energy landscape. We propose a minimal modification of the GPT setting that unifies it with the EBM framework. The inference step of our model, which we call eNeRgy-GPT (NRGPT), is conceptualized as an exploration of tokens on the energy landscape. We prove, and verify empirically, that under certain conditions this exploration reduces to gradient descent, although those conditions do not necessarily yield the best-performing models. We demonstrate that our model performs well on simple language (the Shakespeare dataset), algebraic ListOps tasks, and richer settings such as OpenWebText language modeling. We also observe that our models may be more resistant to overfitting, overfitting only after very long training.
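To make the energy-based inference picture concrete, here is a minimal, hypothetical sketch of the general idea of inference as gradient descent on an energy landscape. It is not NRGPT's actual energy function or update rule; the quadratic energy, the context vector, and the optimizer settings are illustrative assumptions only.

```python
# Hypothetical sketch (not the paper's implementation): next-token inference as
# gradient descent on an energy landscape. The toy energy
# E(x) = -x.c + 0.5*||x||^2, with c a context vector, is an assumption chosen
# only to make the dynamics concrete; its minimizer is x = c.
import torch

torch.manual_seed(0)
d = 16                                  # embedding dimension (arbitrary)
context = torch.randn(d)                # stands in for the model's context summary

def energy(x: torch.Tensor) -> torch.Tensor:
    """Toy energy: low when the candidate embedding aligns with the context."""
    return -(x @ context) + 0.5 * (x @ x)

# "Exploration of tokens on the energy landscape": start from a random
# candidate embedding and descend the energy by gradient steps.
x = torch.randn(d, requires_grad=True)
optimizer = torch.optim.SGD([x], lr=0.1)
for step in range(50):
    optimizer.zero_grad()
    E = energy(x)
    E.backward()
    optimizer.step()

# The descended embedding should be close to the energy minimizer (here, the
# context vector), illustrating inference as descent on the landscape.
print("final energy:", energy(x).item())
print("distance to context:", torch.norm(x.detach() - context).item())
```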