Protein Language Model Zero-Shot Fitness Predictions are Improved by Inference-only Dropout

Published: May 31, 2025 | arXiv ID: 2506.14793v1

By: Aditya Ravuri, Neil D. Lawrence

Potential Business Impact:

Improves computational predictions of protein fitness without any model retraining.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Protein Language Models (PLMs) such as ESM2 have been shown to be capable of zero-shot prediction of critical scalar properties of proteins (fitness). In this work, we show that injecting a dropout layer at inference time between a PLM's featurizer/embedding layer and its transformer, and averaging its outputs akin to Monte-Carlo dropout, increases zero-shot performance on a subset of the ProteinGym dataset. This holds even when the model was not trained with dropout to begin with, and requires no retraining or finetuning of the PLM. A dropout rate of 0.1 seems performant across all models.
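The averaging scheme described above can be sketched in a few lines. The snippet below is a minimal, model-agnostic illustration (not the authors' code): `transformer_score` stands in for the frozen PLM head that maps embeddings to a scalar fitness score, and the dropout mask is applied to the embedding features at inference time with the standard inverted-dropout scaling, then the stochastic scores are averaged.

```python
import numpy as np

def mc_dropout_score(embed, transformer_score, p=0.1, n_samples=20, seed=0):
    """Monte-Carlo dropout at inference time.

    embed             : embedding-layer output, shape (features,) or (tokens, features)
    transformer_score : callable mapping an embedding to a scalar score
                        (a stand-in for the frozen transformer + scoring head)
    p                 : dropout rate (the paper reports p = 0.1 working well)
    """
    rng = np.random.default_rng(seed)
    scores = []
    for _ in range(n_samples):
        mask = rng.random(embed.shape) >= p        # keep each feature with prob 1-p
        dropped = embed * mask / (1.0 - p)         # inverted-dropout scaling
        scores.append(transformer_score(dropped))  # one stochastic forward pass
    return float(np.mean(scores))                  # average over samples
```

Because of the 1/(1-p) scaling, the expected input to the scorer equals the original embedding, so for a linear scorer the averaged score converges to the deterministic one; the benefit reported in the paper comes from how a nonlinear transformer responds to this feature-level noise.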

Country of Origin
🇬🇧 United Kingdom

Page Count
3 pages

Category
Computer Science:
Machine Learning (CS)