Antibody Foundational Model: Ab-RoBERTa
By: Eunna Huh, Hyeonsu Lee, Hyunjin Shin
Potential Business Impact:
Helps make new medicines from antibodies.
With the growing prominence of antibody-based therapeutics, antibody engineering has gained increasing attention as a critical area of research and development. Recent progress in transformer-based protein large language models (LLMs) has demonstrated promising applications in protein sequence design and structural prediction. Moreover, the availability of large-scale antibody datasets such as the Observed Antibody Space (OAS) database has opened new avenues for the development of LLMs specialized in processing antibody sequences. Among transformer architectures, RoBERTa has demonstrated improved performance relative to BERT while maintaining a smaller parameter count (125M) than the BERT-based protein model ProtBERT (420M). This reduced model size enables more efficient deployment in antibody-related applications. However, despite the numerous advantages of the RoBERTa architecture, antibody-specific foundational models built upon it have remained inaccessible to the research community. In this study, we introduce Ab-RoBERTa, a RoBERTa-based antibody-specific LLM, which is publicly available at https://huggingface.co/mogam-ai/Ab-RoBERTa. This resource is intended to support a wide range of antibody-related research applications, including paratope prediction and humanness assessment.
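Since the checkpoint is published on the Hugging Face Hub, it can presumably be loaded with the standard `transformers` API. The sketch below shows one way to extract per-sequence embeddings for downstream tasks such as paratope prediction; the repository id comes from the abstract, while the mean-pooling strategy and the assumption that raw one-letter amino acid strings are accepted as input are illustrative choices, not documented behavior of the released model.

```python
# Sketch: embedding an antibody sequence with the public Ab-RoBERTa
# checkpoint. Pooling strategy and input format are assumptions based on
# common RoBERTa-style protein models, not confirmed model documentation.

VALID_AA = set("ACDEFGHIKLMNPQRSTVWY")  # 20 standard one-letter codes

def is_valid_antibody_sequence(seq: str) -> bool:
    """Return True if seq contains only standard one-letter amino acid codes."""
    return bool(seq) and set(seq.upper()) <= VALID_AA

def embed_sequence(seq: str):
    """Return a mean-pooled last-hidden-state embedding for one sequence."""
    # Lazy imports: torch and transformers are heavy optional dependencies.
    import torch
    from transformers import AutoModel, AutoTokenizer

    if not is_valid_antibody_sequence(seq):
        raise ValueError(f"non-standard residues in: {seq!r}")
    tokenizer = AutoTokenizer.from_pretrained("mogam-ai/Ab-RoBERTa")
    model = AutoModel.from_pretrained("mogam-ai/Ab-RoBERTa")
    model.eval()
    with torch.no_grad():
        outputs = model(**tokenizer(seq, return_tensors="pt"))
    # Average over the sequence dimension -> vector of shape (hidden_size,)
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

# Usage (downloads the checkpoint on first call):
#   vec = embed_sequence("EVQLVESGGGLVQPGGSLRLSCAAS")  # toy VH fragment
```

A RoBERTa-base encoder has a 768-dimensional hidden state, so each sequence would map to a 768-dimensional vector suitable as input to a lightweight classifier for humanness assessment or similar property-prediction tasks.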
Similar Papers
Exploring Protein Language Model Architecture-Induced Biases for Antibody Comprehension
Machine Learning (CS)
Helps design better medicines by understanding the body's defenses.
Llama-Affinity: A Predictive Antibody Antigen Binding Model Integrating Antibody Sequences with Llama3 Backbone Architecture
Machine Learning (CS)
Finds the best antibody drugs for diseases faster.
Improved Therapeutic Antibody Reformatting through Multimodal Machine Learning
Machine Learning (CS)
Helps doctors make better medicines faster.