LLM Empowered Prototype Learning for Zero and Few-Shot Tasks on Tabular Data

Published: August 12, 2025 | arXiv ID: 2508.09263v1

By: Peng Wang, Dongsheng Wang, He Zhao, and more

Potential Business Impact:

Models can classify tabular data with few or no labeled examples.

Recent breakthroughs in large language models (LLMs) have opened the door to in-depth investigation of their potential in tabular data modeling. However, effectively utilizing advanced LLMs in few-shot and even zero-shot scenarios remains challenging. To this end, we propose a novel LLM-based prototype estimation framework for tabular learning. Our key idea is to query the LLM to generate feature values based on an example-free prompt, which relies solely on task and feature descriptions. With the feature values generated by the LLM, we can build a zero-shot prototype in a training-free manner, which can be further enhanced by fusing few-shot samples, avoiding the need to train a classifier or fine-tune the LLM. Thanks to the example-free prompt and prototype estimation, our framework bypasses the constraints imposed by example-based prompts, providing a scalable and robust approach. Extensive experiments demonstrate its effectiveness in zero- and few-shot tabular learning.
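The pipeline the abstract describes can be sketched in a few lines: average LLM-generated feature vectors into a per-class zero-shot prototype, optionally fuse in few-shot samples, and classify by nearest prototype. This is a minimal illustration, not the paper's implementation; the fusion weight `alpha` and the Euclidean distance are assumptions, and `llm_feature_values` stands in for whatever the LLM returns from the example-free prompt.

```python
import numpy as np

def build_prototypes(llm_feature_values, few_shot=None, alpha=0.5):
    """Estimate one prototype per class.

    llm_feature_values: dict mapping class -> (n_queries, n_features)
        array of feature vectors generated by the LLM from task and
        feature descriptions alone (the example-free prompt).
    few_shot: optional dict mapping class -> (k, n_features) array of
        real labeled samples used to refine the zero-shot prototype.
    alpha: hypothetical fusion weight between the LLM-based and
        few-shot estimates (the abstract does not specify this).
    """
    prototypes = {}
    for cls, vals in llm_feature_values.items():
        # Zero-shot prototype: mean of LLM-generated feature vectors.
        proto = np.asarray(vals, dtype=float).mean(axis=0)
        if few_shot and cls in few_shot:
            # Few-shot enhancement: blend in the mean of labeled samples.
            shot_mean = np.asarray(few_shot[cls], dtype=float).mean(axis=0)
            proto = alpha * proto + (1 - alpha) * shot_mean
        prototypes[cls] = proto
    return prototypes

def classify(x, prototypes):
    """Assign x to the class of the nearest prototype (Euclidean distance)."""
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))
```

No classifier is trained and no LLM is fine-tuned: once the prototypes exist, inference is a single nearest-prototype lookup per sample.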

Country of Origin
🇨🇳 China

Page Count
23 pages

Category
Computer Science:
Machine Learning (CS)