Can Language Models Compose Skills In-Context?

Published: October 27, 2025 | arXiv ID: 2510.22993v1

By: Zidong Liu, Zhuoyan Xu, Zhenmei Shi, and more

Potential Business Impact:

Teaches computers to combine simple skills for complex tasks.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Composing basic skills from simple tasks to accomplish composite tasks is crucial for modern intelligent systems. We investigate the in-context composition ability of language models to perform composite tasks that combine basic skills demonstrated in in-context examples. This is more challenging than the standard setting, where skills and their composition can be learned in training. We conduct systematic experiments on various representative open-source language models, utilizing linguistic and logical tasks designed to probe composition abilities. The results reveal that simple task examples can have a surprising negative impact on the performance, because the models generally struggle to recognize and assemble the skills correctly, even with Chain-of-Thought examples. Theoretical analysis further shows that it is crucial to align examples with the corresponding steps in the composition. This inspires a method for the probing tasks, whose improved performance provides positive support for our insights.
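The probing setup described above can be sketched in code. Below is a minimal, hypothetical illustration (the specific skills, prompt layout, and function names are assumptions, not taken from the paper): two basic skills — last-letter extraction and uppercasing — are demonstrated separately as in-context examples, and the model would then be queried on their composition.

```python
# Hypothetical sketch of an in-context skill-composition probe.
# The skills and prompt format here are illustrative only.

def last_letter(word: str) -> str:
    """Basic skill A: return the last letter of a word."""
    return word[-1]

def to_upper(ch: str) -> str:
    """Basic skill B: uppercase a single character."""
    return ch.upper()

def composite(word: str) -> str:
    """Composite task: uppercase the last letter of a word."""
    return to_upper(last_letter(word))

def build_prompt(skill_a_words, skill_b_chars, query_word):
    """Assemble a prompt demonstrating each basic skill separately,
    then posing the composite query for the model to answer."""
    lines = ["Skill A (last letter):"]
    lines += [f"  {w} -> {last_letter(w)}" for w in skill_a_words]
    lines.append("Skill B (uppercase):")
    lines += [f"  {c} -> {to_upper(c)}" for c in skill_b_chars]
    lines.append("Composite (uppercase last letter):")
    lines.append(f"  {query_word} -> ?")
    return "\n".join(lines)

prompt = build_prompt(["apple", "sky"], ["a", "m"], "river")
print(prompt)
```

A probe like this compares the model's answer against `composite("river")` (here `"R"`); the paper's finding is that models often fail such queries even when each basic skill is demonstrated correctly in context.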

Country of Origin
πŸ‡­πŸ‡° πŸ‡ΊπŸ‡Έ Hong Kong, United States

Page Count
34 pages

Category
Computer Science:
Machine Learning (CS)