The Art of Asking: Multilingual Prompt Optimization for Synthetic Data
By: David Mora, Viraat Aryabumi, Wei-Yin Ko, and more
Potential Business Impact:
Makes AI understand many languages better.
Synthetic data has become a cornerstone for scaling large language models, yet its multilingual use remains bottlenecked by translation-based prompts. This strategy inherits English-centric framing and style and neglects cultural dimensions, ultimately constraining model generalization. We argue that the overlooked prompt space, the very inputs that define training distributions, offers a more powerful lever for improving multilingual performance. We introduce a lightweight framework for prompt-space optimization, where translated prompts are systematically transformed for Naturalness, Cultural Adaptation, and Difficulty Enhancement. Using an off-the-shelf multilingual LLM, we apply these transformations to prompts for 12 languages spanning 7 language families. Under identical data conditions, our approach achieves substantial and consistent downstream improvements over the translation-only baseline: +4.7% accuracy on Global-MMLU, +2.4% on Flores XCometXL, and a +35.3% win rate in preference evaluations on mArenaHard. We establish prompt-space optimization as a simple yet powerful paradigm for building multilingual LLMs that are more robust, culturally grounded, and globally capable.
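To make the framework concrete, here is a minimal sketch of what such a prompt-space optimization pass might look like. The meta-prompt wording, the `llm_generate` helper, and the choice to apply the three transformations in sequence are all illustrative assumptions on my part; the abstract does not specify the exact prompts or pipeline order.

```python
# Minimal sketch of prompt-space optimization over translated prompts.
# NOTE: llm_generate, the meta-prompt wording, and the sequential pipeline
# are illustrative assumptions, not the authors' published implementation.

# Hypothetical meta-prompts, one per transformation named in the paper.
TRANSFORMS = {
    "naturalness": (
        "Rewrite the following {lang} prompt so it reads like fluent, "
        "natively written {lang}, removing translation artifacts:\n\n{prompt}"
    ),
    "cultural_adaptation": (
        "Adapt the following {lang} prompt so its names, places, and "
        "examples fit {lang}-speaking cultural contexts:\n\n{prompt}"
    ),
    "difficulty_enhancement": (
        "Rewrite the following {lang} prompt to be more challenging while "
        "preserving its intent and answerability:\n\n{prompt}"
    ),
}

def llm_generate(meta_prompt: str) -> str:
    """Placeholder for a call to an off-the-shelf multilingual LLM.

    Swap in your own client (e.g., any chat-completions API)."""
    raise NotImplementedError("wire up your multilingual LLM here")

def optimize_prompt(translated_prompt: str, lang: str) -> str:
    """Apply the three transformations, in sequence, to one translated prompt."""
    prompt = translated_prompt
    for template in TRANSFORMS.values():
        prompt = llm_generate(template.format(lang=lang, prompt=prompt))
    return prompt

# Example usage:
# optimized = optimize_prompt("¿Cuál es la capital de Francia?", "Spanish")
```

The key design point the paper emphasizes is that only the prompts change: the optimized prompts feed the same synthetic-data generation pipeline as the translated baseline, so any downstream gain is attributable to the prompt space itself.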
Similar Papers
Cross-Lingual Prompt Steerability: Towards Accurate and Robust LLM Behavior across Languages
Computation and Language
Makes AI understand and work in many languages.
LatentPrompt: Optimizing Prompts in Latent Space
Computation and Language
Makes AI understand tasks better, automatically.
Multilingual Prompt Engineering in Large Language Models: A Survey Across NLP Tasks
Computation and Language
Helps computers understand many languages better.