Across Programming Language Silos: A Study on Cross-Lingual Retrieval-augmented Code Generation

Published: June 4, 2025 | arXiv ID: 2506.03535v1

By: Qiming Zhu, Jialun Cao, Xuanang Chen, and more

Potential Business Impact:

Helps automated tools generate code and migrate codebases across programming languages more reliably.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Current research on large language models (LLMs) with retrieval-augmented code generation (RACG) mainly focuses on single-language settings, leaving cross-lingual effectiveness and security unexplored. Multi-lingual RACG systems are valuable for migrating codebases across programming languages (PLs), yet face risks from error propagation (e.g., adversarial data corruption) in cross-lingual transfer. We construct a dataset spanning 13 PLs with nearly 14k instances to explore the utility and robustness of multi-lingual RACG systems. Our investigation reveals four key insights: (1) Effectiveness: multi-lingual RACG significantly enhances code generation by multi-lingual code LLMs; (2) Inequality: Java demonstrates superior cross-lingual utility over Python in RACG; (3) Robustness: adversarial attacks degrade performance significantly in mono-lingual RACG but show mitigated impact in cross-lingual scenarios; counterintuitively, perturbed code may even improve RACG in cross-lingual scenarios; (4) Specialization: domain-specific code retrievers significantly outperform general text retrievers. These findings establish a foundation for developing effective and secure multi-lingual code assistants.
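To make the setup concrete, below is a minimal sketch of the kind of cross-lingual RACG pipeline the abstract describes: retrieve reference snippets (possibly written in a different PL, e.g. Java) for a target task and prepend them to the generation prompt for a code LLM. All identifiers, the bag-of-words retriever, and the prompt wording are illustrative assumptions, not the authors' implementation; the paper's results suggest a domain-specific code retriever would replace the toy similarity function used here.

```python
# Illustrative cross-lingual RACG prompt builder (sketch, not the paper's system).
from collections import Counter
from math import sqrt


def _bow(text: str) -> Counter:
    """Crude bag-of-words over tokens; stands in for a real code retriever."""
    return Counter(text.replace("(", " ").replace(")", " ").split())


def _cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse token-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, corpus: list[dict], k: int = 2) -> list[dict]:
    """Rank snippets from any source PL against the target task description."""
    qv = _bow(query)
    return sorted(corpus, key=lambda d: _cosine(qv, _bow(d["code"])), reverse=True)[:k]


def build_prompt(task: str, target_lang: str, retrieved: list[dict]) -> str:
    """Prepend cross-lingual exemplars to the prompt sent to the code LLM."""
    parts = [f"# Reference ({d['lang']}):\n{d['code']}" for d in retrieved]
    parts.append(f"# Task: implement the following in {target_lang}:\n# {task}")
    return "\n\n".join(parts)


if __name__ == "__main__":
    corpus = [
        {"lang": "Java", "code": "int add(int a, int b) { return a + b; }"},
        {"lang": "Python", "code": "def greet(name): return 'hi ' + name"},
    ]
    hits = retrieve("add two integers", corpus, k=1)
    print(build_prompt("add two integers", "Python", hits))
```

In this sketch the retrieved Java snippet serves as a cross-lingual exemplar for generating Python, mirroring the migration scenario the paper targets; robustness questions arise when such retrieved snippets are adversarially perturbed.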

Country of Origin
🇭🇰 Hong Kong

Page Count
12 pages

Category
Computer Science:
Software Engineering