BYOS: Knowledge-driven Large Language Models Bring Your Own Operating System More Excellent
By: Hongyu Lin, Yuchen Li, Haoran Luo, and more
Potential Business Impact:
Automatically tunes operating-system settings to make computers run significantly faster.
Operating System (OS) kernel tuning involves systematically adjusting kernel configurations to optimize system performance. Despite recent advancements in large language models (LLMs), kernel tuning remains a critical challenge due to: (1) the semantic gap between abstract tuning objectives and concrete configuration options, (2) insufficient environmental interaction, which induces LLM hallucinations, and (3) the rapid evolution of kernel versions. To address these challenges, we propose BYOS, an LLM-powered framework that automates kernel tuning through three key innovations: structured knowledge construction and mapping, knowledge-driven configuration generation, and continuous knowledge maintenance. Extensive experiments show that BYOS achieves 7.1%-155.4% performance improvements over default configurations across standard OS benchmarks and real-world applications, demonstrating that structured knowledge representation can overcome key limitations of pure LLM solutions for system optimization. Our code is available at https://github.com/LHY-24/BYOS.
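To make the "semantic gap" concrete, the sketch below shows the simplest possible form of knowledge-driven configuration generation: a structured knowledge base that maps an abstract tuning objective (e.g., "throughput") to concrete kernel config options. This is an illustrative assumption, not the authors' implementation — BYOS uses LLMs over a much richer knowledge structure — and the objective names and option/value pairings here are chosen for illustration (the `CONFIG_*` symbols themselves are real Linux Kconfig options).

```python
# Illustrative sketch (NOT the BYOS implementation): bridging an abstract
# tuning objective to concrete kernel config options via a small structured
# knowledge base. Objective names and option/value pairings are assumptions.

KNOWLEDGE_BASE = {
    "throughput": {
        "CONFIG_HZ_100": "y",        # lower timer frequency reduces overhead
        "CONFIG_PREEMPT_NONE": "y",  # no kernel preemption, fewer context switches
    },
    "latency": {
        "CONFIG_HZ_1000": "y",       # higher timer frequency, finer scheduling
        "CONFIG_PREEMPT": "y",       # preemptible kernel for responsiveness
    },
}

def generate_config(objective: str) -> str:
    """Translate an abstract tuning objective into a kernel .config fragment."""
    options = KNOWLEDGE_BASE.get(objective, {})
    return "\n".join(f"{name}={value}" for name, value in sorted(options.items()))

if __name__ == "__main__":
    print(generate_config("throughput"))
```

The point of the structured intermediate representation is that the mapping from objective to options is auditable and versionable, which is what lets a framework like BYOS ground LLM outputs and keep pace with kernel evolution.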
Similar Papers
OSVBench: Benchmarking LLMs on Specification Generation Tasks for Operating System Verification
Computation and Language
Tests whether AI models can write the formal specifications needed to verify operating systems.
Composable OS Kernel Architectures for Autonomous Intelligence
Operating Systems
Designs operating-system kernels that can be composed to support autonomous AI workloads.
MaLV-OS: Rethinking the Operating System Architecture for Machine Learning in Virtualized Clouds
Operating Systems
Rethinks the operating system so machine-learning programs run faster in virtualized clouds.