Score: 1

MIRA: Empowering One-Touch AI Services on Smartphones with MLLM-based Instruction Recommendation

Published: September 17, 2025 | arXiv ID: 2509.13773v1

By: Zhipeng Bian, Jieming Zhu, Xuyang Xie, and more

BigTech Affiliations: Huawei

Potential Business Impact:

Lets your phone suggest which AI task to run when you long-press on text or images.

Business Areas:
Machine Learning, Artificial Intelligence, Data and Analytics, Software

The rapid advancement of generative AI technologies is driving the integration of diverse AI-powered services into smartphones, transforming how users interact with their devices. To simplify access to predefined AI services, this paper introduces MIRA, a pioneering framework for task instruction recommendation that enables intuitive one-touch AI tasking on smartphones. With MIRA, users can long-press on image or text objects to receive contextually relevant instruction recommendations for executing AI tasks. The work introduces three key innovations: 1) a multimodal large language model (MLLM)-based recommendation pipeline with structured reasoning to extract key entities, infer user intent, and generate precise instructions; 2) a template-augmented reasoning mechanism that integrates high-level reasoning templates to enhance task inference accuracy; 3) a prefix-tree-based constrained decoding strategy that restricts outputs to predefined instruction candidates, ensuring coherent and intent-aligned suggestions. Through evaluation on a real-world annotated dataset and a user study, MIRA demonstrated substantial improvements in the accuracy of instruction recommendation. The encouraging results highlight MIRA's potential to change how users engage with AI services on their smartphones, offering a more seamless and efficient experience.
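To make the third innovation concrete, below is a minimal sketch of prefix-tree (trie) constrained decoding, assuming the predefined instruction candidates have already been tokenized into integer token-ID sequences. The class and function names, and the example candidates, are illustrative assumptions; the abstract does not specify the paper's actual implementation.

```python
# Illustrative sketch (not the paper's code): a trie over tokenized instruction
# candidates, used to restrict each decoding step to tokens that keep the
# generated prefix on a path toward some valid candidate instruction.

from typing import Dict, List


class TrieNode:
    def __init__(self) -> None:
        self.children: Dict[int, "TrieNode"] = {}


def build_trie(candidate_token_ids: List[List[int]]) -> TrieNode:
    """Insert every candidate instruction (as a token-ID sequence) into a trie."""
    root = TrieNode()
    for token_ids in candidate_token_ids:
        node = root
        for token_id in token_ids:
            node = node.children.setdefault(token_id, TrieNode())
    return root


def allowed_next_tokens(root: TrieNode, prefix: List[int]) -> List[int]:
    """Return the token IDs that extend the prefix toward at least one candidate."""
    node = root
    for token_id in prefix:
        node = node.children.get(token_id)
        if node is None:
            return []  # prefix has left the candidate set; nothing is allowed
    return list(node.children.keys())


# Hypothetical example: three instruction candidates, already tokenized.
candidates = [[5, 12, 7], [5, 12, 9], [8, 3]]
trie = build_trie(candidates)
print(allowed_next_tokens(trie, [5, 12]))  # -> [7, 9]
print(allowed_next_tokens(trie, [8]))      # -> [3]
```

In practice, a function like `allowed_next_tokens` could be plugged into a decoder's allowed-token hook (for example, a `prefix_allowed_tokens_fn`-style callback in common generation APIs) so that the MLLM can only emit one of the predefined instruction candidates.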

Country of Origin
🇨🇳 China

Page Count
9 pages

Category
Computer Science:
Artificial Intelligence