Score: 1

iSHIFT: Lightweight Slow-Fast GUI Agent with Adaptive Perception

Published: December 26, 2025 | arXiv ID: 2512.22009v1

By: Sarthak Mehrotra, Sairam V C Rebbapragada, Mani Hemanth Reddy Bonthu, and more

Potential Business Impact:

Helps lightweight software agents understand and operate app interfaces more accurately and efficiently.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Multimodal Large Language Models (MLLMs) show strong potential for interpreting and interacting with complex, pixel-rich Graphical User Interface (GUI) environments. However, building agents that are both efficient for high-level tasks and precise for fine-grained interactions remains challenging. GUI agents must perform routine actions efficiently while also handling tasks that demand exact visual grounding, yet existing approaches struggle when accuracy depends on identifying specific interface elements. These MLLMs also remain large and cannot adapt their reasoning depth to the task at hand. In this work, we introduce iSHIFT: Implicit Slow-fast Hybrid Inference with Flexible Tokens, a lightweight agent that integrates latent thinking (implicit chain-of-thought) with a perception control module. iSHIFT enables an MLLM to switch between a slow mode, which leverages detailed visual grounding for high precision, and a fast mode, which uses global cues for efficiency. Special perception tokens guide attention to relevant screen regions, allowing the model to decide both how to reason and where to focus. Despite its compact 2.5B-parameter size, iSHIFT matches state-of-the-art performance on multiple benchmark datasets.
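To make the slow-fast idea concrete, here is a minimal Python sketch of how a mode-switching agent might assemble its input. This is an illustration only, not the paper's implementation: the names ScreenRegion, build_prompt, FAST_TOKEN, SLOW_TOKEN, and the saliency threshold are all hypothetical stand-ins for iSHIFT's learned perception tokens and implicit gating.

```python
# Hypothetical sketch of slow/fast mode selection with perception tokens.
# The heuristic gating below is an assumption; iSHIFT learns this behavior.
from dataclasses import dataclass
from typing import List


@dataclass
class ScreenRegion:
    """A candidate UI region with a rough saliency score from a cheap pass."""
    name: str
    bbox: tuple  # (x0, y0, x1, y1) in pixels
    saliency: float


FAST_TOKEN = "<fast_perceive>"  # hypothetical token: global, coarse cues
SLOW_TOKEN = "<slow_perceive>"  # hypothetical token: grounded, fine detail


def build_prompt(task: str, regions: List[ScreenRegion],
                 grounding_threshold: float = 0.6) -> str:
    """Pick a reasoning mode for one task and emit the corresponding tokens.

    If no region is confidently salient, fall back to the slow mode so the
    model attends to specific screen regions; otherwise use the fast mode
    with only a global hint. This mirrors the slow/fast idea at a high
    level and is not the authors' actual gating rule.
    """
    best = max(regions, key=lambda r: r.saliency, default=None)
    if best is None or best.saliency < grounding_threshold:
        # Slow mode: attach perception tokens pointing at candidate regions.
        hints = " ".join(
            f"{SLOW_TOKEN} {r.name}@{r.bbox}"
            for r in sorted(regions, key=lambda r: -r.saliency)[:3]
        )
        return f"{task}\n{hints}"
    # Fast mode: rely on global cues only.
    return f"{task}\n{FAST_TOKEN} focus={best.name}"


if __name__ == "__main__":
    regions = [
        ScreenRegion("search_box", (40, 10, 400, 50), saliency=0.82),
        ScreenRegion("settings_icon", (900, 10, 940, 50), saliency=0.35),
    ]
    print(build_prompt("Type 'weather today' into the search field", regions))
```

In the paper itself the decision of how to reason and where to focus is made implicitly by the model via special perception tokens; the threshold-based dispatcher above only serves to show where such a decision sits in the inference loop.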

Country of Origin
🇮🇳 India

Page Count
34 pages

Category
Computer Science:
Computer Vision and Pattern Recognition