Real-Time Wayfinding Assistant for Blind and Low-Vision Users
By: Dabbrata Das, Argho Deb Das, Farhan Sadaf
Potential Business Impact:
Helps blind people navigate safely and quickly.
Navigating unfamiliar places remains one of the most persistent everyday challenges for people who are blind or have low vision (BLV). Existing assistive technologies, such as GPS-based navigation systems, AI-powered smart glasses, and sonar-equipped canes, often fall short in real-time obstacle avoidance, precise localization, and adaptability to dynamic surroundings. To investigate potential solutions, we introduce PathFinder, a novel map-less navigation system that explores Vision Language Models (VLMs) and Large Language Models (LLMs) for understanding 2D scenes and employs monocular depth estimation for free-path detection. Our approach applies a Depth-First Search (DFS) algorithm to depth images to determine the longest obstacle-free path, ensuring optimal route selection while maintaining computational efficiency. We conducted comparative evaluations against existing AI-powered navigation methods and performed a usability study with BLV participants. The results demonstrate that PathFinder achieves a favorable balance between accuracy, computational efficiency, and real-time responsiveness. Notably, it reduces mean absolute error (MAE) and improves decision-making speed in outdoor navigation compared with AI-based alternatives. Participant feedback highlights the system's usability and effectiveness in outdoor settings, but also identifies challenges in complex indoor environments and low-light conditions. Usability testing showed that 73% of participants learned to use the app in about a minute, and 80% praised its balance of accuracy, quick response, and overall convenience.
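
The DFS step over a depth image can be pictured with a small sketch. The Python snippet below is a minimal illustration rather than the authors' implementation: it assumes a monocular depth map in which larger values mean "farther away", a fixed clearance threshold that marks sufficiently distant pixels as free space, and a start point at the bottom-centre of the frame (roughly where the user stands). The function name, threshold, and 4-connected traversal are illustrative assumptions.

import numpy as np

def longest_free_path(depth: np.ndarray, clearance: float = 2.0):
    """Return the longest DFS path through 'free' pixels of a depth map (sketch)."""
    free = depth >= clearance                 # assumed convention: far pixels are free space
    h, w = free.shape
    start = (h - 1, w // 2)                   # bottom-centre of the frame
    if not free[start]:
        return []

    best_path = []
    # Iterative DFS (explicit stack) so large frames don't hit Python's recursion limit.
    stack = [(start, [start])]
    visited = {start}
    while stack:
        (r, c), path = stack.pop()
        if len(path) > len(best_path):
            best_path = path                  # keep the deepest path seen so far
        # Explore 4-connected neighbours that are free and not yet visited.
        for dr, dc in ((-1, 0), (0, -1), (0, 1), (1, 0)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and free[nr, nc] and (nr, nc) not in visited:
                visited.add((nr, nc))
                stack.append(((nr, nc), path + [(nr, nc)]))
    return best_path

if __name__ == "__main__":
    # Toy depth map: a corridor of far pixels flanked by near obstacles.
    depth = np.full((6, 5), 0.5)
    depth[:, 2] = 5.0                         # one free column up the middle
    path = longest_free_path(depth)
    print(f"Longest free path covers {len(path)} pixels, ending at {path[-1]}")

In a full pipeline, the free-space mask would come from the monocular depth estimator's output for each camera frame, and the selected path would then be turned into navigation guidance; the sketch only shows the search itself.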
Similar Papers
Blind-Wayfarer: A Minimalist, Probing-Driven Framework for Resilient Navigation in Perception-Degraded Environments
Robotics
Helps robots find their way without seeing.
GuideNav: User-Informed Development of a Vision-Only Robotic Navigation Assistant For Blind Travelers
Robotics
Helps robots learn paths like guide dogs.
Fine-Tuning Vision-Language Models for Visual Navigation Assistance
CV and Pattern Recognition
Helps blind people navigate indoors with voice.