Building Egocentric Procedural AI Assistant: Methods, Benchmarks, and Challenges
By: Junlong Li, Huaiyuan Xu, Sijie Cheng, and more
Potential Business Impact:
Helps AI watch tasks through your eyes and guide you step by step.
Driven by recent advances in vision language models (VLMs) and egocentric perception research, we introduce the concept of an egocentric procedural AI assistant (EgoProceAssist), tailored to support daily procedural tasks step by step from a first-person view. In this work, we start by identifying three core tasks: egocentric procedural error detection, egocentric procedural learning, and egocentric procedural question answering. These tasks define the essential functions of EgoProceAssist within a new taxonomy. Specifically, our work encompasses a comprehensive review of current techniques, relevant datasets, and evaluation metrics across these three core areas. To clarify the gap between the proposed EgoProceAssist and existing VLM-based AI assistants, we introduce novel experiments and provide a comprehensive evaluation of representative VLM-based methods. Based on these findings and our technical analysis, we discuss the challenges ahead and suggest future research directions. Furthermore, an exhaustive list of the works surveyed in this study is publicly available in an active repository that continuously collects the latest work: https://github.com/z1oong/Building-Egocentric-Procedural-AI-Assistant
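To make the first core task concrete, here is a minimal Python sketch of how a VLM might be queried for egocentric procedural error detection. This is not the paper's method: `query_vlm`, the step list, and the verdict format are hypothetical stand-ins for whatever model API and task recipe an implementation actually uses.

```python
# Minimal sketch of egocentric procedural error detection with a VLM.
# `query_vlm` is a hypothetical callable wrapping any vision-language model
# (prompt + images in, text out); recipe steps and frames are placeholders.
from typing import Callable, Dict, List


def detect_procedural_errors(
    frames: List[bytes],           # egocentric video frames for one step window
    steps: List[str],              # expected procedure, one description per step
    query_vlm: Callable[..., str], # hypothetical VLM API
) -> Dict[str, list]:
    """Ask the VLM, step by step, whether the first-person footage
    deviates from the expected procedure."""
    per_step = []
    for i, step in enumerate(steps):
        prompt = (
            f"Expected step {i + 1}: {step}\n"
            "Do the attached first-person frames show this step being "
            "performed correctly? Answer 'ok' or briefly describe the error."
        )
        verdict = query_vlm(prompt=prompt, images=frames)  # hypothetical call
        per_step.append({"step": step, "verdict": verdict})
    return {"per_step": per_step}
```

In practice, a system in this vein would segment the video into step-aligned windows first and pass each window's frames separately; the single shared `frames` list above is a simplification.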
Similar Papers
Perceiving and Acting in First-Person: A Dataset and Benchmark for Egocentric Human-Object-Human Interactions
CV and Pattern Recognition
AI learns to help people by watching and listening.
TeleEgo: Benchmarking Egocentric AI Assistants in the Wild
CV and Pattern Recognition
Teaches AI to remember and understand your whole day.
PhysBrain: Human Egocentric Data as a Bridge from Vision Language Models to Physical Intelligence
Robotics
Teaches robots to learn from watching people.