External Human-Machine Interface based on Intent Recognition: Framework Design and Experimental Validation
By: Boya Sun, Haotian Shi, Ying Ni, and more
The increasing presence of autonomous vehicles (AVs) in transportation systems makes effective interaction between AVs and pedestrians indispensable. External human--machine interfaces (eHMIs), which employ visual or auditory cues to explicitly convey vehicle behaviors, can compensate for the loss of human-like interactions and enhance AV--pedestrian cooperation. To facilitate faster intent convergence between pedestrians and AVs, this study incorporates an adaptive interaction mechanism, based on pedestrian intent recognition, into the eHMI, termed IR-eHMI. IR-eHMI dynamically detects and infers the behavioral intentions of both pedestrians and AVs by identifying their cooperation states. The proposed interaction framework is implemented and evaluated on a virtual reality (VR) experimental platform, and its effectiveness is demonstrated through statistical analysis. Experimental results show that, compared with a traditional fixed-distance eHMI, IR-eHMI significantly improves crossing efficiency and reduces gaze distraction while maintaining interaction safety. This adaptive and explicit interaction mode introduces an innovative procedural paradigm for AV--pedestrian cooperation.