What Do People Want to Know About Artificial Intelligence (AI)? The Importance of Answering End-User Questions to Explain Autonomous Vehicle (AV) Decisions
By: Somayeh Molaei, Lionel P. Robert, Nikola Banovic
Potential Business Impact:
Explains why self-driving cars make choices.
Improving end-users' understanding of decisions made by artificial intelligence (AI)-driven autonomous vehicles (AVs) can increase the utilization and acceptance of AVs. However, current explanation mechanisms primarily help AI researchers and engineers debug and monitor their AI systems, and may not address the specific questions that end-users, such as passengers, have about AVs in various scenarios. In this paper, we conducted two user studies to investigate the questions that potential AV passengers might pose while riding in an AV and to evaluate how well answers to those questions improve their understanding of AI-driven AV decisions. Our initial formative study identified a range of questions about AI in autonomous driving that existing explanation mechanisms do not readily address. Our second study demonstrated that interactive text-based explanations improved participants' comprehension of AV decisions more effectively than simply observing AV decisions. These findings inform the design of interactions that motivate end-users to engage with and inquire about the reasoning behind AI-driven AV decisions.
Similar Papers
Improving Human-Autonomous Vehicle Interaction in Complex Systems
Human-Computer Interaction
Makes self-driving cars better for different people.
Towards Balancing Preference and Performance through Adaptive Personalized Explainability
Human-Computer Interaction
Helps robots explain their choices to people.
"I don't like things where I do not have control": Participants' Experience of Trustworthy Interaction with Autonomous Vehicles
Human-Computer Interaction
Makes self-driving cars more trusted by people.