Facial Emotion Recognition does not detect feeling unsafe in automated driving
By: Abel van Elburg, Konstantinos Gkentsidis, Mathieu Sarrazin, and more
Potential Business Impact:
Helps measure how safe riders feel in self-driving cars.
Trust and perceived safety play a crucial role in the public acceptance of automated vehicles. To understand perceived risk, a driving-simulator experiment was conducted with two automated driving styles, with and without a crossing pedestrian. Data were collected from 32 participants: continuous subjective comfort ratings, vehicle motion, webcam footage for facial expression analysis, skin conductance, heart rate, and eye tracking. The continuous perceived-risk ratings showed significant discomfort during cornering and braking, followed by relief or even positive comfort as the ride continued. The dynamic driving style induced stronger discomfort than the calm driving style. The crossing pedestrian did not affect discomfort under the calm driving style but doubled the comfort decrement under the dynamic driving style, illustrating how the consequences of critical interactions shape risk perception. Facial expressions were successfully analyzed for 24 participants, but most (15/24) showed no detectable facial reaction to the critical event. Among the 9 participants who did react, 8 showed a Happy expression and only 4 showed a Surprise expression; Fear was never dominant. This indicates that facial expression recognition is not a reliable method for assessing perceived risk in automated vehicles. To predict perceived risk, a neural network model was implemented using vehicle motion and skin conductance. Its output correlated well with reported perceived risk, demonstrating its potential for objective perceived-risk assessment in automated vehicles, reducing subjective bias and highlighting areas for future research.
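As a rough illustration of that final step, the sketch below shows how a small neural network could map windowed vehicle-motion and skin-conductance features to a continuous perceived-risk rating, and how its predictions could then be correlated with reported risk. The paper does not publish its architecture or feature set here, so the PyTorch MLP, the four example features, the window representation, and the placeholder random data are all assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): regress a continuous perceived-risk
# rating from per-window vehicle-motion and skin-conductance features.
# Feature choice, window length, and architecture are assumptions.
import torch
import torch.nn as nn

class PerceivedRiskNet(nn.Module):
    """Maps one feature vector per time window to a scalar perceived-risk score."""
    def __init__(self, n_features: int = 4, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x).squeeze(-1)

# Placeholder tensors standing in for real windowed signals:
# columns = [longitudinal accel, lateral accel, yaw rate, skin conductance level]
X = torch.randn(256, 4)      # 256 time windows, 4 features each
y = torch.rand(256) * 10.0   # continuous subjective ratings, e.g. a 0-10 scale

model = PerceivedRiskNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):     # short training loop on the synthetic data
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()

# Correlate predicted and reported risk, mirroring the kind of evaluation
# the abstract describes ("correlated well with reported perceived risk").
with torch.no_grad():
    pred = model(X)
    corr = torch.corrcoef(torch.stack([pred, y]))[0, 1]
    print(f"Pearson r on the placeholder data: {corr.item():.2f}")
```

In a real setting the random tensors would be replaced by features extracted from the simulator's motion log and the skin-conductance signal, aligned with the continuous subjective ratings as the regression target.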
Similar Papers
Reading minds on the road: decoding perceived risk in automated vehicles through 140K+ ratings
Human-Computer Interaction
Makes self-driving cars understand rider fear.
The Mediating Effects of Emotions on Trust through Risk Perception and System Performance in Automated Driving
Human-Computer Interaction
Makes self-driving cars more trustworthy through feelings.
Research on a Driver's Perceived Risk Prediction Model Considering Traffic Scene Interaction
Human-Computer Interaction
Makes self-driving cars safer by predicting danger.