Reading Decisions from Gaze Direction during Graphics Turing Test of Gait Animation
By: Benjamin Knopp, Daniel Auras, Alexander C. Schütz, and more
Potential Business Impact:
Eyes show what you think about moving bodies.
We investigated gaze direction during movement observation. The eye movement data were collected during an experiment in which different models of movement production, based on movement primitives (MPs), were compared in a two-alternative forced-choice (2AFC) task. Participants observed a side-by-side presentation of two naturalistic 3D-rendered human movement videos: one video was based on a motion-captured gait sequence, while the other was generated by recombining machine-learned MPs to approximate the same movement. The participants' task was to discriminate between these movements while their eye movements were recorded. Here, we complement the previous analyses of the binary decision data with the eye-tracking data and investigate the role of gaze direction during task execution. We computed the shared information between gaze features and the participants' decisions, and between gaze features and the correct answers. We found that eye movements reflect the decision of participants during the 2AFC task, but not the correct answer. This result is important for future experiments, which should take advantage of eye tracking to complement binary decision data.
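The core quantity described above, the shared information between a gaze feature and a binary response, can be sketched as a plug-in mutual information estimate over discretized data. The sketch below is illustrative only: the feature names, the toy data, and the discretization are assumptions, not the paper's actual pipeline.

```python
# Hypothetical sketch: estimate mutual information (in bits) between a
# discretized gaze feature (e.g., which video was fixated longer) and a
# binary 2AFC outcome. Data and variable names are illustrative.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Plug-in MI estimate (bits) for two equal-length discrete sequences."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))  # joint counts
    px = Counter(xs)            # marginal counts of x
    py = Counter(ys)            # marginal counts of y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts folded in
        mi += (c / n) * log2(c * n / (px[x] * py[y]))
    return mi

# Toy data mirroring the paper's finding: the gaze feature carries
# information about the decision, but not about the correct answer.
gaze     = [0, 0, 1, 1, 0, 1, 0, 1]
decision = [0, 0, 1, 1, 0, 1, 0, 1]  # identical to gaze -> 1 bit shared
truth    = [0, 0, 0, 0, 1, 1, 1, 1]  # balanced, independent of gaze -> 0 bits

print(mutual_information(gaze, decision))  # 1.0
print(mutual_information(gaze, truth))     # 0.0
```

In practice one would correct such plug-in estimates for small-sample bias (e.g., via shuffling controls), since finite data inflate raw mutual information.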
Similar Papers
Quantifying the Impact of Motion on 2D Gaze Estimation in Real-World Mobile Interactions
Human-Computer Interaction
Makes phone eye-tracking work better when you move.
Eye Movements as Indicators of Deception: A Machine Learning Approach
Human-Computer Interaction
Helps computers spot lies by watching eyes.
Task Decoding based on Eye Movements using Synthetic Data Augmentation
Artificial Intelligence
Helps computers guess what you're looking at.