Reading Decisions from Gaze Direction during Graphics Turing Test of Gait Animation

Published: March 24, 2025 | arXiv ID: 2503.18619v1

By: Benjamin Knopp, Daniel Auras, Alexander C. Schütz, and more

Potential Business Impact:

Eyes show what you think about moving bodies.

Business Areas:
Motion Capture, Media and Entertainment, Video

We investigated gaze direction during movement observation. The eye movement data were collected in an experiment in which different models of movement production (based on movement primitives, MPs) were compared in a two-alternative forced-choice (2AFC) task. Participants observed a side-by-side presentation of two naturalistic 3D-rendered human movement videos: one video was based on a motion-captured gait sequence, and the other was generated by recombining machine-learned MPs to approximate the same movement. The participants' task was to discriminate between these movements while their eye movements were recorded. Here, we complement previous analyses of the binary decision data by investigating the role of gaze direction during task execution. We computed the information shared between gaze features and participants' decisions, and between gaze features and the correct answers. We found that eye movements reflect the participants' decisions during the 2AFC task, but not the correct answer. This result is important for future experiments, which should take advantage of eye tracking to complement binary decision data.
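As a rough illustration of the shared-information analysis described above, the sketch below estimates mutual information between a discretized gaze feature and the participants' binary decisions, and separately between the same feature and the correct answers. The specific gaze feature (fraction of viewing time on one video), the binning, the estimator, and the synthetic data are assumptions for this example only; the paper's actual features and methods may differ.

```python
# Minimal sketch (not the authors' code) of estimating shared information
# between a gaze feature and binary 2AFC outcomes.
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)

# Hypothetical per-trial data: one continuous gaze feature and two binary labels.
n_trials = 300
gaze_left_fraction = rng.uniform(0.0, 1.0, n_trials)  # e.g., fraction of viewing time on the left video
decision = (gaze_left_fraction + rng.normal(0, 0.3, n_trials) > 0.5).astype(int)  # participant's choice
correct = rng.integers(0, 2, n_trials)  # which side actually showed the motion-captured movement

# Discretize the continuous gaze feature so a plug-in mutual information estimate applies.
gaze_bins = np.digitize(gaze_left_fraction, bins=np.linspace(0, 1, 6)[1:-1])

# Shared information (in nats) between gaze and decision vs. gaze and correct answer.
mi_gaze_decision = mutual_info_score(gaze_bins, decision)
mi_gaze_correct = mutual_info_score(gaze_bins, correct)
print(f"I(gaze; decision) = {mi_gaze_decision:.3f} nats")
print(f"I(gaze; correct)  = {mi_gaze_correct:.3f} nats")
```

In this toy setup, the decision is constructed to depend on gaze while the correct answer is independent of it, so the first estimate comes out clearly larger than the second, mirroring the paper's qualitative finding that gaze carries information about the decision but not about the correct answer.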

Country of Origin
🇩🇪 Germany

Page Count
14 pages

Category
Computer Science:
Human-Computer Interaction