GPU-accelerated surface-based gaze mapping for XR experiences

Published: January 12, 2026 | arXiv ID: 2601.07571v1

By: Charles Javerliat, Guillaume Lavoué

Extended reality is a fast-growing domain for which there is an increasing need to analyze and understand user behavior. In particular, understanding human visual attention during immersive experiences is crucial for many applications. The visualization and analysis of visual attention are commonly done by building fixation density maps from eye-tracking data. Such visual attention mapping is well mastered for 3 degrees of freedom (3DoF) experiences (i.e., involving 360° images or videos) but much less so for 6DoF data, where the user can move freely in 3D space. In that case, the visual attention information has to be mapped onto the 3D objects themselves. Some solutions exist for constructing such surface-based 6DoF attention maps; however, they have several drawbacks: long processing times, strong dependence on mesh resolution and/or texture mapping, and/or impractical data representations for further processing. In this context, we propose a novel GPU-based algorithm that resolves the issues above, generating attention maps at interactive rates and rendering them in real time. Experiments on a challenging scene demonstrate the accuracy and robustness of our approach. To stimulate research in this area, the source code is publicly released and integrated into PLUME for ease of use in XR experiments.
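The paper itself describes a GPU-based pipeline; as a rough CPU-side illustration of the underlying idea (mapping eye-tracking gaze rays onto mesh surfaces and accumulating a fixation density), here is a minimal, hypothetical sketch. It is not the authors' algorithm: it uses a naive Möller–Trumbore ray–triangle intersection loop and splats a Gaussian kernel onto nearby vertices, whereas the paper's contribution is precisely to avoid this kind of slow, resolution-dependent per-vertex accumulation. All function names and parameters (`accumulate_attention`, `sigma`, etc.) are invented for illustration.

```python
import numpy as np

def ray_triangle_intersect(orig, d, v0, v1, v2, eps=1e-8):
    """Möller-Trumbore ray-triangle intersection; returns hit distance t or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(d, e2)
    det = e1.dot(p)
    if abs(det) < eps:          # ray parallel to triangle plane
        return None
    inv = 1.0 / det
    t_vec = orig - v0
    u = t_vec.dot(p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(t_vec, e1)
    v = d.dot(q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = e2.dot(q) * inv
    return t if t > eps else None

def accumulate_attention(vertices, triangles, gaze_origins, gaze_dirs, sigma=0.5):
    """Hypothetical per-vertex fixation density from a set of gaze rays.

    For each gaze sample, find the closest surface hit and splat a
    Gaussian kernel of width `sigma` onto the surrounding vertices.
    """
    density = np.zeros(len(vertices))
    for o, d in zip(gaze_origins, gaze_dirs):
        d = d / np.linalg.norm(d)
        best = None
        for tri in triangles:
            t = ray_triangle_intersect(o, d, *vertices[tri])
            if t is not None and (best is None or t < best):
                best = t
        if best is None:
            continue                      # gaze ray missed the scene
        hit = o + best * d
        dist2 = np.sum((vertices - hit) ** 2, axis=1)
        density += np.exp(-dist2 / (2.0 * sigma ** 2))
    return density

# Toy example: one triangle, one gaze ray aimed near its first vertex.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
tris = np.array([[0, 1, 2]])
origins = np.array([[0.25, 0.25, 1.0]])
dirs = np.array([[0.0, 0.0, -1.0]])
dens = accumulate_attention(verts, tris, origins, dirs)
```

A per-vertex representation like this is exactly what suffers from the mesh-resolution dependence the abstract mentions: a coarse mesh cannot resolve fine attention patterns, and a dense one makes the accumulation loop expensive, which motivates a GPU-based, surface-parameterized approach instead.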

Category
Computer Science:
Human-Computer Interaction