Motion-Based User Identification across XR and Metaverse Applications by Deep Classification and Similarity Learning
By: Lukas Schach, Christian Rack, Ryan P. McMahan, and more
Potential Business Impact:
Identifies people by how they move in VR.
This paper examines the generalization capacity of two state-of-the-art classification and similarity learning models in reliably identifying users based on their motions across various Extended Reality (XR) applications. We developed a novel dataset containing a wide range of motion data from 49 users in five different XR applications: four XR games with distinct tasks and action patterns, and an additional social XR application with no predefined task sets. The dataset is used to evaluate the performance and, in particular, the generalization capacity of the two models across applications. Our results indicate that while the models can accurately identify individuals within the same application, their ability to identify users across different XR applications remains limited. Overall, our results provide insight into the current models' generalization capabilities and their suitability as biometric methods for user verification and identification. The results also serve as a much-needed risk assessment of hazardous and unwanted user identification in XR and Metaverse applications. Our cross-application XR motion dataset and code are made publicly available to encourage similar research on the generalization of motion-based user identification in typical Metaverse application use cases.
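As a rough illustration of the similarity-learning approach the abstract describes, the sketch below shows one common pattern: enroll each user by averaging embedding vectors of their motion sequences into a reference centroid, then identify an unseen motion sample by cosine similarity to those centroids. The embedding model, dimensions, and user names here are stand-ins, not the paper's actual architecture or data.

```python
import numpy as np

def enroll(embeddings_by_user):
    """Average each user's motion embeddings into a reference centroid."""
    return {user: np.mean(embs, axis=0) for user, embs in embeddings_by_user.items()}

def identify(query, centroids):
    """Return the enrolled user whose centroid is most cosine-similar to the query."""
    def cos(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    return max(centroids, key=lambda user: cos(query, centroids[user]))

rng = np.random.default_rng(0)
# Toy 8-D "motion embeddings": in practice these would come from a trained
# deep similarity-learning model, not random vectors around fixed means.
users = {
    "alice": rng.normal(0.0, 0.1, (5, 8)) + np.arange(8),
    "bob":   rng.normal(0.0, 0.1, (5, 8)) - np.arange(8),
}
centroids = enroll(users)
query = np.arange(8) + rng.normal(0.0, 0.1, 8)  # unseen sample from "alice"
print(identify(query, centroids))  # prints "alice"
```

The paper's cross-application finding maps onto this picture directly: centroids enrolled from one application's motions may sit far from the same user's embeddings produced in a different application, which is what limits cross-application identification.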
Similar Papers
Unobtrusive In-Situ Measurement of Behavior Change by Deep Metric Similarity Learning of Motion Patterns
Human-Computer Interaction
Tracks how virtual bodies change how you act.
Behavioral Biometrics for Automatic Detection of User Familiarity in VR
Human-Computer Interaction
VR knows if you're new or experienced.