Explore, Listen, Inspect: Supporting Multimodal Interaction with 3D Surface and Point Data Visualizations
By: Sanchita S. Kamath, Aziz N. Zeidieh, JooYoung Seo
Potential Business Impact:
Lets blind people explore 3D shapes with sound.
Blind and low-vision (BLV) users remain largely excluded from three-dimensional (3D) surface and point data visualizations because these rely on visual interaction. Existing approaches inadequately support non-visual access, especially in browser-based environments. This study introduces DIXTRAL, a hosted, web-native system co-designed with BLV researchers to address these gaps through multimodal interaction. The study was conducted with two blind researchers and one sighted researcher over sustained design sessions. Data were gathered through iterative testing of the prototype, collecting feedback on spatial navigation, sonification, and usability. Co-design observations demonstrate that synchronized auditory, visual, and textual feedback, combined with keyboard and gamepad navigation, enhances both structure discovery and orientation. DIXTRAL aims to improve access to 3D continuous scalar fields for BLV users and to inform best practices for creating inclusive 3D visualizations.
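The abstract does not specify how DIXTRAL maps data to sound, so the following is only a rough illustration of the general idea of sonifying a scalar field: sampling a surface along a navigation path and mapping each sampled value to an audible pitch. All function names, the test surface, and the frequency range here are hypothetical, not taken from the paper.

```python
import math

def value_to_frequency(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Linearly map a scalar value in [vmin, vmax] to a pitch in Hz.

    Hypothetical mapping for illustration; a real system might use a
    logarithmic (musical) scale instead.
    """
    t = (value - vmin) / (vmax - vmin)
    t = min(max(t, 0.0), 1.0)  # clamp out-of-range samples
    return f_low + t * (f_high - f_low)

def sine_tone(freq, duration=0.2, sample_rate=44100):
    """Generate raw mono samples for a sine tone at the given frequency."""
    n = int(duration * sample_rate)
    return [math.sin(2 * math.pi * freq * i / sample_rate) for i in range(n)]

# Hypothetical test surface z = f(x, y); in practice the values would come
# from the user's keyboard- or gamepad-driven position on the actual data.
f = lambda x, y: math.sin(x) * math.cos(y)
path = [(step * 0.5, 0.0) for step in range(10)]  # a straight traversal
frequencies = [value_to_frequency(f(x, y), -1.0, 1.0) for x, y in path]
```

In a browser-based system like the one described, the pitch stream would typically be played through the Web Audio API rather than raw sample buffers; the point of the sketch is only the value-to-pitch mapping that lets a listener track how the surface rises and falls along their path.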
Similar Papers
RAVEN: Realtime Accessibility in Virtual ENvironments for Blind and Low-Vision People
Human-Computer Interaction
Helps blind people explore virtual worlds by talking.
Sensing the Shape of Data: Non-Visual Exploration of Statistical Concepts in Histograms with Blind and Low-Vision Learners
Human-Computer Interaction
Helps blind people learn math using touch and sound.
Co-Designing Multimodal Systems for Accessible Remote Dance Instruction
Human-Computer Interaction
Helps blind dancers learn moves using sound and touch.