How Far I'll Go: Imagining Futures of Conversational AI with People with Visual Impairments Through Design Fiction
By: Jeanne Choi, Dasom Choi, Sejun Jeong, and more
Potential Business Impact:
Informs the design of future conversational AI assistants for people with visual impairments.
People with visual impairments (PVI) use a variety of assistive technologies to navigate their daily lives, and conversational AI (CAI) tools are a growing part of this toolset. Much existing HCI research has focused on the technical capabilities of current CAI tools; in this paper, we instead examine how PVI themselves envision potential futures of living with CAI. We conducted a study with 14 participants with visual impairments using an audio-based Design Fiction probe featuring speculative dialogues between participants and a future CAI. Participants imagined using CAI to expand their boundaries by exploring new opportunities or places, but they also voiced concerns: balancing reliance on CAI with maintaining autonomy, the need to account for diverse levels of vision loss, and enhancing the visibility of PVI for greater inclusion. We discuss implications for designing CAI that supports genuine agency for PVI, grounded in the future lives they envisioned.
Similar Papers
Probing the Gaps in ChatGPT Live Video Chat for Real-World Assistance for People who are Blind or Visually Impaired
Human-Computer Interaction
AI helps blind people see with live video.
Convivial Conversational Agents -- shifting toward relationships
Human-Computer Interaction
Helps AI talk to people with memory loss.
AI-Powered Assistive Technologies for Visual Impairment
Human-Computer Interaction
Helps blind people understand the world around them.