Multi-Tool Analysis of User Interface & Accessibility in Deployed Web-Based Chatbots

Published: June 5, 2025 | arXiv ID: 2506.04659v1

By: Mukesh Rajmohan, Smit Desai, Sanchari Das

Potential Business Impact:

Identifies deployed chatbots with accessibility problems that make them hard for some people, including users of assistive technologies, to use.

Business Areas:
Usability Testing Data and Analytics, Design

In this work, we present a multi-tool evaluation of 106 deployed web-based chatbots across domains such as healthcare, education, and customer service, comprising both standalone applications and embedded widgets, using automated tools (Google Lighthouse, PageSpeed Insights, SiteImprove Accessibility Checker) and manual audits (Microsoft Accessibility Insights). Our analysis reveals that over 80% of the chatbots exhibit at least one critical accessibility issue, and 45% suffer from missing semantic structures or misused ARIA roles. Furthermore, we found that accessibility scores correlate strongly across tools (e.g., Lighthouse vs. PageSpeed Insights, r = 0.861), whereas performance scores do not (r = 0.436), underscoring the value of a multi-tool approach. We offer replicable evaluation insights and actionable recommendations to support the development of user-friendly conversational interfaces.
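The cross-tool correlation analysis summarized above can be reproduced on any per-chatbot score table. The sketch below is a minimal illustration, not the authors' actual pipeline: it assumes a hypothetical CSV file (chatbot_scores.csv) with one row per chatbot and illustrative column names for each tool's accessibility and performance scores, and computes Pearson correlations with SciPy, mirroring the Lighthouse vs. PageSpeed Insights comparison reported in the abstract.

```python
# Minimal sketch of the cross-tool correlation analysis described above.
# Assumes a hypothetical CSV "chatbot_scores.csv" with one row per chatbot and
# columns: lighthouse_accessibility, psi_accessibility,
#          lighthouse_performance, psi_performance
# (file name and column names are illustrative, not from the paper's artifacts).

import pandas as pd
from scipy.stats import pearsonr

scores = pd.read_csv("chatbot_scores.csv")

# Accessibility scores: the paper reports strong cross-tool agreement (r = 0.861).
r_acc, p_acc = pearsonr(scores["lighthouse_accessibility"],
                        scores["psi_accessibility"])

# Performance scores: agreement is weaker (r = 0.436), motivating a multi-tool audit.
r_perf, p_perf = pearsonr(scores["lighthouse_performance"],
                          scores["psi_performance"])

print(f"Accessibility: r = {r_acc:.3f} (p = {p_acc:.3g})")
print(f"Performance:   r = {r_perf:.3f} (p = {p_perf:.3g})")
```

Collecting the scores themselves can be automated; for example, the Lighthouse CLI can emit machine-readable reports (e.g., lighthouse <url> --only-categories=accessibility,performance --output=json), which could then be parsed into a table like the one assumed above.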

Country of Origin
🇺🇸 United States

Page Count
9 pages

Category
Computer Science:
Human-Computer Interaction