LLM-Driven Accessible Interface: A Model-Based Approach
By: Blessing Jerry, Lourdes Moreno, Virginia Francisco, et al.
The integration of Large Language Models (LLMs) into interactive systems opens new opportunities for adaptive user experiences, yet it also raises challenges regarding accessibility, explainability, and normative compliance. This paper presents an implemented model-driven architecture for generating personalised, multimodal, and accessibility-aligned user interfaces. The approach combines structured user profiles, declarative adaptation rules, and validated prompt templates to refine baseline accessible UI templates that conform to WCAG 2.2 and EN 301 549, tailored to cognitive and sensory support needs. LLMs dynamically transform language complexity, modality, and visual structure, producing outputs such as Plain-Language text, pictograms, and high-contrast layouts aligned with ISO 24495-1 and W3C COGA guidance. A healthcare use case demonstrates how the system generates accessible post-consultation medication instructions tailored to a user profile comprising cognitive disability and hearing impairment. SysML v2 models provide explicit traceability between user needs, adaptation rules, and normative requirements, ensuring explainable and auditable transformations. Grounded in Human-Centered AI (HCAI), the framework incorporates co-design processes and structured feedback mechanisms to guide iterative refinement and support trustworthy generative behaviour.
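To make the described pipeline more concrete, the sketch below illustrates one plausible way the pieces could fit together: a structured user profile, declarative adaptation rules traceable to normative requirements, and a prompt template filled from the fired rules before being sent to an LLM. All names (UserProfile, ADAPTATION_RULES, build_prompt), the rule contents, and the healthcare baseline text are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the profile -> rules -> prompt pipeline described
# in the abstract; names and rule contents are assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    cognitive_support: bool = False       # e.g. benefits from Plain-Language text
    hearing_impairment: bool = False      # e.g. prefers visual over audio cues
    preferred_modalities: list = field(default_factory=list)


# Declarative adaptation rules: a profile condition maps to adaptation
# directives, each annotated with the normative source it traces back to
# (WCAG 2.2, EN 301 549, ISO 24495-1, W3C COGA).
ADAPTATION_RULES = [
    {
        "when": lambda p: p.cognitive_support,
        "directives": [
            "rewrite text in Plain Language (ISO 24495-1)",
            "add supporting pictograms (W3C COGA)",
        ],
    },
    {
        "when": lambda p: p.hearing_impairment,
        "directives": [
            "replace audio cues with text and visual alerts (EN 301 549)",
        ],
    },
]

# A validated prompt template that constrains the LLM to refine the baseline
# accessible content rather than generate it from scratch.
PROMPT_TEMPLATE = (
    "Adapt the following accessible UI content for a user who needs: {directives}. "
    "Preserve the WCAG 2.2 conformance of the baseline template.\n\n"
    "Baseline content:\n{baseline}"
)


def build_prompt(profile: UserProfile, baseline: str) -> str:
    """Collect the directives fired by the profile and fill the prompt template."""
    directives = [
        d
        for rule in ADAPTATION_RULES
        if rule["when"](profile)
        for d in rule["directives"]
    ]
    return PROMPT_TEMPLATE.format(directives="; ".join(directives), baseline=baseline)


if __name__ == "__main__":
    # Healthcare use case from the abstract: cognitive disability + hearing impairment.
    profile = UserProfile(
        cognitive_support=True,
        hearing_impairment=True,
        preferred_modalities=["text", "pictograms"],
    )
    baseline = "Take one tablet of amoxicillin every 8 hours for 7 days."
    print(build_prompt(profile, baseline))
    # The resulting prompt would be sent to an LLM, and its output checked
    # against the adaptation rules before being rendered in the interface.
```

Keeping the rules declarative and separate from the prompt template is what would allow each generated adaptation to be traced back to a user need and a normative requirement, in line with the traceability role the abstract assigns to the SysML v2 models.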
Similar Papers
Towards LLM-Based Usability Analysis for Recommender User Interfaces
Human-Computer Interaction
Helps make apps easier to use by checking their design.
Designing Effective LLM-Assisted Interfaces for Curriculum Development
Computers and Society
Makes AI easier for teachers to use.
Generative Interfaces for Language Models
Computation and Language
Computers build helpful screens to answer questions.