Evaluation of LLM-based Explanations for a Learning Analytics Dashboard
By: Alina Deriyeva, Benjamin Paassen
Potential Business Impact:
Uses LLM-generated explanations to make learning analytics dashboards easier to understand.
Learning Analytics Dashboards can be a powerful tool to support self-regulated learning in Digital Learning Environments and to promote the development of metacognitive skills such as reflection. However, their effectiveness depends on how interpretable the data they present is. To support interpretation, we employ a large language model to generate verbal explanations of the data shown in the dashboard and evaluate them against a standalone dashboard and explanations provided by human teachers in an expert study with university-level educators (N=12). We find that the LLM-based explanations of the skill state presented in the dashboard, as well as general recommendations on how to proceed with learning in the course, were favored significantly more than the other conditions. This indicates that using LLMs for interpretation can enhance the learning experience for learners while maintaining pedagogical standards endorsed by teachers.
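To illustrate how such LLM-based explanations of a dashboard's skill state might be produced, here is a minimal sketch assuming the OpenAI Python client; the skill names, mastery values, prompt wording, and model name are hypothetical examples and are not taken from the paper.

```python
# Illustrative sketch only: the paper does not specify its prompt, model, or API.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical dashboard data: per-skill mastery estimates for one learner.
skill_state = {
    "loops": 0.82,
    "recursion": 0.41,
    "list comprehensions": 0.67,
}

def explain_skill_state(skills: dict[str, float]) -> str:
    """Ask an LLM for a short verbal explanation of the dashboard's skill estimates."""
    skill_lines = "\n".join(
        f"- {name}: {value:.0%} estimated mastery" for name, value in skills.items()
    )
    prompt = (
        "You are a supportive tutor. A learning analytics dashboard reports the\n"
        "following skill estimates for a student in a programming course:\n"
        f"{skill_lines}\n\n"
        "In 3-4 sentences, explain what these numbers mean and recommend what the\n"
        "student should work on next."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

print(explain_skill_state(skill_state))
```

The generated text would then be shown alongside the dashboard, which is the condition the study compares against a standalone dashboard and teacher-written explanations.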
Similar Papers
When learning analytics dashboard is explainable: An exploratory study on the effect of GenAI-supported learning analytics dashboard
Human-Computer Interaction
Helps students understand writing better with AI.
Interpretability Framework for LLMs in Undergraduate Calculus
Computers and Society
Checks math answers by understanding how they're solved.
Automated Visualization Makeovers with LLMs
Human-Computer Interaction
Helps make charts easier to understand.