Student Engagement with GenAI's Tutoring Feedback: A Mixed Methods Study
By: Sven Jacobs, Jan Haas, Natalie Kiesler
Potential Business Impact:
Helps computers give better coding help to students.
How students utilize immediate tutoring feedback in programming education depends on various factors. These include the quality of the feedback, but also students' engagement, i.e., how they perceive, interpret, and use the feedback. However, there is limited research on how students engage with different types of tutoring feedback. For this reason, we developed a learning environment that provides students with Python programming tasks and several types of immediate, AI-generated tutoring feedback, displayed within four components. Using a mixed-methods approach (think-aloud study and eye-tracking), we conducted a study with 20 undergraduate students enrolled in an introductory programming course. Our research aims to (1) identify what students think when they engage with the tutoring feedback components, and (2) explore the relations between the tutoring feedback components, students' visual attention, verbalized thoughts, and their immediate actions as part of the problem-solving process. The analysis of students' thoughts while engaging with 380 feedback components revealed four main themes: expressions of understanding, expressions of disagreement, the need for additional information, and explicit judgments of the feedback. Exploring the relations between feedback, students' attention, thoughts, and actions revealed clear patterns. While expressions of understanding were associated with improvements, expressions of disagreement or the need for additional information prompted students to request another feedback component rather than act on the current information. These insights into students' engagement and decision-making processes deepen our understanding of how students engage with tutoring feedback. This work therefore has implications for tool developers and educators who facilitate feedback.
Similar Papers
Directive, Metacognitive or a Blend of Both? A Comparison of AI-Generated Feedback Types on Student Engagement, Confidence, and Outcomes
Human-Computer Interaction
AI helps students learn better by guiding them.
Scaffolding Metacognition in Programming Education: Understanding Student-AI Interactions and Design Implications
Human-Computer Interaction
Helps AI teach students to code better.
Humanizing Automated Programming Feedback: Fine-Tuning Generative Models with Student-Written Feedback
Computers and Society
Teaches computers to give better coding help.