Designing for Novice Debuggers: A Pilot Study on an AI-Assisted Debugging Tool
By: Oka Kurniawan, Erick Chandra, Christopher M. Poskitt, and others
Potential Business Impact:
Helps students fix code errors by guiding them to reason through bugs themselves rather than relying on AI-generated fixes.
Debugging is a fundamental skill that novice programmers must develop. Numerous tools have been created to assist novice programmers in this process. Recently, large language models (LLMs) have been integrated with automated program repair techniques to generate fixes for students' buggy code. However, many of these tools foster an over-reliance on AI and do not actively engage students in the debugging process. In this work, we aim to design an intuitive debugging assistant, CodeHinter, that combines traditional debugging tools with LLM-based techniques to help novice debuggers fix semantic errors while promoting active engagement in the debugging process. We present findings from our second design iteration, which we tested with a group of undergraduate students. Our results indicate that the students found the tool highly effective in resolving semantic errors and significantly easier to use than the first version. Consistent with our previous study, error localization was the most valuable feature. Finally, we conclude that any AI-assisted debugging tool should be personalized based on user profiles to optimize its interactions with students.
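The abstract does not describe CodeHinter's implementation, but its core idea (pair a traditional error-localization step with an LLM that gives hints rather than full fixes) can be sketched. The following Python sketch is an illustrative assumption, not the paper's actual tool: the names `localize_error` and `build_hint_prompt`, the prompt wording, and the example buggy program are all hypothetical, and the LLM call itself is left as a stub. It uses Python's built-in traceback to localize a runtime failure; a tool targeting semantic errors more broadly would also compare outputs against failing test cases.

```python
# Hypothetical sketch of a hint-first debugging assistant in the spirit of
# CodeHinter: localize the error with a traditional technique (here, the
# Python traceback), then ask an LLM for a *hint* instead of a full fix.
# All names and prompts are illustrative assumptions.
import traceback


def localize_error(source: str):
    """Run the student's code; return (line number, error message) on failure."""
    try:
        exec(compile(source, "<student_code>", "exec"), {})
        return None  # No runtime error detected.
    except Exception as exc:
        tb = traceback.extract_tb(exc.__traceback__)
        # The last frame inside the student's code is the most likely culprit.
        student_frames = [f for f in tb if f.filename == "<student_code>"]
        line_no = student_frames[-1].lineno if student_frames else -1
        return line_no, f"{type(exc).__name__}: {exc}"


def build_hint_prompt(source: str, line_no: int, message: str) -> str:
    """Ask the LLM for a guiding question, not a patched program, so the
    student stays actively engaged in the debugging process."""
    return (
        "A student's Python program raises an error.\n"
        f"Error on line {line_no}: {message}\n"
        f"Program:\n{source}\n"
        "Give ONE short Socratic hint that points the student toward the "
        "bug without revealing the fix."
    )


if __name__ == "__main__":
    buggy = "def mean(xs):\n    return sum(xs) / len(xs)\n\nprint(mean([]))\n"
    located = localize_error(buggy)
    if located:
        line_no, message = located
        print(f"Error localized to line {line_no}: {message}")
        # In a real tool, this prompt would be sent to an LLM.
        print(build_hint_prompt(buggy, line_no, message))
```

Returning a Socratic hint rather than a patch mirrors the paper's stated goal of promoting active engagement, and surfacing the localized line first reflects the finding that error localization was the tool's most valuable feature.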
Similar Papers
Interactive Debugging and Steering of Multi-Agent AI Systems
Multiagent Systems
Helps developers inspect and correct the behavior of cooperating AI agents.
Who's the Leader? Analyzing Novice Workflows in LLM-Assisted Debugging of Machine Learning Code
Human-Computer Interaction
Helps beginners learn to debug code without over-relying on AI assistance.