Coverage-Guided Testing for Deep Learning Models: A Comprehensive Survey
By: Hongjing Guo, Chuanqi Tao, Zhiqiu Huang, and more
Potential Business Impact:
Systematically tests AI models to catch mistakes before they cause harm in safety-critical systems.
As Deep Learning (DL) models are increasingly applied in safety-critical domains, ensuring their quality has emerged as a pressing challenge in modern software engineering. Among emerging validation paradigms, coverage-guided testing (CGT) has gained prominence as a systematic framework for identifying erroneous or unexpected model behaviors. Despite growing research attention, existing CGT studies remain methodologically fragmented, limiting the understanding of current advances and emerging trends. This work addresses that gap through a comprehensive review of state-of-the-art CGT methods for DL models, including test coverage analysis, coverage-guided test input generation, and coverage-guided test input optimization. This work provides detailed taxonomies to organize these methods based on methodological characteristics and application scenarios. We also investigate evaluation practices adopted in existing studies, including the use of benchmark datasets, model architectures, and evaluation aspects. Finally, open challenges and future directions are highlighted in terms of the correlation between structural coverage and testing objectives, method generalizability across tasks and models, practical deployment concerns, and the need for standardized evaluation and tool support. This work aims to provide a roadmap for future academic research and engineering practice in DL model quality assurance.
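To make the core idea concrete, below is a minimal, self-contained sketch of the two ingredients the survey organizes: a structural coverage metric (here, neuron coverage in the spirit of DeepXplore) and a coverage-guided input generation loop that keeps a mutated input only if it raises coverage. This is an illustration, not the paper's method: the tiny NumPy MLP, the `forward` and `neuron_coverage` helpers, and the activation threshold are all hypothetical stand-ins for a real DL model and metric.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-layer MLP with random weights, standing in for a real DL model.
W1, b1 = rng.normal(size=(16, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 4)), rng.normal(size=4)

def forward(x):
    """Return per-layer activations for one input vector."""
    h = np.maximum(0.0, x @ W1 + b1)  # ReLU hidden layer
    out = h @ W2 + b2                 # linear output layer
    return [h, out]

def neuron_coverage(inputs, threshold=0.0):
    """Fraction of neurons activated above `threshold` by at least one input."""
    activated = None
    for x in inputs:
        acts = np.concatenate([a.ravel() for a in forward(x)])
        fired = acts > threshold
        activated = fired if activated is None else (activated | fired)
    return activated.mean()

# Coverage-guided test input generation: mutate a seed and keep the
# candidate only if it increases overall neuron coverage.
seeds = [rng.normal(size=16) for _ in range(4)]
corpus, best = list(seeds), neuron_coverage(seeds)
for _ in range(200):
    candidate = corpus[rng.integers(len(corpus))] + rng.normal(scale=0.5, size=16)
    cov = neuron_coverage(corpus + [candidate])
    if cov > best:  # coverage increased -> retain the new test input
        corpus.append(candidate)
        best = cov

print(f"neuron coverage: {best:.2%} with {len(corpus)} retained inputs")
```

The keep-if-coverage-grows loop mirrors greybox fuzzing; the CGT methods surveyed differ mainly in which coverage metric plays the role of `neuron_coverage` and how candidates are mutated or optimized.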
Similar Papers
Towards Understanding Deep Learning Model in Image Recognition via Coverage Test
CV and Pattern Recognition
Probes image-recognition models with coverage tests to reveal hidden problems.
Evaluating the Effectiveness of Coverage-Guided Fuzzing for Testing Deep Learning Library APIs
Software Engineering
Fuzzes deep learning library APIs to find bugs faster.
TestDG: Test-time Domain Generalization for Continual Test-time Adaptation
CV and Pattern Recognition
Helps AI adapt to new conditions without forgetting what it already learned.