Tracing Distribution Shifts with Causal System Maps
By: Joran Leest, Ilias Gerostathopoulos, Patricia Lago, and others
Potential Business Impact:
Identifies why machine learning systems make mistakes.
Monitoring machine learning (ML) systems is hard, with standard practice focusing on detecting distribution shifts rather than their causes. Root-cause analysis often relies on manual tracing to determine whether a shift is caused by software faults, data-quality issues, or natural change. We propose ML System Maps -- causal maps that, through layered views, make explicit the propagation paths between the environment and the ML system's internals, enabling systematic attribution of distribution shifts. We outline the approach and a research agenda for its development and evaluation.
Similar Papers
From Tea Leaves to System Maps: A Survey and Framework on Context-aware Machine Learning Monitoring
Software Engineering
Helps AI understand why it's making mistakes.
Online Identification of IT Systems through Active Causal Learning
Machine Learning (CS)
Helps computers learn how systems work automatically.