Formally Explaining Decision Tree Models with Answer Set Programming
By: Akihiro Takemura, Masayuki Otani, Katsumi Inoue
Potential Business Impact:
Explains complex model predictions so they can be justified in safety-critical settings.
Decision tree models, including random forests and gradient-boosted decision trees, are widely used in machine learning due to their high predictive performance. However, their complex structures often make them difficult to interpret, especially in safety-critical applications where model decisions require formal justification. Recent work has demonstrated that logical and abductive explanations can be derived through automated reasoning techniques. In this paper, we propose a method for generating various types of explanations, namely, sufficient, contrastive, majority, and tree-specific explanations, using Answer Set Programming (ASP). Compared to SAT-based approaches, our ASP-based method offers greater flexibility in encoding user preferences and supports enumeration of all possible explanations. We empirically evaluate the approach on a diverse set of datasets, demonstrating its effectiveness and discussing its limitations relative to existing methods.
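To make the enumeration idea concrete, here is a minimal sketch using the clingo Python API. The encoding is illustrative, not the paper's actual one: the toy tree, the predicates keep/1 and entailed, and the use of cardinality minimization as a stand-in for subset-minimality are all assumptions introduced for this example.

```python
# Minimal, hypothetical sketch: enumerating "sufficient explanations"
# for a toy decision tree with ASP via the clingo Python API.
# The encoding (keep/1, entailed) and the tree are illustrative
# assumptions, NOT the encoding used in the paper.
import clingo

PROGRAM = """
% The instance to explain is (x1 = 1, x2 = 0); each literal may be
% kept in the explanation or dropped (choice rules).
{ keep(x1_eq_1) }.
{ keep(x2_eq_0) }.

% In this toy tree, x1 = 1 alone already forces the predicted leaf,
% so keeping that literal entails the prediction.
entailed :- keep(x1_eq_1).

% A sufficient explanation must entail the prediction.
:- not entailed.

% Prefer small explanations (cardinality as a proxy for
% subset-minimality in this sketch).
#minimize { 1,L : keep(L) }.
#show keep/1.
"""

def main():
    # "0" = enumerate all models; optN restricts enumeration to
    # the optimal (here: smallest) ones.
    ctl = clingo.Control(["0", "--opt-mode=optN"])
    ctl.add("base", [], PROGRAM)
    ctl.ground([("base", [])])
    with ctl.solve(yield_=True) as handle:
        for model in handle:
            if model.optimality_proven:
                print("explanation:",
                      [str(atom) for atom in model.symbols(shown=True)])

if __name__ == "__main__":
    main()
```

Expected output for this toy program is a single line, explanation: [keep(x1_eq_1)]. In a real encoding, answer-set enumeration is what lets the method list all explanations rather than a single one, and ASP optimization statements can express user preferences over them, which is the flexibility the abstract contrasts with SAT-based approaches.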
Similar Papers
xDNN(ASP): Explanation Generation System for Deep Neural Networks powered by Answer Set Programming
Artificial Intelligence
Explains how deep neural networks make decisions.
Relating Answer Set Programming and Many-sorted Logics for Formal Verification
Logic in Computer Science
Makes answer set programs easier to formally verify.
Most General Explanations of Tree Ensembles (Extended Version)
Artificial Intelligence
Explains why tree ensemble models make the choices they do.