Score: 1

Formally Explaining Decision Tree Models with Answer Set Programming

Published: January 7, 2026 | arXiv ID: 2601.03845v1

By: Akihiro Takemura, Masayuki Otani, Katsumi Inoue

Potential Business Impact:

Provides formal, verifiable explanations of decision tree predictions, helping justify model decisions in safety-critical applications.

Business Areas:
Natural Language Processing, Artificial Intelligence, Data and Analytics, Software

Decision tree models, including random forests and gradient-boosted decision trees, are widely used in machine learning due to their high predictive performance. However, their complex structures often make them difficult to interpret, especially in safety-critical applications where model decisions require formal justification. Recent work has demonstrated that logical and abductive explanations can be derived through automated reasoning techniques. In this paper, we propose a method for generating various types of explanations, namely, sufficient, contrastive, majority, and tree-specific explanations, using Answer Set Programming (ASP). Compared to SAT-based approaches, our ASP-based method offers greater flexibility in encoding user preferences and supports enumeration of all possible explanations. We empirically evaluate the approach on a diverse set of datasets and demonstrate its effectiveness and limitations compared to existing methods.
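To make the idea concrete, below is a minimal illustrative sketch of the general technique the abstract describes: encoding a (toy) decision tree path and a guess over observed feature values in ASP, then asking a solver for minimal subsets of the instance that still force the prediction. It is not the authors' encoding; the predicate names (feature/2, path_condition/3, path_class/2, keep/1), the toy tree, and the use of cardinality minimization (rather than subset minimality) are assumptions made for illustration. It assumes the clingo Python package is installed.

```python
# Sketch: enumerating "sufficient explanation"-style subsets for a toy
# decision tree with ASP, via the clingo Python API.
# Hypothetical encoding for illustration only; not the paper's encoding.
import clingo

PROGRAM = r"""
% Toy instance: two binary features with their observed values.
feature(x1, 1).
feature(x2, 0).

% A single decision-tree path predicting class "pos" when x1 = 1.
% (A real encoding would enumerate every root-to-leaf path of the model.)
path_condition(p1, x1, 1).
path_class(p1, pos).

% Guess which observed feature values to keep in the explanation.
{ keep(F) } :- feature(F, _).

% The kept subset must still entail the prediction: every condition on the
% predicting path must be fixed by a kept feature value.
:- path_condition(p1, F, _), not keep(F).

% Prefer small explanations (cardinality-minimal here, for simplicity).
#minimize { 1, F : keep(F) }.
#show keep/1.
"""

def main() -> None:
    # "0" enumerates all models; --opt-mode=optN enumerates all optimal ones.
    ctl = clingo.Control(["--opt-mode=optN", "0"])
    ctl.add("base", [], PROGRAM)
    ctl.ground([("base", [])])
    with ctl.solve(yield_=True) as handle:
        for model in handle:
            if model.optimality_proven:
                print("explanation:",
                      [str(s) for s in model.symbols(shown=True)])

if __name__ == "__main__":
    main()
```

On this toy instance the solver reports keep(x1) as the only minimal explanation, i.e. the observation x1 = 1 alone is sufficient for the positive prediction while x2 is irrelevant. The paper's point is that, unlike SAT-based pipelines, the same declarative program can be extended with preference statements and used to enumerate all explanations of a chosen type.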

Country of Origin
🇯🇵 Japan

Repos / Data Links

Page Count
18 pages

Category
Computer Science:
Artificial Intelligence