ABLE: Using Adversarial Pairs to Construct Local Models for Explaining Model Predictions
By: Krishna Khadka, Sunny Shree, Pujan Budhathoki, and more
Potential Business Impact:
Explains how "black box" computer decisions work.
Machine learning models are increasingly used in critical applications, yet most remain "black boxes" that offer little transparency. Local explanation approaches, such as LIME, address this issue by approximating the behavior of a complex model near a test instance with a simple, interpretable model. However, these approaches often suffer from instability and poor local fidelity. In this paper, we propose a novel approach called Adversarially Bracketed Local Explanation (ABLE) to address these limitations. Our approach first generates a set of neighborhood points near the test instance, x_test, by adding bounded Gaussian noise. For each neighborhood point D, we apply an adversarial attack to generate an adversarial point A with minimal perturbation whose label differs from D's. A second adversarial attack is then performed on A to generate a point A' with the same label as D (and thus a label different from A's). The points A and A' form an adversarial pair that brackets the local decision boundary for x_test. We then train a linear model on these adversarial pairs to approximate the local decision boundary. Experimental results on six UCI benchmark datasets across three deep neural network architectures demonstrate that our approach achieves higher stability and fidelity than the state of the art.
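As a rough illustration of the pipeline the abstract describes, the sketch below assumes a binary black-box classifier exposed as a predict_fn that returns hard labels. The random-direction line search is a hypothetical stand-in for the paper's minimal-perturbation adversarial attack, and all names and parameters (able_explain, find_label_flip, noise_scale, step) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the ABLE pipeline from the abstract (assumptions noted above).
import numpy as np
from sklearn.linear_model import LogisticRegression

def find_label_flip(predict_fn, x, y0, step, max_steps, rng):
    """Walk along a random direction in small steps until the predicted
    label flips. A coarse, hypothetical stand-in for the paper's
    minimal-perturbation adversarial attack."""
    direction = rng.normal(size=x.shape)
    direction /= np.linalg.norm(direction)
    cur = x.copy()
    for _ in range(max_steps):
        cur = cur + step * direction
        if predict_fn(cur[None, :])[0] != y0:
            return cur
    return None  # no label flip found within the step budget

def able_explain(predict_fn, x_test, n_points=50, noise_scale=0.1,
                 step=0.05, max_steps=200, seed=None):
    """Fit a linear local surrogate from adversarial pairs around x_test.

    predict_fn: black-box classifier mapping a (batch, d) array to hard labels.
    x_test: 1-D feature vector being explained.
    """
    rng = np.random.default_rng(seed)
    d = x_test.shape[0]
    pairs_X, pairs_y = [], []

    # 1. Neighborhood points via bounded Gaussian noise around x_test.
    noise = rng.normal(0.0, noise_scale, size=(n_points, d))
    noise = np.clip(noise, -3 * noise_scale, 3 * noise_scale)  # bound the noise
    neighbors = x_test + noise

    for D in neighbors:
        y_D = predict_fn(D[None, :])[0]

        # 2. First attack: perturb D until its label flips, giving A.
        A = find_label_flip(predict_fn, D, y_D, step, max_steps, rng)
        if A is None:
            continue
        y_A = predict_fn(A[None, :])[0]

        # 3. Second attack: perturb A until its label flips back, giving A'
        #    with the same label as D (binary setting). (A, A') brackets
        #    the local decision boundary near x_test.
        A_prime = find_label_flip(predict_fn, A, y_A, step, max_steps, rng)
        if A_prime is None:
            continue

        pairs_X.extend([A, A_prime])
        pairs_y.extend([y_A, y_D])

    if len(set(pairs_y)) < 2:
        return None  # not enough boundary-bracketing pairs were found

    # 4. A linear model trained on the adversarial pairs approximates the
    #    local decision boundary; its coefficients are the explanation.
    return LogisticRegression().fit(np.array(pairs_X), np.array(pairs_y))
```

Here predict_fn can wrap any classifier (e.g., lambda X: dnn.predict(X).argmax(1)), and the fitted surrogate's coef_ gives per-feature weights that serve as the local explanation near x_test.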
Similar Papers
SHLIME: Foiling adversarial attacks fooling SHAP and LIME
Machine Learning (CS)
Finds hidden unfairness in AI decisions.
ITL-LIME: Instance-Based Transfer Learning for Enhancing Local Explanations in Low-Resource Data Settings
Artificial Intelligence
Helps AI explain its decisions better with less data.