Adoption of Explainable Natural Language Processing: Perspectives from Industry and Academia on Practices and Challenges

Published: August 13, 2025 | arXiv ID: 2508.09786v1

By: Mahdi Dhaini, Tobias Müller, Roksoliana Rabets, and more

BigTech Affiliations: SAP

Potential Business Impact:

Helps people understand how AI makes decisions.

The field of explainable natural language processing (NLP) has grown rapidly in recent years. The growing opacity of complex models calls for transparency and explanations of their decisions, which are crucial for understanding their reasoning and facilitating deployment, especially in high-stakes environments. Despite increasing attention given to explainable NLP, practitioners' perspectives regarding its practical adoption and effectiveness remain underexplored. This paper addresses this research gap by investigating practitioners' experiences with explainability methods, focusing specifically on their motivations for adopting such methods, the techniques employed, satisfaction levels, and the practical challenges encountered in real-world NLP applications. Through a qualitative interview-based study with industry practitioners and complementary interviews with academic researchers, we systematically analyze and compare their perspectives. Our findings reveal conceptual gaps and low satisfaction with current explainability methods, and highlight evaluation challenges. They emphasize the need for clear definitions and user-centric frameworks to improve the adoption of explainable NLP in practice.

Country of Origin
🇩🇪 Germany

Page Count
12 pages

Category
Computer Science:
Computation and Language