Explainable Scene Understanding with Qualitative Representations and Graph Neural Networks
By: Nassim Belmecheri, Arnaud Gotlieb, Nadjib Lazaar and more
Potential Business Impact:
Helps self-driving cars understand and explain why other road users act the way they do.
This paper investigates the integration of graph neural networks (GNNs) with Qualitative Explainable Graphs (QXGs) for scene understanding in automated driving. Scene understanding is the basis for any further reactive or proactive decision-making. Scene understanding and the related reasoning are inherently explanation tasks: why is another traffic participant doing something, and what or who caused their actions? While previous work demonstrated the effectiveness of QXGs using shallow machine learning models, those approaches were limited to analysing single relation chains between object pairs, disregarding the broader scene context. We propose a novel GNN architecture that processes entire graph structures to identify relevant objects in traffic scenes. We evaluate our method on the nuScenes dataset enriched with DriveLM's human-annotated relevance labels. Experimental results show that our GNN-based approach achieves superior performance compared to baseline methods. The model effectively handles the inherent class imbalance in relevant-object identification while considering the complete spatial-temporal relationships between all objects in the scene. Our work demonstrates the potential of combining qualitative representations with deep learning approaches for explainable scene understanding in autonomous driving systems.
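To make the approach concrete, the sketch below shows one plausible way to frame relevant-object identification as node classification on a scene graph with a GNN, using PyTorch Geometric. This is not the authors' implementation: the layer choice (GCNConv), node feature dimensions, edge list, and the pos_weight value used to counter class imbalance are all illustrative assumptions.

```python
# Minimal sketch (assumed setup, not the paper's code): score every object
# (node) in a qualitative scene graph as relevant / not relevant.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv
from torch_geometric.data import Data


class RelevantObjectGNN(nn.Module):
    """Two-layer GNN producing a per-node relevance logit."""

    def __init__(self, in_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, 1)  # per-node relevance score

    def forward(self, x, edge_index):
        h = self.conv1(x, edge_index).relu()
        h = self.conv2(h, edge_index).relu()
        return self.head(h).squeeze(-1)


# Toy scene graph: 4 objects, qualitative spatio-temporal relations as edges.
x = torch.randn(4, 16)                      # hypothetical node features
edge_index = torch.tensor([[0, 1, 2, 3],    # source objects
                           [1, 0, 3, 2]])   # target objects
data = Data(x=x, edge_index=edge_index)
labels = torch.tensor([1., 0., 0., 0.])     # only one relevant object

model = RelevantObjectGNN(in_dim=16)
# pos_weight up-weights the rare positive class (few relevant objects per scene).
loss_fn = nn.BCEWithLogitsLoss(pos_weight=torch.tensor(3.0))
logits = model(data.x, data.edge_index)
loss = loss_fn(logits, labels)
loss.backward()
```

In this framing, the graph structure lets each object's relevance score depend on its qualitative relations to every other object in the scene, rather than on isolated object-pair chains.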
Similar Papers
Explaining Vision GNNs: A Semantic and Visual Analysis of Graph-based Image Classification
CV and Pattern Recognition
Shows how computers "see" pictures to make decisions.
Two Birds with One Stone: Enhancing Uncertainty Quantification and Interpretability with Graph Functional Neural Process
Machine Learning (CS)
Helps computers explain why they make graph decisions.
Enhancing Explainability of Graph Neural Networks Through Conceptual and Structural Analyses and Their Extensions
Artificial Intelligence
Explains how computer graphs make decisions.