CT-GRAPH: Hierarchical Graph Attention Network for Anatomy-Guided CT Report Generation
By: Hamza Kalisch, Fabian Hörst, Jens Kleesiek, et al.
Potential Business Impact:
Helps doctors write patient scan reports faster.
As medical imaging is central to diagnostic processes, automating the generation of radiology reports has become increasingly relevant to assist radiologists with their heavy workloads. Most current methods rely solely on global image features, failing to capture fine-grained organ relationships crucial for accurate reporting. To this end, we propose CT-GRAPH, a hierarchical graph attention network that explicitly models radiological knowledge by structuring anatomical regions into a graph, linking fine-grained organ features to coarser anatomical systems and a global patient context. Our method leverages pretrained 3D medical feature encoders to obtain global and organ-level features by utilizing anatomical masks. These features are further refined within the graph and then integrated into a large language model to generate detailed medical reports. We evaluate our approach for the task of report generation on the large-scale chest CT dataset CT-RATE. We provide an in-depth analysis of pretrained feature encoders for CT report generation and show that our method achieves a substantial improvement of absolute 7.9% in F1 score over current state-of-the-art methods. The code is publicly available at https://github.com/hakal104/CT-GRAPH.
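The hierarchical aggregation described above (organ-level features refining anatomical-system nodes, which in turn refine a global patient node) can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual implementation: the organ/system names, the single learned bilinear score matrix `w`, and the residual-style update are all illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(parent, children, w):
    # Score each child feature against the parent feature with a
    # bilinear form, normalize with softmax, and add the weighted
    # sum of children back onto the parent (residual-style update).
    scores = np.array([float(parent @ w @ c) for c in children])
    alpha = softmax(scores)
    return parent + sum(a * c for a, c in zip(alpha, children))

rng = np.random.default_rng(0)
d = 8  # toy feature dimension (real encoders would be much larger)

# Hypothetical anatomical hierarchy: organs grouped into systems.
organ_feats = {name: rng.normal(size=d) for name in
               ["left_lung", "right_lung", "heart", "aorta"]}
systems = {"respiratory": ["left_lung", "right_lung"],
           "cardiovascular": ["heart", "aorta"]}
w = rng.normal(size=(d, d)) / np.sqrt(d)

# Level 1: organ features refine each anatomical-system node.
system_feats = {}
for sys_name, organs in systems.items():
    init = np.mean([organ_feats[o] for o in organs], axis=0)
    system_feats[sys_name] = attention_aggregate(
        init, [organ_feats[o] for o in organs], w)

# Level 2: system nodes refine the global patient-context node.
global_init = np.mean(list(system_feats.values()), axis=0)
global_feat = attention_aggregate(global_init, list(system_feats.values()), w)
```

In CT-GRAPH, refined node features like `global_feat` would then be passed to a large language model as conditioning for report generation; here the example stops at the graph-refinement step.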
Similar Papers
Attention Maps in 3D Shape Classification for Dental Stage Estimation with Class Node Graph Attention Networks
CV and Pattern Recognition
Shows how computers understand tooth growth.
Graph Neural Networks for Surgical Scene Segmentation
CV and Pattern Recognition
Helps surgeons see hidden body parts during operations.
R2GenKG: Hierarchical Multi-modal Knowledge Graph for LLM-based Radiology Report Generation
CV and Pattern Recognition
Helps computers write better X-ray reports.