Advancing Fetal Ultrasound Image Quality Assessment in Low-Resource Settings
By: Dongli He, Hu Wang, Mohammad Yaqub
Potential Business Impact:
Helps doctors check on babies in the womb more accurately.
Accurate fetal biometric measurements, such as abdominal circumference, play a vital role in prenatal care. However, obtaining high-quality ultrasound images for these measurements heavily depends on the expertise of sonographers, posing a significant challenge in low-income countries due to the scarcity of trained personnel. To address this issue, we leverage FetalCLIP, a vision-language model pretrained on a curated dataset of over 210,000 fetal ultrasound image-caption pairs, to perform automated fetal ultrasound image quality assessment (IQA) on blind-sweep ultrasound data. We introduce FetalCLIP$_{CLS}$, an IQA model adapted from FetalCLIP using Low-Rank Adaptation (LoRA), and evaluate it on the ACOUSLIC-AI dataset against six CNN and Transformer baselines. FetalCLIP$_{CLS}$ achieves the highest F1 score of 0.757. Moreover, we show that an adapted segmentation model, when repurposed for classification, further improves performance, achieving an F1 score of 0.771. Our work demonstrates how parameter-efficient fine-tuning of fetal ultrasound foundation models can enable task-specific adaptations, advancing prenatal care in resource-limited settings. The experimental code is available at: https://github.com/donglihe-hub/FetalCLIP-IQA.
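To make the adaptation step concrete, below is a minimal sketch (not the authors' released implementation) of how a frozen CLIP-style image encoder can be wrapped with LoRA adapters and a linear head for ultrasound image quality classification. The encoder object, its embedding dimension, the number of quality classes, and the LoRA target module names are assumptions for illustration; the actual values come from the FetalCLIP checkpoint and the configuration in the linked repository.

```python
# Minimal sketch, assuming a ViT-style image encoder and the Hugging Face PEFT
# library; this is not the authors' released code.
import torch
import torch.nn as nn
from peft import LoraConfig, get_peft_model


class FetalIQAClassifier(nn.Module):
    """Pretrained vision encoder plus a linear head for image quality classification."""

    def __init__(self, image_encoder: nn.Module, embed_dim: int, num_classes: int = 2):
        super().__init__()
        self.image_encoder = image_encoder
        self.head = nn.Linear(embed_dim, num_classes)  # num_classes is an assumed setting

    def forward(self, pixel_values: torch.Tensor) -> torch.Tensor:
        features = self.image_encoder(pixel_values)  # (B, embed_dim) image embeddings
        return self.head(features)                   # (B, num_classes) quality logits


def build_lora_iqa_model(image_encoder: nn.Module, embed_dim: int) -> FetalIQAClassifier:
    # Freeze the pretrained encoder; only the LoRA adapters and the head are trained.
    for p in image_encoder.parameters():
        p.requires_grad = False

    lora_cfg = LoraConfig(
        r=16,                    # assumed rank; the paper's exact setting may differ
        lora_alpha=32,
        lora_dropout=0.1,
        target_modules=["qkv"],  # assumed name of the attention projection layers in a ViT encoder
    )
    image_encoder = get_peft_model(image_encoder, lora_cfg)
    return FetalIQAClassifier(image_encoder, embed_dim)
```

In this setup only the low-rank adapter matrices and the classification head carry gradients, which is what makes the fine-tuning parameter-efficient; training would then proceed with a standard classification loss over the quality labels.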
Similar Papers
FetalCLIP: A Visual-Language Foundation Model for Fetal Ultrasound Image Analysis
Image and Video Processing
Helps doctors spot baby problems with ultrasound pictures.
Automatic Quality Assessment of First Trimester Crown-Rump-Length Ultrasound Images
CV and Pattern Recognition
Helps doctors measure baby's age accurately from scans.
Towards Objective Obstetric Ultrasound Assessment: Contrastive Representation Learning for Fetal Movement Detection
CV and Pattern Recognition
Helps doctors watch babies move in the womb.