Where is the Boundary: Multimodal Sensor Fusion Test Bench for Tissue Boundary Delineation
By: Zacharias Chen, Alexa Cristelle Cahilig, Sarah Dias, and others
Potential Business Impact:
Helps robots feel and see tumors during surgery.
Robot-assisted neurological surgery is receiving growing interest because the improved dexterity, precision, and control of surgical tools lead to better patient outcomes. However, such systems often limit surgeons' natural sensory feedback, which is crucial for identifying tissues -- particularly in oncological procedures, where distinguishing between healthy and tumorous tissue is vital. While imaging and force sensing have partially addressed this lack of sensory feedback, limited research has explored multimodal sensing for accurate tissue boundary delineation. We present a user-friendly, modular test bench designed to evaluate and integrate complementary multimodal sensors for tissue identification. Our proposed system first uses vision-based guidance to estimate boundary locations, which are then refined using data acquired by contact microphones and a force sensor. Real-time data acquisition and visualization are supported via an interactive graphical interface. Experimental results demonstrate that multimodal fusion significantly improves material classification accuracy. The platform provides a scalable hardware-software solution for exploring sensor fusion in surgical applications and demonstrates the potential of multimodal approaches for real-time tissue boundary delineation.
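The abstract describes a coarse-to-fine pipeline: a vision-based boundary estimate refined by contact-microphone and force-sensor data. A minimal sketch of one common way to realize this kind of multimodal fusion is weighted feature concatenation followed by a simple classifier. All function names, feature values, weights, and class centroids below are hypothetical illustrations, not the authors' actual method or data.

```python
# Hypothetical sketch of late fusion for tissue classification: per-modality
# feature vectors (vision, audio, force) are scaled by modality weights,
# concatenated, and classified with a nearest-centroid rule. All numbers
# are synthetic; the paper's actual pipeline and data are not shown here.
import math


def fuse_features(vision, audio, force, weights=(0.4, 0.3, 0.3)):
    """Concatenate per-modality features, each scaled by a modality weight."""
    wv, wa, wf = weights
    return ([wv * x for x in vision]
            + [wa * x for x in audio]
            + [wf * x for x in force])


def classify(sample, centroids):
    """Assign the label of the nearest centroid in the fused feature space."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))


# Synthetic "healthy" vs. "tumorous" centroids in the fused feature space.
centroids = {
    "healthy": fuse_features([0.2, 0.1], [0.3], [0.1]),
    "tumorous": fuse_features([0.8, 0.9], [0.7], [0.9]),
}

# A measurement whose vision features alone are ambiguous (0.5, 0.5), but
# whose contact-audio and force features pull it toward "tumorous".
sample = fuse_features([0.5, 0.5], [0.7], [0.85])
print(classify(sample, centroids))  # → tumorous
```

In this toy setup the vision features sit exactly between the two centroids, so the audio and force modalities decide the label: the same refinement role the abstract assigns to the contact microphones and force sensor.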
Similar Papers
Touching the tumor boundary: A pilot study on ultrasound based virtual fixtures for breast-conserving surgery
Robotics
Helps surgeons cut out cancer more precisely.
Estimation of Tissue Deformation and Interactive Force in Robotic Surgery through Vision-based Learning
Systems and Control
Helps surgeons feel and see inside bodies better.
Design and Benchmarking of A Multi-Modality Sensor for Robotic Manipulation with GAN-Based Cross-Modality Interpretation
Robotics
Lets robots feel and see objects better.