Palpation Alters Auditory Pain Expressions with Gender-Specific Variations in Robopatients
By: Chapa Sirithunge, Yue Xie, Saitarun Nadipineni, and more
Potential Business Impact:
Robots learn to make realistic pain sounds for training.
Diagnostic errors remain a major cause of preventable deaths, particularly in resource-limited regions. Medical training simulators, including robopatients, play a vital role in reducing these errors by mimicking real patients during procedural training such as palpation. However, generating multimodal feedback, especially auditory pain expressions, remains challenging because of the complex relationship between palpation behavior and sound, and the high-dimensional nature of pain sounds makes the space difficult to explore with conventional methods. This study introduces a novel experimental paradigm for pain expressivity in robopatients: the robot dynamically generates auditory pain expressions in response to palpation force, optimized against human feedback using machine learning. Using Proximal Policy Optimization (PPO), a reinforcement learning (RL) technique suited to continuous adaptation, the robot iteratively refines its pain sounds based on real-time human feedback. The robot is initialized with randomized pain responses to palpation forces, and the RL agent learns to adjust these sounds to align with human preferences. The results demonstrate that the system adapts to an individual's palpation forces and sound preferences, capturing a broad spectrum of pain intensity, from mild discomfort to acute distress, through RL-guided exploration of the auditory pain space. The study further showed that pain-sound perception saturates at lower forces, with gender-specific thresholds. These findings highlight the system's potential to enhance abdominal palpation training by offering a controllable and immersive simulation platform.
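The core loop described above, sample a pain sound for a given palpation force, collect a human rating, and update the sound-generating policy, can be sketched in miniature. This is a hedged illustration, not the paper's implementation: the paper uses PPO over a richer auditory space, while the sketch below substitutes a simpler REINFORCE-style policy-gradient update on a single hypothetical "intensity" parameter, and replaces the human rater with a simulated preference function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a linear-Gaussian policy maps a normalized palpation
# force (0..1) to a pain-sound "intensity" parameter:
#   intensity ~ N(w * force + b, sigma^2)
w, b, sigma = 0.0, 0.0, 0.3
lr = 0.05

def simulated_human_feedback(force, intensity):
    """Stand-in for the human rating: prefers sounds whose intensity
    matches the applied force (higher reward = better match)."""
    return -(intensity - force) ** 2

for step in range(3000):
    force = rng.uniform(0.0, 1.0)                     # palpation force this trial
    mean = w * force + b                              # policy mean
    intensity = mean + sigma * rng.standard_normal()  # sampled sound parameter
    reward = simulated_human_feedback(force, intensity)
    # REINFORCE: gradient of log N(intensity; mean, sigma^2) w.r.t. w and b
    g = (intensity - mean) / sigma**2
    w += lr * reward * g * force
    b += lr * reward * g

# After training, the policy mean tracks the force: stronger palpation
# yields a more intense pain sound.
print(round(w, 2), round(b, 2))
```

The design choice mirrors the paper's framing: exploration noise (the Gaussian sampling) lets the agent probe the auditory space, while the human-feedback reward shapes the mapping from force to sound. PPO adds clipped surrogate objectives and value baselines on top of this basic idea to stabilize continuous adaptation.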
Similar Papers
Auditory-Tactile Congruence for Synthesis of Adaptive Pain Expressions in RoboPatients
Robotics
Teaches doctors to find sickness using fake patients.
Toward Artificial Palpation: Representation Learning of Touch on Soft Bodies
Machine Learning (CS)
Robot doctors feel and see inside bodies.
Differential Analysis of Pseudo Haptic Feedback: Novel Comparative Study of Visual and Auditory Cue Integration for Psychophysical Evaluation
Human-Computer Interaction
Makes screens feel bumpy with sound and pictures.