Evaluating the Pre-Dressing Step: Unfolding Medical Garments Via Imitation Learning
By: David Blanco-Mulero, Júlia Borràs, Carme Torras
Potential Business Impact:
Robots learn to unfold clothes for easier dressing.
Robotic-assisted dressing has the potential to significantly aid both patients and healthcare personnel, reducing workload and improving efficiency in clinical settings. While substantial progress has been made in robotic dressing assistance, prior works typically assume that garments are already unfolded and ready for use. However, in medical applications, gowns and aprons are often stored in a folded configuration, requiring an additional unfolding step. In this paper, we introduce the pre-dressing step: the process of unfolding garments prior to assisted dressing. We leverage imitation learning to learn three manipulation primitives, covering both high- and low-acceleration motions. In addition, we employ a visual classifier to categorise the garment state as closed, partly opened, or fully opened. We conduct an empirical evaluation of the learned manipulation primitives as well as their combinations. Our results show that highly dynamic motions are not effective for unfolding freshly unpacked garments, whereas a combination of motions can effectively enhance the opening configuration.
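To make the pipeline described in the abstract concrete, below is a minimal illustrative sketch of how a three-class garment-state classifier could gate a sequence of learned unfolding primitives. It is not the authors' implementation: the ResNet-18 backbone, the primitive names ("shake", "twist", "fling"), and the robot/camera interfaces are all assumptions made for illustration.

# Hypothetical sketch: combine unfolding primitives with a garment-state
# classifier that stops once the gown is fully opened. Primitive names and
# the robot/camera interfaces are placeholders, not the paper's code.
from enum import Enum

import torch
import torch.nn as nn
import torchvision.models as models


class GarmentState(Enum):
    CLOSED = 0
    PARTLY_OPENED = 1
    FULLY_OPENED = 2


def build_state_classifier(num_classes: int = 3) -> nn.Module:
    """ResNet-18 backbone with a 3-way head (closed / partly / fully opened)."""
    net = models.resnet18(weights=None)  # pretrained weights optional
    net.fc = nn.Linear(net.fc.in_features, num_classes)
    return net


@torch.no_grad()
def classify_state(net: nn.Module, rgb: torch.Tensor) -> GarmentState:
    """rgb: (3, H, W) image tensor, normalised to the classifier's training stats."""
    logits = net(rgb.unsqueeze(0))  # add batch dimension
    return GarmentState(int(logits.argmax(dim=1).item()))


def pre_dressing_unfold(robot, camera, net, primitives, max_attempts=5):
    """Execute unfolding primitives in sequence until the garment is open.

    `robot.execute(name)` and `camera.capture()` are placeholder interfaces;
    `primitives` is an ordered list such as ["shake", "twist", "fling"].
    """
    for attempt in range(max_attempts):
        state = classify_state(net, camera.capture())
        if state is GarmentState.FULLY_OPENED:
            return True
        # Cycle through the primitives; mixing low- and high-acceleration
        # motions is what the paper reports as improving the opening.
        robot.execute(primitives[attempt % len(primitives)])
    return classify_state(net, camera.capture()) is GarmentState.FULLY_OPENED

In practice, the classifier's "partly opened" label could also be used to select which primitive to apply next rather than simply cycling through them; the fixed rotation shown here is only the simplest possible policy.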
Similar Papers
Bimanual Robot-Assisted Dressing: A Spherical Coordinate-Based Strategy for Tight-Fitting Garments
Robotics
Robots help people dress in tight clothes.
Force-Modulated Visual Policy for Robot-Assisted Dressing with Arm Motions
Robotics
Robot helps people dress by moving with them.
Robotic Automation in Apparel Manufacturing: A Novel Approach to Fabric Handling and Sewing
Robotics
Robots can now sew clothes by making fabric stiff.