Multimodal Spiking Neural Network for Space Robotic Manipulation
By: Liwen Zhang, Dong Zhou, Shibo Shao, and more
Potential Business Impact:
Robots in space learn to grab and move things.
This paper presents a multimodal control framework based on spiking neural networks (SNNs) for robotic arms aboard space stations. The framework is designed to cope with limited onboard computing resources while enabling autonomous manipulation and material transfer during space operations. By fusing geometric state with tactile and semantic information, it strengthens environmental awareness and supports more robust control strategies. To guide learning progressively, a dual-channel, three-stage curriculum reinforcement learning (CRL) scheme is integrated into the system. The framework was validated on wall-mounted robotic arms across a range of tasks, including target approach, object grasping, and stable lifting, and consistently outperformed baseline approaches in both task success rate and energy efficiency. These findings highlight its suitability for real-world aerospace applications.
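The paper's exact architecture is not spelled out here, but the general idea (a separate spiking encoder per modality, fused into a single policy readout) can be sketched. Below is a minimal PyTorch illustration: all layer sizes, the leaky integrate-and-fire (LIF) dynamics, the rate-coded readout, and the class names `LIFLayer` and `MultimodalSNNPolicy` are assumptions for exposition, not the authors' design.

```python
import torch
import torch.nn as nn

class LIFLayer(nn.Module):
    """Leaky integrate-and-fire layer: a linear projection feeding a leaky
    membrane that emits a binary spike when it crosses threshold.
    (Illustrative dynamics; the paper's neuron model may differ.)"""

    def __init__(self, in_features, out_features, beta=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.beta = beta            # membrane leak factor per timestep
        self.threshold = threshold  # spike threshold

    def forward(self, x, mem):
        mem = self.beta * mem + self.fc(x)     # leak, then integrate input
        spk = (mem >= self.threshold).float()  # fire where threshold is crossed
        mem = mem - spk * self.threshold       # soft reset after spiking
        return spk, mem


class MultimodalSNNPolicy(nn.Module):
    """Hypothetical fusion of geometric, tactile, and semantic features
    through per-modality spiking encoders and a shared action readout.
    All dimensions are placeholders, not values from the paper."""

    def __init__(self, geo_dim=14, tac_dim=6, sem_dim=32,
                 hidden=128, act_dim=7, steps=8):
        super().__init__()
        self.hidden, self.steps = hidden, steps
        self.enc_geo = LIFLayer(geo_dim, hidden)
        self.enc_tac = LIFLayer(tac_dim, hidden)
        self.enc_sem = LIFLayer(sem_dim, hidden)
        self.head = nn.Linear(3 * hidden, act_dim)  # e.g. joint-velocity command

    def forward(self, geo, tac, sem):
        batch = geo.shape[0]
        mems = [torch.zeros(batch, self.hidden) for _ in range(3)]
        acc = torch.zeros(batch, 3 * self.hidden)   # spike-rate accumulator
        for _ in range(self.steps):                 # unroll SNN over time
            s_g, mems[0] = self.enc_geo(geo, mems[0])
            s_t, mems[1] = self.enc_tac(tac, mems[1])
            s_s, mems[2] = self.enc_sem(sem, mems[2])
            acc = acc + torch.cat([s_g, s_t, s_s], dim=1)
        return self.head(acc / self.steps)          # rate-coded action output


if __name__ == "__main__":
    policy = MultimodalSNNPolicy()
    action = policy(torch.randn(1, 14), torch.randn(1, 6), torch.randn(1, 32))
    print(action.shape)  # torch.Size([1, 7])
```

In a curriculum RL setup of the kind the abstract describes, a policy like this would presumably be trained stage by stage, with each stage (approach, grasp, lift) relaxing the task difficulty schedule before the next begins; the staging logic itself is outside this sketch.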
Similar Papers
Fully Spiking Actor-Critic Neural Network for Robotic Manipulation
Robotics
Robots learn to grab things faster, using less power.
CBMC-V3: A CNS-inspired Control Framework Towards Manipulation Agility with SNN
Robotics
Robots move more smoothly and quickly.
Spiking Neural Networks for Continuous Control via End-to-End Model-Based Learning
Robotics
Robots learn to move arms smoothly and accurately.