Adaptive Science Operations in Deep Space Missions Using Offline Belief State Planning
By: Grace Ra Kim, Hailey Warner, Duncan Eddy, and more
Potential Business Impact:
Spacecraft pick best science tools automatically.
Deep space missions face extreme communication delays and environmental uncertainty that prevent real-time ground operations. To support autonomous science operations in communication-constrained environments, we present a partially observable Markov decision process (POMDP) framework that adaptively sequences spacecraft science instruments. We integrate a Bayesian network into the POMDP observation space to manage the high-dimensional and uncertain measurements typical of astrobiology missions. This network compactly encodes dependencies among measurements and improves the interpretability and computational tractability of science data. Instrument operation policies are computed offline, allowing resource-aware plans to be generated and thoroughly validated prior to launch. We use the Enceladus Orbilander's proposed Life Detection Suite (LDS) as a case study, demonstrating how Bayesian network structure and reward shaping influence system performance. We compare our method against the mission's baseline Concept of Operations (ConOps), evaluating both misclassification rates and performance in off-nominal sample accumulation scenarios. Our approach reduces sample identification errors by nearly 40%.
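To make the core idea concrete, the sketch below shows offline belief-state planning for a toy instrument-sequencing problem. It is not the paper's model: the hidden state, instrument accuracy, costs, and rewards are all illustrative assumptions, and the paper's Bayesian-network observation model is replaced here by a single Bernoulli sensor. The spacecraft maintains a belief P(biotic) over the sample, and a policy over a discretized belief grid is computed offline by value iteration, exactly the "compute before launch, execute autonomously" pattern the abstract describes.

```python
# Toy offline belief-state planner (illustrative only; not the paper's LDS model).
# Hidden state: sample is "biotic" or "abiotic". Actions: run a noisy instrument,
# or commit to a classification. All numeric values are assumptions.
STATES = ("biotic", "abiotic")
P_CORRECT = 0.8       # assumed instrument accuracy
MEASURE_COST = -1.0   # assumed resource cost per measurement
REWARD_RIGHT, REWARD_WRONG = 10.0, -50.0  # assumed terminal rewards
GAMMA = 0.95
N_GRID = 101          # discretize P(biotic) into {0.00, 0.01, ..., 1.00}

def belief_update(b, obs):
    """Bayes update of P(biotic) after observing 'biotic' or 'abiotic'."""
    like_b = P_CORRECT if obs == "biotic" else 1 - P_CORRECT
    like_a = 1 - P_CORRECT if obs == "biotic" else P_CORRECT
    num = like_b * b
    return num / (num + like_a * (1 - b))

def solve(n_iter=200):
    """Offline value iteration over the discretized belief simplex."""
    grid = [i / (N_GRID - 1) for i in range(N_GRID)]
    snap = lambda b: min(range(N_GRID), key=lambda i: abs(grid[i] - b))
    V = [0.0] * N_GRID
    policy = [None] * N_GRID
    for _ in range(n_iter):
        new_V = []
        for i, b in enumerate(grid):
            # Terminal declaration values.
            q_db = b * REWARD_RIGHT + (1 - b) * REWARD_WRONG
            q_da = (1 - b) * REWARD_RIGHT + b * REWARD_WRONG
            # Measurement value: expectation over the two possible observations.
            p_obs_b = b * P_CORRECT + (1 - b) * (1 - P_CORRECT)
            q_m = MEASURE_COST + GAMMA * (
                p_obs_b * V[snap(belief_update(b, "biotic"))]
                + (1 - p_obs_b) * V[snap(belief_update(b, "abiotic"))]
            )
            qs = {"measure": q_m, "declare_biotic": q_db, "declare_abiotic": q_da}
            best = max(qs, key=qs.get)
            policy[i] = best
            new_V.append(qs[best])
        V = new_V
    return grid, policy

grid, policy = solve()
```

The resulting lookup table (`policy`) can be validated exhaustively before flight and executed on board with trivial compute: when the belief is near 0.5 the policy keeps measuring, and it only declares once the belief is confident enough to justify the asymmetric misclassification penalty. The paper's full framework generalizes this pattern to many instruments with a Bayesian network tying their observations together.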
Similar Papers
Belief-Conditioned One-Step Diffusion: Real-Time Trajectory Planning with Just-Enough Sensing
Robotics
Robots use less power by turning sensors on/off.
Verifiable Mission Planning For Space Operations
Systems and Control
Keeps space missions safe while doing their job.
Markov Decision Processes for Satellite Maneuver Planning and Collision Avoidance
Robotics
Saves satellite fuel by planning smarter moves.