Terrain-Adaptive Mobile 3D Printing with Hierarchical Control
By: Shuangshan Nors Li, J. Nathan Kutz
Mobile 3D printing on unstructured terrain remains challenging due to the conflict between platform mobility and deposition precision. Existing gantry-based systems achieve high accuracy but lack mobility, while mobile platforms struggle to maintain print quality on uneven ground. We present a framework that tightly integrates AI-driven disturbance prediction with multi-modal sensor fusion and hierarchical hardware control, forming a closed-loop perception-learning-actuation system. The AI module learns terrain-to-perturbation mappings from IMU, vision, and depth sensors, enabling proactive compensation rather than reactive correction. This intelligence is embedded into a three-layer control architecture: path planning, predictive chassis-manipulator coordination, and precision hardware execution. Through outdoor experiments on terrain with slopes and surface irregularities, we demonstrate sub-centimeter printing accuracy while maintaining full platform mobility. This AI-hardware integration establishes a practical foundation for autonomous construction in unstructured environments.
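The key idea in the abstract — predicting terrain-induced perturbations from sensor data and compensating the nozzle setpoint *before* the disturbance arrives, rather than correcting after the fact — can be sketched as a toy closed loop. Everything below is a hypothetical illustration: the layer names, the linear predictor, and all gains are assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class SensorSample:
    """Fused sensor reading (illustrative fields only)."""
    imu_tilt: float    # platform tilt from the IMU (radians)
    depth_bump: float  # terrain height deviation ahead, from depth camera (meters)

def predict_perturbation(sample: SensorSample) -> float:
    """Stand-in for the learned terrain-to-perturbation map.
    A linear model with made-up gains replaces the AI module here."""
    return 0.8 * sample.imu_tilt + 0.5 * sample.depth_bump

def coordinate_layers(target_z: float, sample: SensorSample) -> float:
    """Middle layer (chassis-manipulator coordination): offset the nozzle
    setpoint by the predicted disturbance -- proactive compensation."""
    return target_z - predict_perturbation(sample)

def execute(setpoint_z: float, actual_disturbance: float) -> float:
    """Bottom layer: hardware tracks the setpoint, then the terrain
    disturbance shifts the nozzle by `actual_disturbance`."""
    return setpoint_z + actual_disturbance

# Compare proactive vs. purely reactive behavior on the same disturbances,
# assuming (optimistically) a perfect predictor.
target = 0.10  # desired nozzle height above the print surface (m)
for s in [SensorSample(0.02, 0.01), SensorSample(-0.03, 0.00)]:
    d = predict_perturbation(s)
    proactive_error = execute(coordinate_layers(target, s), d) - target
    reactive_error = execute(target, d) - target
    print(round(proactive_error, 6), round(reactive_error, 6))
```

With a perfect predictor the proactive error is zero by construction, while the uncompensated nozzle drifts by the full disturbance; in practice the residual error would reflect the predictor's accuracy and actuator latency.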
Similar Papers
An adaptive hierarchical control framework for quadrupedal robots in planetary exploration
Robotics
Lets robots walk on any planet surface.
Gait-Adaptive Perceptive Humanoid Locomotion with Real-Time Under-Base Terrain Reconstruction
Robotics
Robots can now climb stairs and cross gaps.
ProTerrain: Probabilistic Physics-Informed Rough Terrain World Modeling
Robotics
Helps robots safely drive on bumpy ground.