AgriGS-SLAM: Orchard Mapping Across Seasons via Multi-View Gaussian Splatting SLAM
By: Mirko Usuelli, David Rapado-Rincon, Gert Kootstra, and more
Potential Business Impact:
Robots see orchards better, even in wind.
Autonomous robots in orchards require real-time 3D scene understanding despite repetitive row geometry, seasonal appearance changes, and wind-driven foliage motion. We present AgriGS-SLAM, a Visual-LiDAR SLAM framework that couples direct LiDAR odometry and loop closures with multi-camera 3D Gaussian Splatting (3DGS) rendering. Batch rasterization across complementary viewpoints recovers orchard structure under occlusions, while a unified gradient-driven map lifecycle executed between keyframes preserves fine details and bounds memory. Pose refinement is guided by a probabilistic LiDAR-based depth-consistency term, back-propagated through the camera projection to tighten the coupling between geometry and appearance. We deploy the system on a field platform in apple and pear orchards across dormancy, flowering, and harvesting, using a standardized trajectory protocol that evaluates both training-view and novel-view synthesis to reduce 3DGS overfitting in evaluation. Across seasons and sites, AgriGS-SLAM delivers sharper, more stable reconstructions and steadier trajectories than recent state-of-the-art 3DGS-SLAM baselines while maintaining real-time performance on the tractor. While demonstrated in orchard monitoring, the approach can be applied to other outdoor domains requiring robust multimodal perception.
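To make the depth-consistency idea concrete, here is a minimal, self-contained sketch (not the paper's implementation): LiDAR points are projected through a pinhole camera under a candidate pose, their depths are compared against the depth a renderer would return at those pixels, and the inverse-variance-weighted residual is minimized with respect to the pose. The intrinsics, the planar toy scene, the translation-only pose model, and the finite-difference gradient (standing in for back-propagation through the projection) are all simplifying assumptions made for illustration.

```python
import numpy as np

# Assumed pinhole intrinsics (illustrative values, not from the paper).
FX, FY, CX, CY = 500.0, 500.0, 320.0, 240.0
K = np.array([[FX, 0.0, CX], [0.0, FY, CY], [0.0, 0.0, 1.0]])

# Toy "rendered" scene: a tilted plane z = a + b*x in the camera frame,
# standing in for the depth map a 3DGS renderer would rasterize.
A_PLANE, B_PLANE = 3.0, 0.2

def project(points_cam):
    """Project camera-frame 3D points to pixels; also return their depths."""
    z = points_cam[:, 2]
    uv = (K @ (points_cam / z[:, None]).T).T[:, :2]
    return uv, z

def rendered_depth(uv):
    """Depth of the toy plane along the ray through each pixel."""
    return A_PLANE / (1.0 - B_PLANE * (uv[:, 0] - CX) / FX)

def depth_consistency_loss(t, lidar_pts, sigma=0.1):
    """Inverse-variance-weighted (Gaussian) depth residual for offset t."""
    pts = lidar_pts + t              # simplified pose: translation only
    uv, z_pts = project(pts)
    r = rendered_depth(uv) - z_pts   # rendered vs. projected LiDAR depth
    return np.mean(r**2) / (2.0 * sigma**2)

def refine_translation(t0, lidar_pts, lr=5e-3, iters=500, eps=1e-6):
    """Gradient descent with central finite differences (autodiff stand-in)."""
    t = t0.astype(float).copy()
    for _ in range(iters):
        g = np.zeros(3)
        for i in range(3):
            d = np.zeros(3); d[i] = eps
            g[i] = (depth_consistency_loss(t + d, lidar_pts)
                    - depth_consistency_loss(t - d, lidar_pts)) / (2 * eps)
        t -= lr * g
    return t

# Synthetic LiDAR scan: plane points, displaced by an unknown offset.
x = np.linspace(-1.0, 1.0, 50)
plane_pts = np.stack([x, np.zeros_like(x), A_PLANE + B_PLANE * x], axis=1)
t_true = np.array([0.1, 0.0, 0.2])
lidar_pts = plane_pts - t_true

loss_init = depth_consistency_loss(np.zeros(3), lidar_pts)
t_est = refine_translation(np.zeros(3), lidar_pts)
loss_final = depth_consistency_loss(t_est, lidar_pts)
# t_est approaches t_true as the depth residual is driven toward zero.
```

Because the residual depends on the pose both through the point depths and through the pixel coordinates where the rendered depth is sampled, the gradient genuinely flows through the camera projection, which is the coupling the abstract describes.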
Similar Papers
A Survey on Collaborative SLAM with 3D Gaussian Splatting
Robotics
Robots map places together faster and better.
Rad-GS: Radar-Vision Integration for 3D Gaussian Splatting SLAM in Outdoor Environments
CV and Pattern Recognition
Maps large outdoor areas accurately with radar.
GigaSLAM: Large-Scale Monocular SLAM with Hierarchical Gaussian Splats
Robotics
Maps huge outdoor areas for robots.