NeuroLoc: Encoding Navigation Cells for 6-DOF Camera Localization
By: Xun Li, Jian Yang, Fenli Jia, et al.
Potential Business Impact:
Helps robots find their way using a brain-like system.
Recently, camera localization has been widely adopted in autonomous robotic navigation due to its efficiency and convenience. However, camera localization for autonomous navigation in unknown environments often suffers from scene ambiguity, environmental disturbances, and dynamic objects. To address these problems, inspired by the brain's biological navigation mechanisms (grid cells, place cells, and head direction cells), we propose a novel neurobiologically inspired camera localization method, NeuroLoc. First, we design a Hebbian learning module driven by place cells to store and replay historical information, restoring the details of historical representations and mitigating scene ambiguity. Second, we use head-direction-cell-inspired internal direction learning as a multi-head attention embedding to help recover the true orientation in visually similar scenes. Finally, we add a 3D grid-center prediction to the pose regression module to reduce erroneous final predictions. We evaluate the proposed NeuroLoc on commonly used indoor and outdoor benchmark datasets. The experimental results show that NeuroLoc enhances robustness in complex environments and improves pose-regression performance using only a single image.
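To make the two biologically inspired components more concrete, the sketch below shows (1) a Hebbian associative memory that stores and "replays" feature patterns, loosely analogous to the place-cell-driven module described above, and (2) a toy pose-regression head that outputs a translation, a unit quaternion, and an auxiliary 3D grid-center prediction. This is a minimal illustration of the general ideas only; the class and weight names are hypothetical, and the paper's actual architecture is not reproduced here.

```python
import numpy as np

class HebbianMemory:
    """Associative memory with a Hebbian outer-product update. Stored
    patterns can later be recalled from noisy or ambiguous queries - a
    rough analogue of place-cell-driven replay (an assumption for
    illustration, not the authors' implementation)."""

    def __init__(self, dim, lr=1.0):
        self.W = np.zeros((dim, dim))  # synaptic weight matrix
        self.lr = lr                   # Hebbian learning rate

    def store(self, x):
        # Hebbian rule: strengthen connections between co-active units.
        x = x / (np.linalg.norm(x) + 1e-8)
        self.W += self.lr * np.outer(x, x)

    def replay(self, query):
        # One associative recall step pulls the query toward the
        # closest stored pattern, restoring lost detail.
        y = self.W @ query
        return y / (np.linalg.norm(y) + 1e-8)


def pose_head(feat, W_t, W_q, W_c):
    """Toy pose-regression head: translation t, rotation quaternion q,
    and an auxiliary 3D grid-center c that coarsely localizes the camera
    (the grid-cell idea from the abstract; weights are hypothetical)."""
    t = W_t @ feat                       # 3-DOF translation
    q = W_q @ feat
    q = q / (np.linalg.norm(q) + 1e-8)   # unit quaternion (3-DOF rotation)
    c = W_c @ feat                       # coarse 3D grid center
    return t, q, c
```

For example, storing a feature vector and then querying the memory with a noisy version of it recovers a pattern close to the original, which is the sense in which replay can "restore the details of historical representations" for ambiguous scenes.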
Similar Papers
Endowing Embodied Agents with Spatial Reasoning Capabilities for Vision-and-Language Navigation
Artificial Intelligence
Helps robots see and move without getting lost.
Vision-Based Localization and LLM-based Navigation for Indoor Environments
Machine Learning (CS)
Guides you indoors using phone camera and AI.
EDEN: Entorhinal Driven Egocentric Navigation Toward Robotic Deployment
Robotics
Helps robots navigate like a brain.