Uncertainty-driven Adaptive Exploration
By: Leonidas Bakopoulos, Georgios Chalkiadakis
Potential Business Impact:
Teaches robots when to try new things.
Adaptive exploration methods learn complex policies by alternating between exploration and exploitation. A key question for such methods is when to switch from exploration to exploitation and back, which is critical in domains that require learning long and complex sequences of actions. In this work, we present a generic adaptive exploration framework that uses uncertainty to address this question in a principled manner. Our framework includes previous adaptive exploration approaches as special cases, and it can incorporate any uncertainty-measuring mechanism of choice, such as those used in intrinsic motivation or epistemic uncertainty-based exploration methods. We demonstrate experimentally that our framework gives rise to adaptive exploration strategies that outperform standard ones across several MuJoCo environments.
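To make the core idea concrete, here is a minimal, hypothetical sketch of uncertainty-triggered switching between exploration and exploitation. The toy chain environment, the ensemble-disagreement uncertainty measure, the random exploration policy, and the switching threshold are all illustrative assumptions, not the authors' actual framework or MuJoCo setup.

```python
# Illustrative sketch (assumptions, not the paper's implementation):
# an agent exploits greedily while epistemic uncertainty at the current
# state is low, and switches to an exploration policy when it is high.
import numpy as np

rng = np.random.default_rng(0)

N_STATES, N_ACTIONS = 10, 2           # toy chain MDP: move left/right
ENSEMBLE = 5                          # bootstrapped Q-tables give an epistemic-uncertainty proxy
ALPHA, GAMMA = 0.1, 0.95
SWITCH_THRESHOLD = 0.05               # disagreement level that triggers exploration (assumed)

# Small random initialization so ensemble members disagree at the start.
Q = rng.normal(0.0, 0.1, size=(ENSEMBLE, N_STATES, N_ACTIONS))


def step(state, action):
    """Chain dynamics: reward 1 only at the right end, 0 elsewhere."""
    next_state = min(state + 1, N_STATES - 1) if action == 1 else max(state - 1, 0)
    reward = 1.0 if next_state == N_STATES - 1 else 0.0
    done = next_state == N_STATES - 1
    return next_state, reward, done


def uncertainty(state):
    """Epistemic proxy: spread of greedy values across the ensemble."""
    greedy_values = Q[:, state, :].max(axis=1)
    return greedy_values.std()


def select_action(state, exploring):
    if exploring:
        return int(rng.integers(N_ACTIONS))       # exploration policy (random here)
    return int(Q.mean(axis=0)[state].argmax())    # exploitation: greedy on the mean Q


for episode in range(300):
    state = 0
    for _ in range(50):                           # step cap keeps the sketch finite
        # The uncertainty estimate decides which mode the agent is in.
        exploring = uncertainty(state) > SWITCH_THRESHOLD
        action = select_action(state, exploring)
        next_state, reward, done = step(state, action)
        # Each ensemble member is updated with probability 0.5 (bootstrapping).
        for k in range(ENSEMBLE):
            if rng.random() < 0.5:
                target = reward + GAMMA * (0.0 if done else Q[k, next_state].max())
                Q[k, state, action] += ALPHA * (target - Q[k, state, action])
        state = next_state
        if done:
            break

print("greedy state-value estimates:", Q.mean(axis=0).max(axis=1).round(2))
```

In this sketch the ensemble disagreement stands in for whatever uncertainty-measuring mechanism one plugs in; as uncertainty in a region drops, the agent automatically shifts from exploring to exploiting there, which is the switching behavior the abstract describes.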
Similar Papers
An Adaptive, Data-Integrated Agent-Based Modeling Framework for Explainable and Contestable Policy Design
Multiagent Systems
Helps computer groups learn and adapt together.
Learning Soft Robotic Dynamics with Active Exploration
Robotics
Teaches soft robots to learn any new task.
Beyond Relevance: An Adaptive Exploration-Based Framework for Personalized Recommendations
Information Retrieval
Shows you new, interesting things you'll like.