Is open robotics innovation a threat to international peace and security?
By: Ludovic Righetti, Vincent Boulanin
Potential Business Impact:
Guides robotics organizations in managing the dual-use risks of openly shared research, software, and hardware.
Open access to publications, software, and hardware is central to robotics: it lowers barriers to entry, supports reproducible science, and accelerates the development of reliable systems. However, openness also exacerbates the dual-use risks inherent in robotics research and innovation, as it lowers the barriers for states and non-state actors to develop and deploy robotic systems for military use and other harmful purposes. Compared with other fields of engineering where dual-use risks are present, such as those underlying the development of weapons of mass destruction (chemical, biological, radiological, and nuclear weapons) and even AI, robotics has no specific regulation and little guidance on how research and innovation may be conducted and disseminated responsibly. While other fields can serve as guides, robotics has its own needs and specificities that must be taken into account. The robotics community should therefore work toward its own set of sector-specific guidance and possibly regulation. To that end, we propose a roadmap focusing on four practices: a) education in responsible robotics; b) incentivizing risk assessment; c) moderating the diffusion of high-risk material; and d) developing red lines.
Similar Papers
A roadmap for AI in robotics
Robotics
Robots learn to do more jobs safely.
Safety is Essential for Responsible Open-Ended Systems
Artificial Intelligence
AI learns new things but can become unpredictable.
Software Engineering for Self-Adaptive Robotics: A Research Agenda
Software Engineering
Makes robots learn and fix themselves.