Who is responsible? Social Identity, Robot Errors and Blame Attribution
By: Samantha Stedtler, Marianna Leventi
Potential Business Impact:
Robot errors may lead to unfair blame, especially of already disadvantaged groups.
This paper argues that conventional blame practices fall short of capturing the complexity of moral experiences, neglecting power dynamics and discriminatory social practices. Robots that embody roles linked to specific social groups risk reinforcing stereotypes of how these groups behave or should behave, thereby setting both a normative and a descriptive standard. In addition, we argue that faulty robots might create expectations of who is supposed to compensate and repair after their errors, so that already disadvantaged social groups might be blamed disproportionately if they do not act according to their ascribed roles. This theoretical and empirical gap becomes even more urgent to address given indications of potential carryover effects from Human-Robot Interaction (HRI) to Human-Human Interaction (HHI). We therefore urge roboticists and designers to engage in an ongoing conversation about how social traits are conceptualized and implemented in this technology. We also argue that one solution could be to 'embrace the glitch' and to focus on constructively disrupting practices instead of prioritizing efficiency and smoothness of interaction above everything else. Beyond considering ethical aspects in the design phase of social robots, we see our analysis as a call for more research on the consequences of robot stereotyping and blame attribution.
Similar Papers
Rude Humans and Vengeful Robots: Examining Human Perceptions of Robot Retaliatory Intentions in Professional Settings
Robotics
Examines how people perceive robots that retaliate against rude coworkers.
A Comprehensive Framework to Operationalize Social Stereotypes for Responsible AI Evaluations
Computers and Society
A framework for measuring social stereotypes in responsible AI evaluations.
From Framework to Reliable Practice: End-User Perspectives on Social Robots in Public Spaces
Robotics
End-user views on making public-space robots safe, private, and fair.