The User-first Approach to AI Ethics: Preferences for Ethical Principles in AI Systems across Cultures and Contexts
By: Benjamin J. Carroll, Jianlong Zhou, Paul F. Burke, and more
Potential Business Impact:
Helps make AI fair for everyone, everywhere.
As AI systems increasingly permeate everyday life, designers and developers face mounting pressure to balance innovation with ethical design choices. To date, the operationalisation of AI ethics has predominantly depended on frameworks that prescribe which ethical principles should be embedded within AI systems. However, the extent to which users value these principles remains largely unexplored in the existing literature. In a discrete choice experiment conducted in four countries, we quantify user preferences for 11 ethical principles. Our findings indicate that, while users generally prioritise privacy, justice & fairness, and transparency, their preferences vary significantly with culture and application context. Latent class analysis further reveals four distinct user cohorts, the largest of which is ethically disengaged and defers to regulatory oversight. Our findings offer (1) empirical evidence of uneven user prioritisation of AI ethics principles, (2) actionable guidance for operationalising ethics tailored to culture and context, (3) support for the development of robust regulatory mechanisms, and (4) a foundation for advancing a user-centred approach to AI ethics, motivated independently of abstract moral theory.
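To make the method concrete: in a discrete choice experiment, respondents repeatedly choose between hypothetical AI systems that differ in which ethical principles they emphasise, and a choice model recovers the implied weight (part-worth) of each principle. The sketch below is a minimal illustration of that idea using a conditional logit fitted to synthetic data; the attribute count, sample sizes, and coefficient values are assumptions chosen for brevity, not values from the paper (which covers 11 principles across four countries).

```python
# Minimal sketch of a conditional logit for a discrete choice experiment.
# All quantities here (n_choice_sets, n_attrs, true_beta, ...) are
# illustrative assumptions, not figures taken from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

n_choice_sets = 500   # hypothetical number of choice tasks
n_alternatives = 3    # alternatives shown per task
n_attrs = 4           # e.g. privacy, fairness, transparency, accountability

# X[s, j, k] = 1 if alternative j in choice set s emphasises principle k
X = rng.integers(0, 2, size=(n_choice_sets, n_alternatives, n_attrs)).astype(float)

# Simulate choices under assumed "true" part-worths plus Gumbel noise,
# which is the standard random-utility formulation behind the logit model.
true_beta = np.array([1.2, 0.8, 0.6, 0.2])
utility = X @ true_beta + rng.gumbel(size=(n_choice_sets, n_alternatives))
choice = utility.argmax(axis=1)   # index of chosen alternative per set

def neg_log_likelihood(beta):
    v = X @ beta                                # deterministic utilities
    v -= v.max(axis=1, keepdims=True)           # numerical stability
    log_probs = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(n_choice_sets), choice].sum()

result = minimize(neg_log_likelihood, x0=np.zeros(n_attrs), method="BFGS")
print("estimated part-worths:", np.round(result.x, 2))
```

Latent class analysis, as used in the paper, extends this setup by estimating a separate coefficient vector per class together with class-membership probabilities, which is how distinct user cohorts such as the ethically disengaged group can surface.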
Similar Papers
Do Ethical AI Principles Matter to Users? A Large-Scale Analysis of User Sentiment and Satisfaction
Human-Computer Interaction
Makes AI more liked by people using it.
Towards Aligning Personalized Conversational Recommendation Agents with Users' Privacy Preferences
Human-Computer Interaction
AI learns your privacy rules to protect you.