Diversity and Inclusion in AI: Insights from a Survey of AI/ML Practitioners
By: Sidra Malik, Muneera Bano, Didar Zowghi
Potential Business Impact:
Makes AI fair and trustworthy for everyone.
Growing awareness of social biases and inequalities embedded in Artificial Intelligence (AI) systems has brought increased attention to the integration of Diversity and Inclusion (D&I) principles throughout the AI lifecycle. Despite the rise of ethical AI guidelines, there is limited empirical evidence on how D&I is applied in real-world settings. This study explores how AI and Machine Learning (ML) practitioners perceive and implement D&I principles and identifies organisational challenges that hinder their effective adoption. Using a mixed-methods approach, we surveyed industry professionals, collecting both quantitative and qualitative data on current practices, perceived impacts, and challenges related to D&I in AI. While most respondents recognise D&I as essential for mitigating bias and enhancing fairness, practical implementation remains inconsistent. Our analysis reveals a disconnect between perceived benefits and current practices, with major barriers including the under-representation of marginalised groups, a lack of organisational transparency, and limited awareness among early-career professionals. Despite these barriers, respondents widely agree that diverse teams contribute to ethical, trustworthy, and innovative AI systems. By pinpointing the key pain points and areas requiring improvement, this study highlights the need to bridge the gap between D&I principles and real-world AI development practices.
Similar Papers
Understanding Ethical Practices in AI: Insights from a Cross-Role, Cross-Region Survey of AI Development Teams
Computers and Society
Helps make AI safer and more fair.
Enduring Disparities in the Workplace: A Pilot Study in the AI Community
Computers and Society
Makes AI fairer for everyone at work.
AI in Support of Diversity and Inclusion
Artificial Intelligence
Makes AI understand everyone, not just some people.