Data Bias in Human Mobility is a Universal Phenomenon but is Highly Location-specific
By: Katinka den Nijs, Elisa Omodei, Vedran Sekara
Potential Business Impact:
Shows that smartphone location data over-represents wealthier, more educated groups, which can skew analyses and decisions built on it.
Large-scale human mobility datasets play increasingly critical roles in many algorithmic systems, business processes, and policy decisions. Unfortunately, there has been little focus on understanding bias and other fundamental shortcomings of these datasets and how they impact downstream analyses and prediction tasks. In this work, we study 'data production', quantifying not only whether individuals are represented in big digital datasets, but also how they are represented in terms of how much data they produce. We study GPS mobility data collected from anonymized smartphones in ten major US cities and find that data points can be more unequally distributed between users than wealth. We build models to predict the number of data points a census tract can be expected to produce, given the composition of demographic groups living in it, and find strong effects of wealth, ethnicity, and education on data production. While we find that bias is a universal phenomenon, occurring in all cities, each city suffers from its own manifestation of it, and location-specific models are required to capture it. This work raises serious questions about general approaches to debiasing human mobility data and urges further research.
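The two quantitative steps the abstract describes, measuring how unequally data points are spread across users and modeling expected data production from tract-level demographics, can be illustrated with a minimal sketch. This is not the authors' pipeline: the Gini coefficient as the inequality measure, the synthetic counts, the tract-level column names, and the use of a plain linear regression are all assumptions made for illustration.

```python
# Minimal sketch (not the paper's code) of the two measurements described above:
# (1) inequality of data points across users, via a Gini coefficient, and
# (2) a simple regression predicting data points per census tract from its
#     demographic composition. All data and column names here are synthetic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)


def gini(values: np.ndarray) -> float:
    """Gini coefficient of a non-negative array (0 = perfectly equal, 1 = maximally unequal)."""
    v = np.sort(np.asarray(values, dtype=float))
    n = v.size
    cum = np.cumsum(v)
    return (n + 1 - 2 * np.sum(cum) / cum[-1]) / n


# (1) Inequality of data production across users (synthetic heavy-tailed counts).
points_per_user = rng.lognormal(mean=2.0, sigma=1.5, size=10_000)
print(f"Gini of data points per user: {gini(points_per_user):.2f}")

# (2) Predict expected data points per census tract from demographic shares
#     (hypothetical tract-level features; real inputs would come from census data).
tracts = pd.DataFrame({
    "share_high_income": rng.random(500),
    "share_minority": rng.random(500),
    "share_college_degree": rng.random(500),
})
# Synthetic target: data points per tract (in practice, aggregated GPS pings).
y = 1_000 * (0.5 + tracts["share_high_income"] + 0.8 * tracts["share_college_degree"])

model = LinearRegression().fit(tracts, y)
for name, coef in zip(tracts.columns, model.coef_):
    print(f"{name}: {coef:+.1f} expected data points per unit share")
```

Comparing the per-user Gini coefficient with published wealth Ginis, and inspecting which demographic coefficients dominate in each city, mirrors the kind of comparison the abstract reports at a high level.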
Similar Papers
A systematic machine learning approach to measure and assess biases in mobile phone population data
Applications
Uses machine learning to measure and assess biases in mobile phone population data.
You Don't Have to Live Next to Me: Towards Demobilizing Individualistic Bias in Computational Approaches to Urban Segregation
Computers and Society
Questions the individual-level bias built into computational measures of urban segregation.
Mapping Socio-Economic Divides with Urban Mobility Data
Physics and Society
Shows how urban mobility data, such as bike trips, reveals socio-economic divides between neighborhoods.