Continuous Fairness On Data Streams
By: Subhodeep Ghosh, Zhihui Du, Angela Bonifati, and more
Potential Business Impact:
Makes computer decisions fair for everyone, all the time.
We study the problem of enforcing continuous group fairness over sliding windows in data streams. We propose a novel fairness model that ensures group fairness at a finer granularity, referred to as a block, within each sliding window. This formulation is particularly useful when the window size is large, since window-level guarantees alone can then be too coarse. Within this framework, we address two key challenges: efficiently monitoring whether each sliding window satisfies block-level group fairness, and reordering the current window as effectively as possible when fairness is violated. To enable real-time monitoring, we design sketch-based data structures that maintain attribute distributions with minimal overhead. We also develop optimal, efficient algorithms for the reordering task, supported by rigorous theoretical guarantees. Our evaluation on four real-world streaming scenarios demonstrates the practical effectiveness of our approach: we achieve millisecond-level processing and an average throughput of approximately 30,000 queries per second, depending on system parameters. The stream reordering algorithm improves block-level group fairness by up to 95% in certain cases, and by 50-60% on average across datasets. A qualitative study further highlights the advantages of block-level fairness over window-level fairness.
Similar Papers
Stream-Based Monitoring of Algorithmic Fairness
Machine Learning (CS)
Checks if computer decisions are fair to everyone.
Fair Clustering in the Sliding Window Model
Data Structures and Algorithms
Makes computer groups fairer, even with changing data.
On the use of graph models to achieve individual and group fairness
Machine Learning (Stat)
Makes computer decisions fair for everyone.