Fair Regression under Demographic Parity: A Unified Framework
By: Yongzhen Feng, Weiwei Wang, Raymond K. W. Wong, and more
We propose a unified framework for fair regression tasks formulated as risk minimization problems subject to a demographic parity constraint. Unlike many existing approaches that are limited to specific loss functions or rely on challenging non-convex optimization, our framework is applicable to a broad spectrum of regression tasks. Examples include linear regression with squared loss, binary classification with cross-entropy loss, quantile regression with pinball loss, and robust regression with Huber loss. We derive a novel characterization of the fair risk minimizer, which yields a computationally efficient estimation procedure for general loss functions. Theoretically, we establish the asymptotic consistency of the proposed estimator and derive its convergence rates under mild assumptions. We illustrate the method's versatility through detailed discussions of several common loss functions. Numerical results demonstrate that our approach effectively minimizes risk while satisfying fairness constraints across various regression settings.
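To make the setup concrete, the following is a minimal sketch of a demographic-parity-constrained risk minimization problem of the kind the abstract describes. The notation here (predictor f, features X, response Y, sensitive attribute S, loss ℓ, quantile level τ, Huber threshold δ, logistic sigmoid σ) is our own shorthand; the paper's exact formulation, function class, and any relaxation of the constraint may differ.

```latex
% Minimal sketch in assumed notation (standard formulation, not necessarily the paper's exact one).
% Requires amsmath and amssymb.
\[
\min_{f \in \mathcal{F}} \ \mathbb{E}\bigl[\ell\bigl(Y, f(X)\bigr)\bigr]
\quad \text{subject to} \quad
\Pr\bigl(f(X) \le t \mid S = s\bigr) = \Pr\bigl(f(X) \le t\bigr)
\quad \text{for all } t \text{ and all groups } s,
\]
% i.e., the distribution of the prediction f(X) does not depend on the sensitive attribute S.
% Losses the abstract lists as examples (\mathbf{1} is the indicator, \sigma the logistic sigmoid):
\[
\ell_{\mathrm{sq}}(y,u) = (y-u)^2,
\qquad
\ell_{\tau}(y,u) = (y-u)\bigl(\tau - \mathbf{1}\{y < u\}\bigr),
\]
\[
\ell_{\delta}(y,u) =
\begin{cases}
\tfrac{1}{2}(y-u)^2, & |y-u| \le \delta,\\[2pt]
\delta\bigl(|y-u| - \tfrac{1}{2}\delta\bigr), & \text{otherwise},
\end{cases}
\qquad
\ell_{\mathrm{ce}}(y,u) = -\,y \log \sigma(u) - (1-y)\log\bigl(1-\sigma(u)\bigr).
\]
```

The squared, pinball, Huber, and cross-entropy losses above correspond, respectively, to the linear regression, quantile regression, robust regression, and binary classification settings mentioned in the abstract.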