On the use of graph models to achieve individual and group fairness
By: Arturo Pérez-Peralta, Sandra Benítez-Peña, Rosa E. Lillo
Machine Learning algorithms are ubiquitous in key decision-making contexts such as justice, healthcare, and finance, which has spawned a great demand for fairness in these procedures. However, the theoretical properties of such models in relation to fairness are still poorly understood, and intuition about the relationship between group and individual fairness is still lacking. In this paper, we provide a theoretical framework based on Sheaf Diffusion that leverages tools from dynamical systems and homology to model fairness. Concretely, the proposed method projects input data into a bias-free space that encodes fairness constraints, resulting in fair solutions. Furthermore, we present a collection of network topologies handling different fairness metrics, leading to a unified method capable of dealing with both individual and group bias. The resulting models have a layer of interpretability in the form of closed-form expressions for their SHAP values, consolidating their place in the responsible Artificial Intelligence landscape. Finally, these intuitions are tested on a simulation study and standard fairness benchmarks, where the proposed methods achieve satisfactory results. More concretely, the paper showcases the performance of the proposed models in terms of accuracy and fairness, studying available trade-offs on the Pareto frontier, checking the effects of changing the different hyper-parameters, and delving into the interpretation of their outputs.
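As a rough intuition for what "projecting input data into a bias-free space" can mean, the minimal sketch below removes the component of each feature vector along a hypothetical bias direction (the mean difference between two protected groups). This is a simple linear-debiasing analogue, not the paper's sheaf-diffusion construction; all variable names and the toy data are assumptions for illustration.

```python
import numpy as np

# Toy data: feature matrix X and a binary sensitive attribute s.
# (Hypothetical example, NOT the paper's method or data.)
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
s = rng.integers(0, 2, size=100)

# A simple "bias direction": difference of group means, normalized.
v = X[s == 1].mean(axis=0) - X[s == 0].mean(axis=0)
v = v / np.linalg.norm(v)

# Project every row onto the orthogonal complement of v, i.e. a
# subspace where this linear fairness constraint is satisfied.
X_fair = X - np.outer(X @ v, v)

# After projection, the group means no longer differ along v.
gap = (X_fair[s == 1].mean(axis=0) - X_fair[s == 0].mean(axis=0)) @ v
print(abs(gap) < 1e-10)  # True
```

The projection is idempotent, so re-applying it leaves `X_fair` unchanged; sheaf diffusion can be viewed as a richer, dynamics-based way of reaching such constraint-satisfying representations.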
Similar Papers
On the Robustness of Fairness Practices: A Causal Framework for Systematic Evaluation
Software Engineering
Makes computer decisions fair for everyone.
Fairness-Aware Graph Representation Learning with Limited Demographic Information
Machine Learning (CS)
Makes AI fairer even with secret data.