Recovering Fairness Directly from Modularity: a New Way for Fair Community Partitioning
By: Yufeng Wang, Yiguang Bai, Tianqing Zhu, and more
Potential Business Impact:
Helps divide people in a network into communities that stay fair to protected groups.
Community partitioning is crucial in network analysis, with modularity optimization being the prevailing technique. However, traditional modularity-based methods often overlook fairness, a critical aspect in real-world applications. To address this, we introduce protected group networks and propose a novel fairness-modularity metric. This metric extends traditional modularity by explicitly incorporating fairness, and we prove that minimizing it yields naturally fair partitions for protected groups while maintaining theoretical soundness. We develop a general optimization framework for fairness partitioning and design the efficient Fair Fast Newman (FairFN) algorithm, enhancing the Fast Newman (FN) method to optimize both modularity and fairness. Experiments show FairFN achieves significantly improved fairness and high-quality partitions compared to state-of-the-art methods, especially on unbalanced datasets.
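The abstract describes extending modularity with an explicit fairness term and optimizing both jointly. The sketch below illustrates the general idea in Python; the paper's actual fairness-modularity metric and the FairFN merge rules are not given in the abstract, so the balance-style penalty, the `lam` weight, and all function names here are illustrative assumptions, not the authors' formulation.

```python
# Illustrative sketch only: standard Newman modularity plus an assumed
# balance-style fairness penalty, combined into one objective. The paper's
# exact fairness-modularity metric is not specified in the abstract.
from collections import defaultdict

def modularity(adj, communities):
    """Standard modularity Q for an undirected graph.
    adj: dict node -> set of neighbours; communities: dict node -> label."""
    m2 = sum(len(nbrs) for nbrs in adj.values())  # 2m: each edge counted twice
    q = 0.0
    for i, nbrs in adj.items():
        for j in adj:
            if communities[i] == communities[j]:
                a_ij = 1.0 if j in nbrs else 0.0
                q += a_ij - len(adj[i]) * len(adj[j]) / m2
    return q / m2

def fairness_penalty(groups, communities):
    """Assumed fairness measure: average deviation of each community's
    protected-group mix from the global mix (0 = perfectly balanced)."""
    global_counts = defaultdict(int)
    for g in groups.values():
        global_counts[g] += 1
    n = len(groups)
    members = defaultdict(list)
    for node, c in communities.items():
        members[c].append(node)
    penalty = 0.0
    for nodes in members.values():
        counts = defaultdict(int)
        for node in nodes:
            counts[groups[node]] += 1
        for g, total in global_counts.items():
            penalty += abs(counts[g] / len(nodes) - total / n)
    return penalty / max(len(members), 1)

def fairness_modularity(adj, groups, communities, lam=1.0):
    """Illustrative combined objective: modularity minus a weighted penalty."""
    return modularity(adj, communities) - lam * fairness_penalty(groups, communities)

# Toy usage: a 4-node path graph, two protected groups, two candidate partitions.
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
groups = {0: "A", 1: "B", 2: "A", 3: "B"}
balanced = {0: 0, 1: 0, 2: 1, 3: 1}    # each community mixes both groups
segregated = {0: 0, 2: 0, 1: 1, 3: 1}  # communities split along group lines
print(fairness_modularity(adj, groups, balanced))    # higher: good structure, fair mix
print(fairness_modularity(adj, groups, segregated))  # lower: penalized for imbalance
```

A greedy, Fast-Newman-style algorithm would repeatedly merge the pair of communities that most improves such a combined objective; the specifics of the authors' FairFN algorithm are in the paper itself.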
Similar Papers
Quantifying Group Fairness in Community Detection
Social and Information Networks
Finds unfairness in group networks, helps fix it.
MOUFLON: Multi-group Modularity-based Fairness-aware Community Detection
Social and Information Networks
Finds fair groups in online friend networks.
A Deep Latent Factor Graph Clustering with Fairness-Utility Trade-off Perspective
Machine Learning (CS)
Divides groups fairly and accurately.