Score: 2

Breaking the Dyadic Barrier: Rethinking Fairness in Link Prediction Beyond Demographic Parity

Published: November 9, 2025 | arXiv ID: 2511.06568v2

By: João Mattos, Debolina Halder Lina, Arlei Silva

Potential Business Impact:

Improves the fairness of graph-based recommendations (e.g., social suggestions and knowledge graph completion) by detecting and mitigating subgroup-level bias that standard dyadic fairness checks can miss.

Business Areas:
Predictive Analytics, Artificial Intelligence, Data and Analytics, Software

Link prediction is a fundamental task in graph machine learning, with applications ranging from social recommendation to knowledge graph completion. Fairness in this setting is critical, as biased predictions can exacerbate societal inequalities. Prior work adopts a dyadic definition of fairness, enforcing demographic parity between intra-group and inter-group link predictions. However, we show that this dyadic framing can obscure underlying disparities across subgroups, allowing systemic biases to go undetected. Moreover, we argue that demographic parity does not satisfy the properties desirable for fairness assessment in ranking-based tasks such as link prediction. We formalize the limitations of existing fairness evaluations and propose a framework that enables a more expressive assessment. Additionally, we propose a lightweight post-processing method combined with decoupled link predictors that effectively mitigates bias and achieves state-of-the-art fairness-utility trade-offs.
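To make the abstract's central point concrete, below is a minimal Python sketch (not the authors' implementation; the function names and toy scores are illustrative assumptions) contrasting the dyadic intra-vs-inter-group parity gap with a finer, per-subgroup-pair view. In the toy data the dyadic gap is exactly zero, yet one subgroup pair is scored far lower than another, showing how the aggregate can hide subgroup disparities.

```python
import numpy as np

def dyadic_parity_gap(scores, groups):
    """Dyadic demographic parity: gap between the mean predicted score
    of intra-group pairs and that of inter-group pairs."""
    intra = [s for s, (u, v) in zip(scores, groups) if u == v]
    inter = [s for s, (u, v) in zip(scores, groups) if u != v]
    return abs(np.mean(intra) - np.mean(inter))

def subgroup_pair_means(scores, groups):
    """Finer-grained view: mean predicted score for every unordered pair
    of subgroups, so disparities between specific subgroups stay visible."""
    buckets = {}
    for s, (u, v) in zip(scores, groups):
        key = tuple(sorted((u, v)))
        buckets.setdefault(key, []).append(s)
    return {k: float(np.mean(v)) for k, v in buckets.items()}

# Toy example (illustrative numbers only): the dyadic gap is 0.0,
# yet subgroup pair (0, 1) is scored 0.4 lower than pair (0, 2).
scores = [0.8, 0.8, 0.8, 0.6, 1.0, 0.8]
groups = [(0, 0), (1, 1), (2, 2), (0, 1), (0, 2), (1, 2)]

print("dyadic gap:", dyadic_parity_gap(scores, groups))
print("per-subgroup-pair means:", subgroup_pair_means(scores, groups))
```

A subgroup-aware audit of this kind is in the spirit of the paper's argument; the paper's actual evaluation framework and its ranking-based criteria are more expressive than this score-gap illustration.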

Country of Origin
🇺🇸 United States

Repos / Data Links

Page Count
14 pages

Category
Computer Science: Machine Learning (CS)