FSL-BDP: Federated Survival Learning with Bayesian Differential Privacy for Credit Risk Modeling
By: Sultan Amed, Tanmay Sen, Sayantan Banerjee
Potential Business Impact:
Helps banks predict loan defaults without sharing private data.
Credit risk models are a critical decision-support tool for financial institutions, yet tightening data-protection rules (e.g., GDPR, CCPA) increasingly prohibit cross-border sharing of borrower data, even as these models benefit from cross-institution learning. Traditional default prediction suffers from two limitations: binary classification ignores default timing, treating early defaulters (high loss) the same as late defaulters (low loss), and centralized training violates emerging regulatory constraints. We propose a Federated Survival Learning framework with Bayesian Differential Privacy (FSL-BDP) that models time-to-default trajectories without centralizing sensitive data. The framework provides Bayesian (data-dependent) differential privacy (DP) guarantees while enabling institutions to jointly learn risk dynamics. Experiments on three real-world credit datasets (LendingClub, SBA, Bondora) show that federation fundamentally alters the relative effectiveness of privacy mechanisms. While classical DP outperforms Bayesian DP in centralized settings, the latter benefits substantially more from federation (+7.0% vs. +1.4%), achieving near parity with non-private performance and outperforming classical DP for the majority of participating clients. This ranking reversal yields a key decision-support insight: privacy mechanisms should be selected and evaluated in the target deployment architecture, not on centralized benchmarks. These findings provide actionable guidance for practitioners designing privacy-preserving decision support systems in regulated, multi-institutional environments.
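To make the federated-with-DP setup concrete, the sketch below shows one round of a generic DP federated-averaging scheme: each client's model update is norm-clipped, the clipped updates are averaged, and calibrated Gaussian noise is added before the global model is updated. This is a minimal illustration of the general pattern only, not the paper's FSL-BDP mechanism (which uses Bayesian, data-dependent accounting); all function names and parameters here (`clip_update`, `dp_fedavg_round`, `clip_norm`, `noise_mult`) are illustrative assumptions.

```python
import numpy as np

def clip_update(update, clip_norm):
    """Scale a client update so its L2 norm is at most clip_norm."""
    norm = np.linalg.norm(update)
    if norm > clip_norm and norm > 0:
        return update * (clip_norm / norm)
    return update

def dp_fedavg_round(global_w, client_updates, clip_norm=1.0,
                    noise_mult=0.5, rng=None):
    """One illustrative DP federated-averaging round.

    Clips each client's update, averages the clipped updates, and adds
    Gaussian noise scaled by noise_mult * clip_norm / n_clients, so the
    server never sees an individual client's raw update.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_mult * clip_norm / len(clipped),
                       size=avg.shape)
    return global_w + avg + noise

# Toy usage: 5 simulated institutions send large raw updates;
# clipping bounds each one's influence before noisy aggregation.
rng = np.random.default_rng(42)
global_w = np.zeros(4)
updates = [rng.normal(size=4) * 10 for _ in range(5)]
new_w = dp_fedavg_round(global_w, updates, clip_norm=1.0,
                        noise_mult=0.1, rng=np.random.default_rng(0))
```

The abstract's point that mechanism rankings can flip under federation is about exactly this kind of pipeline: the per-round noise scale (here `noise_mult * clip_norm / n_clients`) shrinks as more clients participate, so mechanisms interact with the deployment architecture, not just the privacy budget.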
Similar Papers
Differentially private federated learning for localized control of infectious disease dynamics
Machine Learning (CS)
Helps predict disease spread without sharing private data.
An Interactive Framework for Implementing Privacy-Preserving Federated Learning: Experiments on Large Language Models
Machine Learning (CS)
Protects private data while training smart computer programs.
Differentially Private Federated Learning With Time-Adaptive Privacy Spending
Machine Learning (CS)
Learns more from private data, faster.