FedPoP: Federated Learning Meets Proof of Participation

Published: November 11, 2025 | arXiv ID: 2511.08207v1

By: Devriş İşler, Elina van Kempen, Seoyeon Hwang, and more

Potential Business Impact:

Proves you helped train a computer model privately.

Business Areas:
Peer to Peer Collaboration

Federated learning (FL) offers privacy-preserving, distributed machine learning, allowing clients to contribute to a global model without revealing their local data. As models increasingly serve as monetizable digital assets, the ability to prove participation in their training becomes essential for establishing ownership. In this paper, we address this emerging need by introducing FedPoP, a novel FL framework that allows non-linkable proof of participation while preserving client anonymity and privacy, without requiring either extensive computation or a public ledger. FedPoP is designed to integrate seamlessly with existing secure aggregation protocols, ensuring compatibility with real-world FL deployments. We provide a proof-of-concept implementation and an empirical evaluation under realistic client dropouts. In our prototype, FedPoP introduces 0.97 seconds of per-round overhead atop securely aggregated FL and enables a client to prove its participation in training a model held by a third party in 0.0612 seconds. These results indicate that FedPoP is practical for real-world deployments requiring auditable participation without sacrificing privacy.
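To make the "prove participation later" workflow concrete, here is a minimal, hypothetical sketch in Python. It is NOT the FedPoP construction: FedPoP provides non-linkable, anonymous proofs atop secure aggregation, whereas this toy uses a plain HMAC receipt issued by the server, so the names (`issue_receipt`, `verify_receipt`, `SERVER_KEY`) and the overall mechanism are illustrative assumptions only.

```python
import hmac
import hashlib
import secrets

# Hypothetical sketch, not the paper's protocol: the aggregation server
# issues a keyed receipt per training round, and a client later presents
# that receipt to a verifier to demonstrate participation.

SERVER_KEY = secrets.token_bytes(32)  # secret held by the aggregation server

def issue_receipt(round_id: int, client_token: bytes) -> bytes:
    """Server binds a client's token to a specific training round."""
    msg = round_id.to_bytes(4, "big") + client_token
    return hmac.new(SERVER_KEY, msg, hashlib.sha256).digest()

def verify_receipt(round_id: int, client_token: bytes, receipt: bytes) -> bool:
    """A verifier holding the key checks the claimed participation."""
    expected = issue_receipt(round_id, client_token)
    return hmac.compare_digest(expected, receipt)

# A client participates in round 7, storing its receipt...
token = secrets.token_bytes(16)  # stand-in for a client-chosen commitment
receipt = issue_receipt(7, token)

# ...and later proves participation; a receipt for the wrong round fails.
assert verify_receipt(7, token, receipt)
assert not verify_receipt(8, token, receipt)
```

Note that this sketch is linkable (the verifier sees the client's token) and requires the verifier to share the server's key; removing both limitations, cheaply and without a public ledger, is precisely what distinguishes FedPoP's actual cryptographic construction.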

Country of Origin
🇺🇸 United States

Page Count
16 pages

Category
Computer Science:
Cryptography and Security