Differential Privacy-Driven Framework for Enhancing Heart Disease Prediction
By: Yazan Otoum, Amiya Nayak
Potential Business Impact:
Keeps patient health data private while detecting heart disease.
With the rapid digitalization of healthcare systems, there has been a substantial increase in the generation and sharing of private health data. Safeguarding patient information is essential for maintaining consumer trust and ensuring compliance with legal data protection regulations. Machine learning is critical in healthcare, supporting personalized treatment, early disease detection, predictive analytics, image interpretation, drug discovery, efficient operations, and patient monitoring. It enhances decision-making, accelerates research, reduces errors, and improves patient outcomes. In this paper, we combine machine learning with privacy-preserving techniques, namely differential privacy and federated learning, to develop models that enable healthcare stakeholders to extract insights without compromising individual privacy. Differential privacy introduces calibrated noise to guarantee statistical privacy, while federated learning enables collaborative model training across decentralized datasets without sharing raw records. We apply these technologies to Heart Disease Data, demonstrating how they preserve privacy while delivering valuable insights and comprehensive analysis. Our results show that a federated learning model with differential privacy achieved a test accuracy of 85% while keeping patient data secure and private throughout training.
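The combination described above, clipping each client's model update and adding calibrated noise before the server averages them, can be sketched as follows. This is a minimal illustration assuming a Gaussian-mechanism style of noise on clipped updates; the function names, clipping norm, and noise multiplier are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def clip_and_noise(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a client's model update to a max L2 norm, then add Gaussian noise.

    Clipping bounds each client's influence; the noise provides the
    differential-privacy guarantee (Gaussian mechanism).
    """
    rng = rng or np.random.default_rng(0)
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise

def federated_average(client_updates, clip_norm=1.0, noise_multiplier=1.1, seed=0):
    """DP federated averaging: noise each clipped update, then average.

    Only model updates (never raw patient records) leave the clients.
    """
    rng = np.random.default_rng(seed)
    noised = [clip_and_noise(u, clip_norm, noise_multiplier, rng)
              for u in client_updates]
    return np.mean(noised, axis=0)

# Toy example: three hospitals' gradient updates for a 4-parameter model.
updates = [np.array([0.5, -0.2, 0.1, 0.3]),
           np.array([0.4, -0.1, 0.0, 0.2]),
           np.array([0.6, -0.3, 0.2, 0.4])]
avg = federated_average(updates)
```

In practice the noise multiplier is chosen to meet a target privacy budget (epsilon, delta), and averaging over more clients dilutes the per-client noise, which is why federated DP training can still reach useful accuracy.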
Similar Papers
Differential Privacy for Deep Learning in Medicine
Machine Learning (CS)
Keeps patient data safe while training AI.
Differential Privacy for Secure Machine Learning in Healthcare IoT-Cloud Systems
Cryptography and Security
Faster, private medical help using smart devices.
Personalized Federated Training of Diffusion Models with Privacy Guarantees
Machine Learning (CS)
Creates private, fair data for AI.