Dopamine: Differentially Private Federated Learning on Medical Data (AAAI - PPAI)
Easy-to-use utilities to build privacy-preserving AI.
Securing Collaborative Medical AI by Using Differential Privacy
Code for the paper "PUFFLE: Balancing Privacy, Utility, and Fairness in Federated Learning" by L. Corbucci, M. A. Heikkilä, D.S. Noguero, A. Monreale, N. Kourtellis.
Hands-on part of the Federated Learning and Privacy-Preserving ML tutorial given at VISUM 2022
Building an AI model for chest X-rays under patient privacy guarantees
A differentially private spiking neural network with temporal enhanced pooling
A Comparative Study of Gradient Clipping Techniques in Differentially Private Stochastic Gradient Descent (DP-SGD)
In this project we add differential privacy to an open-set recognizer; to implement DP, we use the Opacus library.
Implementation of my research on automating the optimal privacy budget (ε) in DP-FL using an epsilon-aware strategy.
A framework for experimenting with privacy-preserving mechanisms in federated learning. This toolkit enables comparison between local training, standard federated learning, feature suppression, and differential privacy approaches. Includes tools for data preparation, model training, result visualization, and privacy-utility tradeoff analysis.
Intrusion Detection with Differential Privacy using Opacus on the UNSW-NB15 dataset
Implementation of Opacus DP in Flower.
Implementation of differentially private SGD; a smaller ε (stronger privacy guarantee) yields lower model accuracy (see the minimal Opacus sketch after this list).
Colosseum-based O-RAN slice resource allocation via Federated Learning & Differential Privacy (Opacus). ClusteredFL · FedProx · DQN · PyTorch
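Several of the repositories above (the DP-SGD, gradient-clipping, and Flower/Opacus implementations in particular) share the same basic pattern: wrap a PyTorch model, optimizer, and data loader with Opacus's `PrivacyEngine` so that each step clips per-sample gradients and adds Gaussian noise. The sketch below illustrates that pattern only; the toy model, data, and hyperparameters are assumptions for demonstration and are not taken from any listed repository.

```python
# Minimal sketch of DP-SGD training with Opacus (toy data/model assumed).
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Illustrative toy dataset and model (assumptions, not from the repos above).
X = torch.randn(256, 20)
y = torch.randint(0, 2, (256,))
train_loader = DataLoader(TensorDataset(X, y), batch_size=32)
model = nn.Sequential(nn.Linear(20, 16), nn.ReLU(), nn.Linear(16, 2))
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss()

# Wrap model, optimizer, and loader so each step clips per-sample gradients
# (max_grad_norm) and adds Gaussian noise (noise_multiplier).
privacy_engine = PrivacyEngine()
model, optimizer, train_loader = privacy_engine.make_private(
    module=model,
    optimizer=optimizer,
    data_loader=train_loader,
    noise_multiplier=1.0,   # larger -> more noise, stronger privacy
    max_grad_norm=1.0,      # per-sample gradient clipping bound
)

# Standard training loop; DP is enforced inside the wrapped optimizer.
for epoch in range(3):
    for xb, yb in train_loader:
        optimizer.zero_grad()
        loss = criterion(model(xb), yb)
        loss.backward()
        optimizer.step()

# Privacy accounting: epsilon spent so far at a fixed delta.
epsilon = privacy_engine.get_epsilon(delta=1e-5)
print(f"(ε = {epsilon:.2f}, δ = 1e-5)")
```

In a federated setting such as the Flower integration listed above, this same wrapping would typically happen inside each client's local training step, with the server aggregating the resulting model updates.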