Federated Learning with zk-SNARKs
Step 1: Data Distribution
Training data stays distributed across multiple devices or servers. Each participant holds its own local dataset, which preserves privacy and removes the need for centralized storage.
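As a minimal sketch of this step, the snippet below simulates three participants by splitting a synthetic dataset into disjoint local shards; in a real deployment each shard would already live on a separate device and would never be pooled like this. The use of NumPy, the client count, and the data shapes are illustrative assumptions, not part of the description above.

import numpy as np

# Hypothetical simulation: three participants, each holding a private
# shard of (features, labels).
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 10))                      # 300 samples, 10 features
y = (X @ rng.normal(size=10) > 0).astype(float)     # synthetic binary labels

num_clients = 3
local_datasets = [
    (X[i::num_clients], y[i::num_clients])          # disjoint shard per client
    for i in range(num_clients)
]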
Step 2: Local Model Training
Each participant trains a copy of the shared model on its own local dataset. Raw data never leaves the device, so privacy is maintained.
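Continuing the sketch above, each participant might run a few steps of plain gradient descent on a logistic-regression model over its own shard and report only the resulting weight update. The model, loss, learning rate, and step count are all assumptions made for illustration.

# Local training on one client's shard: the raw data is used only inside
# this function; only the weight update (delta) is returned.
def local_train(weights, X_local, y_local, lr=0.1, steps=20):
    w = weights.copy()
    for _ in range(steps):
        preds = 1.0 / (1.0 + np.exp(-(X_local @ w)))            # sigmoid
        grad = X_local.T @ (preds - y_local) / len(y_local)     # logistic-loss gradient
        w -= lr * grad
    return w - weights                                          # model update only

global_weights = np.zeros(10)
local_updates = [local_train(global_weights, Xc, yc) for Xc, yc in local_datasets]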
Step 3: zk-SNARK Proof Generation
After local training, each participant generates a zk-SNARK proof attesting that its model update was computed correctly from its local data, without revealing that data.
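The proof step is sketched below only as a hypothetical interface: Prover, Proof, and prove are placeholders standing in for a real proving system (for example, the training step compiled into an arithmetic circuit with a zk-SNARK toolchain), not an actual library API.

from dataclasses import dataclass

@dataclass
class Proof:
    blob: bytes            # opaque, succinct proof in a real system

class Prover:
    def prove(self, public_inputs: dict, private_witness: dict) -> Proof:
        # Placeholder: a real prover would execute the training circuit over
        # the private witness and emit a proof that the public update is
        # consistent with it, without disclosing the witness.
        return Proof(blob=b"")

prover = Prover()
proofs = [
    prover.prove(
        public_inputs={"old_model": global_weights, "update": upd},
        private_witness={"local_data": (Xc, yc)},   # never leaves the device
    )
    for upd, (Xc, yc) in zip(local_updates, local_datasets)
]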
Step 4: Aggregation and Verification
The central server collects model updates and zk-SNARK proofs. It verifies the proofs to ensure update integrity without learning private information about local datasets.
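A matching placeholder illustrates the server side: it checks each proof against the public inputs and keeps only updates whose proofs verify, pairing each with its client's sample count for the aggregation step. The Verifier class is again a stand-in, not a real API.

class Verifier:
    def verify(self, proof: Proof, public_inputs: dict) -> bool:
        # Placeholder: a real verifier checks the proof quickly and learns
        # nothing about the client's data beyond the public inputs.
        return True

verifier = Verifier()
verified = [
    (upd, len(yc))                                   # update + local sample count
    for upd, (Xc, yc), proof in zip(local_updates, local_datasets, proofs)
    if verifier.verify(proof, {"old_model": global_weights, "update": upd})
]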
Step 5: Global Model Update
After all proofs verify, the central server aggregates the model updates (for example, by weighted averaging) into a new global model, which is then distributed back to all participants for the next round.
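Finishing the sketch, the verified updates can be combined by a FedAvg-style weighted average, with weights proportional to local dataset size; this weighting rule is an assumption, since the text above does not specify the aggregation scheme.

# Weighted average of verified updates, then apply to the global model.
total = sum(n for _, n in verified)
global_update = sum((n / total) * upd for upd, n in verified)
global_weights = global_weights + global_update
# global_weights would now be broadcast to every participant as the
# starting point for the next federated round.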