Federated Learning with zk-SNARKs

[Diagram: Data remains on individual devices]

Step 1: Data Distribution

Data stays on multiple devices or servers. Each participant has their own local dataset, ensuring privacy and reducing centralized storage needs.
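As a rough sketch, this distribution can be modeled by splitting a dataset into disjoint per-device shards. The `partition` helper and the 3-device split below are illustrative assumptions, not part of any specific framework:

```python
import random

def partition(dataset, num_clients, seed=0):
    """Shuffle a dataset and split it into disjoint shards, one per
    client. Each shard stays on its own device; nothing is pooled."""
    rng = random.Random(seed)
    data = list(dataset)
    rng.shuffle(data)
    # round-robin split keeps shard sizes balanced
    return [data[i::num_clients] for i in range(num_clients)]

# 12 toy examples spread across 3 devices
shards = partition(range(12), 3)
```

In a real deployment the data is never partitioned by anyone; it originates on the devices. The helper only reproduces that situation for simulation.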

[Diagram: Local model training on each device]

Step 2: Local Model Training

Each participant trains a machine learning model on their local dataset. This keeps raw data on the device, maintaining privacy.
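A minimal sketch of one client's local step, assuming a toy one-parameter linear model y = w * x trained with SGD on squared error (the model, learning rate, and epoch count are illustrative choices, not prescribed by the protocol):

```python
def local_train(w, shard, lr=0.1, epochs=5):
    """SGD on one device's shard for a 1-D linear model y = w * x.
    Only the updated weight w leaves the device; the raw (x, y)
    pairs never do."""
    for _ in range(epochs):
        for x, y in shard:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

# local data generated on-device from the true relation y = 3x
shard = [(0.5, 1.5), (1.0, 3.0), (1.5, 4.5)]
w_new = local_train(0.0, shard)  # approaches 3.0
```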

[Diagram: Each device generates a zk-SNARK proof for its model update]

Step 3: zk-SNARK Proof Generation

After local training, each participant creates a zk-SNARK proof. This proves they've computed their model update correctly from their local data, without revealing that data.
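In practice the proof would come from a zk-SNARK toolchain (for example, circuits written in circom proved with snarkjs, or Rust's arkworks libraries) that proves the training computation inside an arithmetic circuit. The hash-based `prove`/`verify` pair below is only a stand-in that mimics the prove/verify interface: it binds an update to a commitment to the device's data, but it is neither zero-knowledge nor succinct in the SNARK sense.

```python
import hashlib

def prove(old_w, new_w, data_commitment):
    """Toy stand-in for proof generation: deterministically bind the
    claimed update (old_w -> new_w) to a prior commitment to the
    device's data. A real zk-SNARK would instead prove, in zero
    knowledge, that new_w resulted from honest training."""
    msg = f"{old_w:.6f}|{new_w:.6f}|{data_commitment}"
    return hashlib.sha256(msg.encode()).hexdigest()

def verify(proof, old_w, new_w, data_commitment):
    """The server recomputes the binding and checks it matches."""
    return proof == prove(old_w, new_w, data_commitment)

p = prove(0.0, 2.96, "commitment-device-1")
```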

[Diagram: Server aggregates updates and verifies proofs]

Step 4: Aggregation and Verification

The central server collects model updates and zk-SNARK proofs. It verifies the proofs to ensure update integrity without learning private information about local datasets.
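Continuing the toy setup, the server-side step might look like the sketch below: reject any submission whose proof fails, then average the surviving weights (federated averaging). The hash check again stands in for real zk-SNARK verification, and the submission format is an assumption for illustration:

```python
import hashlib

def _bind(old_w, new_w, commitment):
    # same toy binding as the client-side sketch; a real server
    # would run the SNARK verifier here instead
    msg = f"{old_w:.6f}|{new_w:.6f}|{commitment}"
    return hashlib.sha256(msg.encode()).hexdigest()

def aggregate_verified(submissions, old_w):
    """submissions: one (new_w, proof, commitment) tuple per device.
    Drop unverified updates, then average the rest (FedAvg)."""
    valid = [w for w, proof, com in submissions
             if proof == _bind(old_w, w, com)]
    if not valid:
        raise ValueError("no update passed verification")
    return sum(valid) / len(valid)

honest = [(3.0, _bind(0.0, 3.0, "c1"), "c1"),
          (2.8, _bind(0.0, 2.8, "c2"), "c2")]
forged = (9.9, "bogus-proof", "c3")
new_global = aggregate_verified(honest + [forged], 0.0)  # forged update dropped
```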

[Diagram: Updated global model distributed to devices]

Step 5: Global Model Update

After verifying all proofs, the central server combines the model updates to create a new global model. This updated model is then shared with all participants.
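Putting one round together: the sketch below (same toy linear model as above, with proof handling elided for brevity) trains locally on every shard, averages the updates, and returns the new global weight that every device adopts for the next round:

```python
def federated_round(global_w, shards, lr=0.1, epochs=5):
    """One federated round for the toy model y = w * x: each shard
    trains locally from the current global weight, the updates are
    averaged, and the result is what the server broadcasts back to
    all devices as the new global model."""
    updates = []
    for shard in shards:
        w = global_w
        for _ in range(epochs):
            for x, y in shard:
                w -= lr * 2 * (w * x - y) * x
        updates.append(w)
    return sum(updates) / len(updates)

# three devices, each with data consistent with w = 3
shards = [[(1.0, 3.0)], [(2.0, 6.0)], [(0.5, 1.5)]]
w1 = federated_round(0.0, shards)  # moves toward 3.0 each round
```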