• Federated Learning & Secure MPC

Train together without trusting each other

Your Legal team will block the central aggregator. Your Compliance team will flag the gradient exports. Stoffel removes the aggregator from the trust boundary entirely — no one sees anyone else's contribution, including us. No exposure window, nothing to audit, and nothing to subpoena.

Built for teams stuck between innovation and compliance

ML Teams at Multi-Party Organizations

Train on signal you can't centralize

Get model lift without exposing raw updates

Run consortium PoCs your Legal team can actually sign off on

Privacy-Focused Product Teams

Meet privacy requirements structurally, not procedurally

Build features that require distributed training

Run product analytics without a central data store that can be breached or subpoenaed

Infrastructure Teams Managing Consortium Data

Remove the central aggregator from the trust boundary

Run federated workflows without "trusted third party" assumptions

No new data-sharing agreements. No new vendor access to redline

Privacy guarantees you can actually explain in a review meeting

Stop arguing about who to trust with aggregation. The system architecture prevents anyone from seeing individual updates — including us.

No One Sees Individual Updates

Not even the aggregator can read another participant's gradients

Parties jointly compute the aggregate using MPC

Individual updates remain cryptographically hidden throughout

Only the combined model is revealed to participants
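As a toy illustration of the idea (a simplified sketch, not necessarily the exact protocol Stoffel uses): with additive secret sharing, each party splits its update into random shares that sum to the true value. Any single share looks random, but the shares of all parties combine to the exact aggregate.

```python
import random

MOD = 2**31 - 1  # toy modulus; real MPC protocols use much larger fields

def share(value, n_parties):
    """Split an integer into n additive shares that sum to value mod MOD."""
    shares = [random.randrange(MOD) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % MOD)
    return shares

# Three parties, each with a private (integer-encoded) update
updates = [5, 11, 7]
n = len(updates)

# Each party shares its update; party j receives one share from everyone
all_shares = [share(u, n) for u in updates]
received = [[all_shares[i][j] for i in range(n)] for j in range(n)]

# Each party sums the shares it holds; a partial sum reveals nothing on its own
partials = [sum(r) % MOD for r in received]

# Combining the partial sums yields exactly the aggregate
aggregate = sum(partials) % MOD
assert aggregate == sum(updates) % MOD  # 23
```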

Control What Leaves Each Round

Policy enforcement at the protocol level, not the promise level

Only global model and approved metrics are revealed

No raw updates, no plaintext gradients, no CSV exports

Define output policy once; the system enforces it
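Conceptually, an output policy is an allow-list applied to everything a round could emit. The sketch below is illustrative only (the names `ALLOWED_OUTPUTS` and `enforce_policy` are hypothetical, not Stoffel's configuration API):

```python
# Hypothetical policy: an allow-list of per-round outputs (names illustrative)
ALLOWED_OUTPUTS = {"global_model", "round_loss", "participation_count"}

def enforce_policy(round_outputs: dict) -> dict:
    """Return only the outputs the policy permits; everything else stays hidden."""
    return {k: v for k, v in round_outputs.items() if k in ALLOWED_OUTPUTS}

round_outputs = {
    "global_model": "<aggregated weights>",
    "round_loss": 0.42,
    "raw_client_updates": "<plaintext gradients>",  # filtered out by the policy
}
released = enforce_policy(round_outputs)
assert "raw_client_updates" not in released
```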

Keep Your Existing Pipeline

Drop-in integration with Flower, not a framework rewrite

Use your current Python and Flower code

Configure clipping and differential privacy as usual

No data lake merges, no new agents on analyst machines

The full stack. Private by architecture

FedAvg Module in Stoffel Lang

Drop-in replacement for your Flower aggregation strategy with MPC privacy guarantees built in.

Orchestration Layer

Handles round management, timeouts, and partial participation so your training runs don't break when someone drops.

Python/Flower Adapter SDK

Call the Stoffel aggregator exactly like you'd call a standard FedAvg server—same interface, private backend.

Local Development Tools

Test your aggregation logic locally before deploying to multi-party environments.

Output Policy Configuration

Specify which metrics and artifacts can leave each round — everything else stays encrypted. The output policy is defined once, auditable, and enforced by the system.

Documentation & Integration Support

Step-by-step migration guides for existing Flower deployments — and a compliance architecture overview for teams evaluating the security model.

For Engineering Teams

Ship like a normal developer
Privacy happens in the background

We built this for teams who want structural guarantees, not policy promises. You shouldn't have to become a cryptographer to stop accumulating liability.

Familiar Code Patterns

Same Flower interface you're already using—just point to our aggregator endpoint.

Local Simulation

Test your aggregation logic on your machine before running multi-party.

Clear Error Messages

No cryptic protocol failures—actual debugging information when something goes wrong.

Flexible Privacy Controls

Add differential privacy and gradient clipping where you need it; skip it where you don't.

Works With Your Stack

Python 3.8+, compatible with standard ML frameworks (PyTorch, TensorFlow, etc.).

Honest About Tradeoffs

MPC adds computation time. We're upfront about performance characteristics so you can decide if the tradeoff works.

Minimal changes

# server.py
import flwr as fl
# Drop-in FedAvg replacement
from stoffel_flower.strategy import StoffelFedAvg

def fit_config(server_round: int) -> dict:
    # Per-round config sent to clients, as in standard Flower
    return {"round": server_round}

# Instantiate Stoffel's FedAvg
strategy = StoffelFedAvg(
    # Standard FedAvg parameters (unchanged)
    fraction_fit=0.5,
    min_fit_clients=2,
    min_available_clients=2,
    on_fit_config_fn=fit_config,
)

fl.server.start_server(
    server_address="0.0.0.0:8080",
    config=fl.server.ServerConfig(num_rounds=5),
    strategy=strategy,
)

Three steps. No data movement

  1. Local Training

Each party trains on their local data as usual. Nothing leaves their environment at this stage.

  2. Private Aggregation

The computation happens jointly. No party can read another's contribution, during or after aggregation. This is an architectural guarantee, not a policy promise.

  3. Distribution

Everyone receives the new global model. That's all they learn—the aggregate result, no individual contributions.
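One classic way to realize the private aggregation step (a simplified sketch of pairwise masking, not necessarily the protocol Stoffel uses): each pair of parties agrees on a random mask; one adds it, the other subtracts it. Each masked update looks random on its own, yet the masks cancel in the sum.

```python
import random

parties = [0, 1, 2]
updates = {0: 4.0, 1: -1.5, 2: 2.5}  # each party's private update

# Each unordered pair (i, j) with i < j shares one random mask
masks = {(i, j): random.uniform(-100, 100)
         for i in parties for j in parties if i < j}

def masked_update(p):
    """Party p's update, plus masks shared with higher-indexed peers,
    minus masks shared with lower-indexed peers."""
    m = updates[p]
    for (i, j), mask in masks.items():
        if i == p:
            m += mask
        elif j == p:
            m -= mask
    return m

# The aggregator only ever sees masked values; the masks cancel in the sum
total = sum(masked_update(p) for p in parties)
assert abs(total - sum(updates.values())) < 1e-9  # recovers the true aggregate
```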

FAQ

Have more questions? Contact our team.

Does any raw data move between parties?

No. Each party trains locally. Only the aggregate model is computed and shared. Individual updates never leave in plaintext.

How do you handle stragglers or parties dropping out?

Built-in support for timeouts and partial participation. If someone doesn't respond, the round continues with available parties.

Can I use differential privacy and gradient clipping?

Yes. Hook points exist in the protocol for both. Configure them like you would in standard Flower.
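A minimal sketch of what those hooks do conceptually (plain Python for illustration, not Flower's or Stoffel's API): clip each update to an L2-norm bound, then add calibrated Gaussian noise before aggregation.

```python
import math
import random

def clip(update, max_norm):
    """Scale the update down if its L2 norm exceeds max_norm."""
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, max_norm / norm) if norm > 0 else 1.0
    return [x * scale for x in update]

def add_gaussian_noise(update, sigma):
    """Add N(0, sigma^2) noise per coordinate (the core DP mechanism step)."""
    return [x + random.gauss(0.0, sigma) for x in update]

update = [3.0, 4.0]                   # L2 norm = 5.0
clipped = clip(update, max_norm=1.0)  # scaled down to norm 1.0 -> [0.6, 0.8]
noisy = add_gaussian_noise(clipped, sigma=0.1)
```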

What's the performance impact compared to plaintext aggregation?

MPC adds computation overhead. Exact impact depends on number of parties and network conditions. We recommend piloting with your actual setup to evaluate the tradeoff.

© 2025 Stoffel Labs Inc. All rights reserved.