The great data paradox: How to innovate with AI while keeping secrets safe

By Sofie Krabbe

MPC, AI, Federated Learning

In today's world, where almost everything is driven by data, Artificial Intelligence (AI) is the engine of progress. From revolutionizing healthcare to powering smart cities, AI demands vast amounts of data to learn and evolve. Yet, this hunger for data often clashes with an equally critical demand: privacy.

But how do we unlock AI's potential without compromising sensitive information? 

This is the great data paradox many organizations face.

Federated Learning's promise: training AI where data lives

Ever heard of the promising approach called Federated Learning? 

Federated Learning offers an elegant solution to this dilemma. Imagine wanting to train an AI model using data spread across many different locations, perhaps across various hospitals, or on millions of individual smartphones. The traditional approach would centralize all that data, but for highly sensitive information or when strict regulations apply, this just isn't an option.

Federated Learning flips this idea on its head. Instead of moving all the raw data, FL brings the learning to the data. Here's how it works: each data owner trains an AI model locally, using only their own information. Then, instead of sending the raw data, they send only the model obtained by training on that local data. If you're unfamiliar with the approach, these local models can be thought of as small "summaries" of the local data.

These "summaries"/models are then combined by a central coordinator to create a powerful global model. 

This approach aligns with privacy principles by minimizing data sharing, enhances accountability, and reduces the impact of large-scale data breaches since raw data never leaves its source.


The unspoken challenge and why Federated Learning alone isn't enough

While Federated Learning is a brilliant step forward, it's not a silver bullet for all privacy challenges. As the European Data Protection Supervisor (EDPS) recently highlighted, Federated Learning models and their updates aren't inherently anonymous. Even without direct access to raw data, an attacker could potentially infer sensitive information by carefully analyzing the model updates or the final AI model itself. 

This opens the door to tricky "membership inference attacks," where adversaries could determine if specific individuals' data was part of the training set.
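To make the threat concrete, here is a deliberately simplified loss-threshold version of a membership inference attack. The overfit "model" and the threshold value are invented for illustration; real attacks are statistical, but the core signal is the same: models often perform suspiciously well on the exact records they were trained on.

```python
# Toy membership inference: an attacker who can query a model's error
# guesses whether a record was in the training set. Illustrative only.

def model(x):
    # An overfit model that has effectively memorized its training points.
    memorized = {1.0: 2.0, 2.0: 4.1, 3.0: 5.9}
    return memorized.get(x, 2.0 * x + 0.8)  # larger error off the training set

def was_member(x, y, threshold=0.5):
    """Guess membership: unusually low prediction error on (x, y)
    suggests the record was part of the training data."""
    return abs(model(x) - y) < threshold

print(was_member(2.0, 4.1))   # True: near-zero error, likely a training record
print(was_member(5.0, 10.0))  # False: error of 0.8 exceeds the threshold
```

No raw data changes hands here; the attacker learns membership purely from the model's behavior, which is exactly why sharing model updates is not automatically anonymous.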

Furthermore, the strength of a Federated Learning system is only as strong as its weakest link. If security isn't watertight across every participating device or entity, attackers can find vulnerabilities that compromise the entire system. 

Simply put, Federated Learning by itself isn't enough to protect truly sensitive or confidential data.

Elevating Federated Learning with unbreakable guarantees: Partisia's breakthrough

This is precisely where Partisia's unique expertise comes into play.

As a cryptographer, I see the tension between AI's data hunger and the non-negotiable need for privacy as an opportunity. Our work is to bridge that gap securely. Federated Learning can't stand alone, as it may be possible to infer individual data owners' data from the local models. To truly protect sensitive data, it is necessary to utilize advanced cryptography such as MPC.
Søren Eller Thomsen, Cryptographic Engineer, Partisia

And the EDPS report points directly to solutions like Multi-Party Computation (MPC) as essential tools to mitigate these inherent threats in Federated Learning.

At Partisia, we supercharge Federated Learning by integrating it with Secure Multi-Party Computation (MPC).

Imagine that central coordinator whose job it is to "combine" all those local model updates.

With Partisia’s technology, this aggregation process happens within an unbreakable, decentralized, and encrypted environment powered by MPC.

This means:

  • Absolute confidentiality: There is no single trusted coordinator, so no participant ever learns anything about another participant's model, only the securely combined result.

  • Enhanced security: MPC eliminates the "weakest link" vulnerability during the critical aggregation phase, making data leakage from individual model updates virtually impossible.
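The core principle behind MPC-based aggregation can be shown with additive secret sharing. The sketch below is a toy honest-but-curious illustration of the idea, not Partisia's production protocol: each owner splits its (quantized) model update into random shares that individually reveal nothing, and only the sum of all shares is ever reconstructed.

```python
# Toy MPC-style secure aggregation via additive secret sharing.
# Illustrative only: honest-but-curious parties, integer updates mod P.
import random

P = 2**31 - 1  # modulus; all arithmetic is done mod P

def share(secret, n_parties):
    """Split a secret into n random shares that sum to it mod P.
    Any n-1 of the shares together reveal nothing about the secret."""
    shares = [random.randrange(P) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % P)
    return shares

def secure_sum(secrets):
    """Each owner shares its secret across the parties; each party adds
    up the shares it holds; recombining yields only the total."""
    n = len(secrets)
    all_shares = [share(s, n) for s in secrets]
    # Party j sums the j-th share from every owner; no party sees a secret.
    partial_sums = [sum(owner[j] for owner in all_shares) % P for j in range(n)]
    return sum(partial_sums) % P

updates = [12, 7, 30]  # e.g. quantized model updates from three data owners
print(secure_sum(updates))  # → 49: only the aggregate is ever revealed
```

Each individual share is a uniformly random number, so intercepting any subset of shares short of all of them tells an attacker nothing; only the combined result, the aggregated model, becomes visible.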

For truly sensitive data, simply distributing the learning isn't enough. We apply advanced cryptography, like MPC, to ensure that no individual ever learns anything about others' data - aside from the final result.
Søren Eller Thomsen, Cryptographic Engineer, Partisia

This powerful combination of Federated Learning with Partisia's MPC capabilities makes it possible to train large-scale robust AI models using highly sensitive, distributed data, with true, provable privacy guarantees.

Secure AI in action: creating real-world impact across industries

Partisia's Federated Learning has moved beyond theory, actively transforming industries today:

  • Fighting financial fraud smarter: Financial institutions can collaboratively detect complex fraud patterns by sharing insights from their data, without ever exposing customer account details or proprietary transaction histories.

  • Collaborative R&D among competitors: Companies in competitive sectors can now pool valuable, sensitive datasets to train advanced AI models for research and development - whether it's optimizing wind turbine performance or improving autonomous driving capabilities - all while ensuring each company's proprietary data remains private.

From a cryptographic perspective, innovation doesn't demand compromise on privacy. Both can thrive when rooted in provable security. That's the promise of combining Federated Learning with MPC.
Søren Eller Thomsen, Cryptographic Engineer, Partisia

At Partisia, we're making secure, privacy-preserving AI a tangible reality. We believe that innovation doesn't demand compromise; it thrives on trust. With our technology, you can unlock the full potential of your data and drive groundbreaking advancements, confident that your privacy is absolutely guaranteed.

Reach out to Søren to hear more

Søren Eller Thomsen

Cryptographic Engineer

soren.eller.thomsen@partisia.com

Who is Partisia?

We are an innovative software company and a trusted partner empowering companies to compute on encrypted data. We provide a platform where data from individuals, governments, and private companies stays encrypted and protected while remaining fully usable. Partisia was founded by pioneers within Multiparty Computation and advanced cryptography.