As the threats and risks associated with data re-identification intensify, traditional assurances like "we only store anonymous data" and "no data sharing" policies are no longer sufficient to guarantee true security.
With increasing awareness of these security challenges and the potential exposure of sensitive information, users now expect companies to uphold far higher standards of data protection than ever before.
To meet these expectations, companies must go beyond basic promises. Privacy needs to be embedded at the core of their operations, supported by proactive, robust systems that protect user data. But how can companies effectively achieve this?
Last week's news highlights that many period-tracking apps are pledging not to disclose sensitive data to U.S. authorities. While some companies have stated that they won’t comply with data requests or that their users’ data is anonymous, the reality is complex.
Truly anonymous data is rare, as technological advances in data analysis and de-anonymization increasingly make it possible to re-identify individuals, even from seemingly “anonymous” data sets. This raises critical questions about the security and actual privacy of users’ health data.
These questions are especially pressing given recent developments in data privacy, particularly in relation to period-tracking applications. With debates around reproductive rights and abortion laws intensifying in the U.S., concerns have risen over whether personal health data from these apps might one day be used against users, especially in states with restrictive abortion laws.
Given these risks, merely promising that data won’t be shared - or assuming it’s anonymous - is unlikely to be enough.
The rise of advanced cyberattacks has highlighted the inadequacy of relying solely on standard anonymity as a protective measure, even when adhering to contemporary best practices. Consequently, organizations handling sensitive data must prioritize the implementation of robust cryptographic tools as a critical component in their defense strategy.
These growing capabilities put data, even that which is anonymized, at potential risk of exposure.
This is where Privacy-Enhancing Technologies (PETs) stand out.
However, as dedicated as we are to data innovation, we are equally strong advocates for privacy. In today’s world, where data privacy and ethical data usage are urgent concerns, we believe that creating value from data should not come at the expense of individual confidentiality. The challenge lies in creating a system where data can serve the public good without becoming a liability for individuals.
We believe in a world where companies can confidently say, “We won't share your data because we don’t have your data.”
With Multi-Party Computation (MPC) technology, we can make it possible for period-tracking companies, healthcare platforms, and any app handling sensitive information to operate without storing user data in a way that could later be accessed by a single party or easily subpoenaed.
MPC allows multiple parties to jointly compute on data without any single party having access to the others' data. This approach ensures that data insights can be used, but no single entity ever holds the full dataset.
Here’s how it works: Instead of storing sensitive user data in a single, vulnerable location, MPC divides the data into “encrypted” segments (commonly referred to as secret-shares) distributed across a network run by different entities. This ensures that no central entity has access to the complete dataset, making it much harder for hackers or unauthorized parties to reassemble it. As a result, apps using MPC technology can assure users that there is no data readily available to hand over at a single location, even under subpoena.
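As a minimal sketch of the splitting step described above, here is additive secret sharing - one common way to produce such shares. The prime modulus, share count, and example value are illustrative choices, not details of any particular platform:

```python
import random

PRIME = 2**61 - 1  # shares live in a finite field; this modulus is an illustrative choice

def split(secret, n_nodes):
    """Split a value into additive secret shares, one per node.

    Each share is uniformly random on its own; only the full set
    of shares determines the secret.
    """
    shares = [random.randrange(PRIME) for _ in range(n_nodes - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Only someone holding ALL the shares can recover the value."""
    return sum(shares) % PRIME

shares = split(28, 3)       # e.g. a cycle length, split across 3 independent nodes
print(reconstruct(shares))  # 28
```

Any proper subset of the shares is statistically independent of the secret, which is why a single compromised or subpoenaed node yields nothing useful.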
This level of privacy protection, combined with the benefits of data analytics, represents a revolutionary advancement in data security.
By pioneering MPC technology, we empower companies to serve society with data-driven insights without compromising individual privacy. For period-tracking apps and similar tools, this means they can deliver valuable health information to users, contribute to scientific insights, and support public health efforts - all without putting personal privacy at risk.
Our goal is to turn data into a resource that serves people and improves society while ensuring that no one is ever exposed to harm or legal repercussions because of their own personal health data.
In a time of heightened privacy concerns, technological solutions like MPC provide a way forward. We believe in a future where data is a powerful resource that belongs to everyone and endangers no one, allowing society to benefit from innovation without sacrificing individual rights. With the right technology, we can make this vision a reality.
A simplified view of how apps work today:
Most applications that collect and analyze historical user data operate by having users input data into the company’s servers, which then process and return analysis based on an ever-growing dataset.
How it works using the Partisia Platform:
Using MPC, we can split the data entered by the user into multiple pieces that can be stored at individual data nodes. This ensures that the data remains fully encrypted and meaningless unless all parts are combined. However, if all the nodes collaborate, they can still compute on the data and deliver valuable insights to the user without having to share the data among them. As a result, even if a government in one country demands access to the servers, it will not be able to retrieve any information about the entire dataset.
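To illustrate how nodes can compute without pooling the raw data, here is a toy example using additive secret sharing, where each node locally adds the shares it holds for two users. The modulus, node count, and values are illustrative; real MPC protocols (including multiplication and more complex analytics) involve additional machinery:

```python
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo this prime

def split(secret, n_nodes):
    """Additively secret-share a value: shares are random and sum to the secret."""
    shares = [random.randrange(PRIME) for _ in range(n_nodes - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    return sum(shares) % PRIME

# Two users' values, each split across the same three nodes
alice_shares = split(28, 3)
bob_shares = split(30, 3)

# Each node adds the two shares it holds - purely local work;
# no node ever sees either user's actual value
sum_shares = [(a + b) % PRIME for a, b in zip(alice_shares, bob_shares)]

print(reconstruct(sum_shares))  # 58: the sum, computed without pooling the raw data
```

Because addition of shares commutes with reconstruction, aggregate statistics can be produced while every individual input stays split across the network.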
This approach guarantees that the data stays securely encrypted.
Alternatively, Fully Homomorphic Encryption (FHE) can be used instead of MPC to ensure that only encrypted data - accessible only to the user - ever leaves the user’s phone.
Here's how it works:
1. The user encrypts their plaintext data using a key known only to them.
2. This encrypted data is sent to a server.
3. The server processes the encrypted data using FHE, performing computations directly on it without ever decrypting it.
4. The server returns the encrypted result to the user.
5. The user decrypts the result using their private key, allowing them to read it.
This approach ensures that the server can neither access the user's raw data nor the computed results at any point.
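Real FHE schemes are lattice-based and too involved for a short sketch, but the encrypt → compute-on-ciphertext → decrypt flow in steps 1-5 can be illustrated with a toy Paillier cryptosystem, which is only *additively* homomorphic and uses deliberately tiny, insecure primes here. This is an assumption-laden teaching sketch, not any platform's actual scheme:

```python
import math
import random

def keygen(p, q):
    """Toy Paillier keypair from two small primes (demo only, not secure)."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1                          # standard simple choice of generator
    # mu = L(g^lam mod n^2)^-1 mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pub, m):
    """Step 1-2: the user encrypts a plaintext and sends the ciphertext."""
    n, g = pub
    n2 = n * n
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(pub, priv, c):
    """Step 5: only the key holder can read the result."""
    n, _ = pub
    lam, mu = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n

pub, priv = keygen(17, 19)             # tiny primes for illustration
c1 = encrypt(pub, 12)
c2 = encrypt(pub, 30)
# Steps 3-4: the server adds the plaintexts by multiplying ciphertexts,
# never decrypting anything
c_sum = (c1 * c2) % (pub[0] ** 2)
print(decrypt(pub, priv, c_sum))       # 42
```

The server only ever handles ciphertexts; full FHE extends this idea so that arbitrary computations, not just additions, can run on encrypted data.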