It is well known(1) that most of the public-key mechanisms currently protecting our digital lives will be broken once quantum computers become powerful enough.
The quantum computer threat is not just looming in the near(2) future: nothing stops a motivated attacker from storing sensitive encrypted data today and decrypting it once a sufficiently powerful quantum computer exists, the so-called “store now, decrypt later” attack. This means we have to think seriously about switching to quantum-proof technologies sooner rather than later. A similar issue exists for digital signatures with a long life span, for example attestations for long-lived hardware, or digitally signed documents that remain valid for decades, such as contracts or deeds.
Two quantum-proof techniques that are often mentioned are “post-quantum cryptography”, or PQC, and “quantum key distribution”, or QKD. We will try to untangle these two concepts here.
PQC is the broad term for cryptographic algorithms based on computational problems that are believed to be hard to solve even on a quantum computer. That is, just as the widely used RSA algorithm relies on the difficulty of factoring large integers, a PQC algorithm might rely on the difficulty of finding a “small” solution to a noisy system of linear equations.
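To make that concrete, here is a minimal sketch of the kind of noisy linear-algebra problem, known as Learning With Errors, that underlies several PQC schemes. The parameters are toy values chosen only for illustration and are far too small to be secure.

```python
import secrets

# Toy "Learning With Errors" instance. The public data is a random matrix A and
# b = A*s + e (mod q), where s is a secret vector and e is a vector of small errors.
# Recovering s from (A, b) is believed to be hard, even on a quantum computer.
q, n, m = 97, 4, 8

A = [[secrets.randbelow(q) for _ in range(n)] for _ in range(m)]  # public m x n matrix mod q
s = [secrets.randbelow(q) for _ in range(n)]                      # secret vector
e = [secrets.randbelow(5) - 2 for _ in range(m)]                  # small errors in {-2, ..., 2}

b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

# (A, b) can be published. Without the errors e, ordinary linear algebra would
# reveal s; the small errors are what makes the system noisy and hard to solve.
print("first equation:", A[0], "-> b =", b[0])
```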
QKD is the term given to a class of key agreement protocols that use quantum mechanics and specialized hardware to establish a shared secret key.
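As a rough illustration of what such a protocol does, the sketch below simulates, purely classically, the basis-sifting step of BB84, the best-known QKD protocol. It leaves out the actual quantum hardware, noise, and eavesdropper detection; it only shows how two parties end up holding the same random key.

```python
import secrets

def bb84_sift(n=32):
    """Toy classical simulation of BB84 basis sifting (no eavesdropper, no noise)."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n)]
    alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n)]

    # If Bob measures a photon in the basis Alice used, he recovers her bit;
    # in the wrong basis his outcome is uniformly random.
    bob_bits = [a if ab == bb else secrets.randbelow(2)
                for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # Over an authenticated classical channel the parties compare bases (not bits)
    # and keep only the positions where the bases matched ("sifting").
    keep = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
    alice_key = [alice_bits[i] for i in keep]
    bob_key = [bob_bits[i] for i in keep]
    assert alice_key == bob_key   # holds here because there is no noise or eavesdropping
    return alice_key

print(bb84_sift())
```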
There are big differences between PQC and QKD which need to be taken into account when choosing a quantum-proof technology. How easy is it to deploy? What kind of functionality does it provide? How mature is the technology? How secure is it? And so on.
Deployment
PQC is meant, at its core, as a drop-in replacement for current digital signature and public-key encryption algorithms, and although some technical difficulties exist, deployment is fairly straightforward. In fact, PQC has already been rolled out by many major software companies(3), and there is a significant, structured effort by the US National Institute of Standards and Technology (NIST) to standardize a suite of PQC algorithms(4).
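As a sketch of what “drop-in replacement” means in practice: application code written against a generic key-encapsulation interface does not need to change when the scheme behind it is swapped for a post-quantum one. The interface and the ToyKEM placeholder below are hypothetical names chosen for illustration, not a real library API.

```python
import secrets
from typing import Protocol, Tuple

class KEM(Protocol):
    """Minimal key-encapsulation interface; classical and post-quantum schemes both fit it."""
    def keygen(self) -> Tuple[bytes, bytes]: ...                          # (public_key, secret_key)
    def encapsulate(self, public_key: bytes) -> Tuple[bytes, bytes]: ...  # (ciphertext, shared_secret)
    def decapsulate(self, secret_key: bytes, ciphertext: bytes) -> bytes: ...

class ToyKEM:
    """Placeholder with no security whatsoever; it only exercises the interface."""
    def keygen(self):
        sk = secrets.token_bytes(32)
        return sk, sk                              # public key == secret key, purely illustrative
    def encapsulate(self, public_key):
        secret = secrets.token_bytes(32)
        return bytes(a ^ b for a, b in zip(secret, public_key)), secret
    def decapsulate(self, secret_key, ciphertext):
        return bytes(a ^ b for a, b in zip(ciphertext, secret_key))

def establish_session_key(kem: KEM) -> bytes:
    """Application code is written once against the interface."""
    pk, sk = kem.keygen()                      # the recipient publishes pk
    ct, sender_secret = kem.encapsulate(pk)    # the sender derives a shared secret and a ciphertext
    recipient_secret = kem.decapsulate(sk, ct)
    assert sender_secret == recipient_secret
    return recipient_secret

# Swapping ToyKEM for a classical ECDH-based KEM, a lattice-based one such as
# ML-KEM, or a hybrid of both is then a configuration change, not a redesign.
print(establish_session_key(ToyKEM()).hex())
```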
QKD has novel hardware requirements that significantly hinder deployment, as the existing infrastructure can to a large extent not be reused. Even more problematic is that QKD has a fairly limited range, and the technology required to amplify quantum signals does not yet exist. Without these amplifiers, or “quantum repeaters”, any long-distance QKD deployment must use trusted relays every 100 km or so. The relays must be trusted, as the transported data appears in the clear inside every relay, which severely complicates the security of the system as a whole. For instance, virtual private networks (VPNs) are widely deployed because they allow a distributed organization to maintain full control over its data; a secure VPN would be impossible to implement in a setting with multiple relays owned by third parties. Finally, QKD cannot function securely without an authentication mechanism, and since digital signatures cannot(5) be realized using quantum technology, a crucial component of any QKD deployment is a classical communication infrastructure that provides authentication of the participants.
Functionality
PQC, being a design philosophy rather than a particular type of algorithm, covers a wide range of functionality. However, most research so far has focused on developing PQC algorithms for digital signatures and public-key encryption.
QKD deals strictly with key agreement between two parties, and in fact requires that the parties already have an authentication infrastructure in place. QKD cannot provide authentication of data; in particular, digital signatures cannot be built from QKD or any other quantum technology.
Maturity
PQC and QKD are both fairly mature technologies: the first QKD protocol, BB84, was published in 1984, and one of the current NIST PQC candidates is based on a public-key encryption algorithm from the late 70s.
However, PQC has received much more scrutiny, both theoretically and in practice. Because PQC runs on current computers and within the current digital infrastructure, we have a good understanding of the threats it faces. This is not yet the case for QKD, because of its novel hardware requirements.
Security
PQC and QKD differ significantly in how they attain their security. QKD enjoys information-theoretic security, which essentially means that an encryption key created by a QKD protocol will remain secure no matter how computing, quantum or otherwise, develops. PQC, on the other hand, is computationally secure: the security of a particular PQC algorithm rests on the assumption that a particular mathematical problem is hard, for classical and quantum computers alike.
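To illustrate the information-theoretic side, here is a minimal sketch, not a description of any particular QKD deployment: a key agreed via QKD can be used with a one-time pad, and a one-time pad ciphertext reveals nothing about the message regardless of the attacker's computing power, provided the key is truly random, as long as the message, and never reused.

```python
import secrets

def one_time_pad(key: bytes, data: bytes) -> bytes:
    """XOR with the key; encryption and decryption are the same operation."""
    assert len(key) >= len(data)
    return bytes(k ^ d for k, d in zip(key, data))

message = b"transfer approved"
key = secrets.token_bytes(len(message))      # stand-in for a key agreed via QKD
ciphertext = one_time_pad(key, message)

# Without the key, every plaintext of this length explains the ciphertext equally
# well, which is what "information theoretic" security means here.
assert one_time_pad(key, ciphertext) == message
```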
On the surface, this might make PQC seem less secure than QKD. However, it is important to remember that security is a complicated and multifaceted concept. For example, while QKD enjoys good theoretical security, multiple attacks have already been demonstrated against real deployments, such as photon-number-splitting and Trojan-horse attacks. When judging security, it is therefore important to judge the system as a whole, and here PQC has a clear advantage over QKD.
Quantum-proofing our data
It is clear from the above that QKD, at present, is not mature enough to quantum-proof our digital infrastructure. “Store now, decrypt later” attacks are an issue today, so we cannot rely on a technology that still requires significant development before it can be deployed widely. Furthermore, QKD cannot provide digital signatures, which are just as important for a secure infrastructure as encryption.
PQC, on the other hand, is based on well-studied design philosophies and technologies that are, in some cases, as old as those they seek to replace(6), and it is ready to be deployed today. This conclusion is shared by multiple authorities in the cyber-security space(7).
What about QKD then?
We shouldn’t conclude from the above that QKD is simply not worth it. Far from it. QKD provides an excellent method for key agreement in scenarios where a high degree of control over the deployment is possible and a high degree of security is required. Examples include internal networks within a military or government, or an inter-bank network. Common to these scenarios is that they operate in a fairly homogeneous setting: all participants are known (making authentication simpler), and the infrastructure is simpler and under the control of the organization (making the need for trusted relays easier to deal with).
_____________________________________
References:
1. https://en.wikipedia.org/wiki/Shor's_algorithm
2. 10-20 years, according to the Global Risk Institute.
3. Google, Amazon, Apple iMessage, Cloudflare.
4. https://csrc.nist.gov/Projects/post-quantum-cryptography/selected-algorithms-2022
5. This is not a matter of a lack of development: it has been shown that an algorithm with the properties required by a digital signature scheme cannot be realized using technology based on quantum information.
6. The Classic McEliece cryptosystem, for example, was described in 1978, just one year after the widely used RSA cryptosystem was published.