
Post-quantum cryptography

How to protect sensitive information in the age of quantum computers!

~20mins estimated


Post-quantum cryptography: the basics

What is post-quantum cryptography?

Post-quantum cryptography (PQC) refers to a new generation of cryptographic algorithms that are designed to be secure against attacks from both today's classical computers and the powerful quantum computers of the future. Our modern digital security is built on encryption standards like RSA and ECC. These standards are secure today because they rely on mathematical problems (such as factoring enormous numbers) that are practically impossible for classical computers to solve within a reasonable amount of time.

However, a sufficiently powerful quantum computer, once one is built, will be able to solve these specific mathematical problems with alarming speed using algorithms like Shor's algorithm. This would instantly break most of the encryption we rely on, rendering our secure communications, digital signatures, and protected data completely vulnerable. PQC algorithms are being developed to counter this threat. They are based on different mathematical problems that are believed to be difficult for both classical and quantum computers to solve, ensuring our data remains safe in the "quantum era."

About this lesson

In this lesson, we'll explore the emerging threat that quantum computing poses to our current cryptographic standards. We will cover why this is such a significant problem, see a conceptual example of how a quantum attack could unfold, and learn about the new families of post-quantum algorithms being standardized to build a more secure, future-proof digital world.

FUN FACT

Quantum origins...

The "quantum" in quantum computing comes from quantum mechanics, the branch of physics that describes the strange behavior of subatomic particles. Quantum computers harness these properties, like superposition (where a quantum bit, or "qubit," can be a 0 and 1 at the same time) and entanglement (where qubits are mysteriously linked), to perform their powerful calculations.

Importance and background of PQC

Symmetric vs. asymmetric!

First, it’s important to cover some background before we tell you about the different post-quantum cryptographic algorithms that are now approved by NIST, and in which cases to use them.

Symmetric cryptography, which uses a single shared secret key for both encryption and decryption, tends to be more robust and therefore sees less impact from the abilities of quantum computers. The basis of asymmetric encryption is a one-way problem: one direction (such as multiplying two large primes) is simple, but the reverse (recovering those primes from the product) is wildly complicated. Symmetric encryption does not rely on that kind of trapdoor; it uses essentially the same process to both encrypt and decrypt the data. Therefore, the asymmetric algorithms are in more trouble, largely because of Shor’s algorithm.

Shor’s algorithm (published in 1994) can break traditional asymmetric encryption: it finds the prime factors of a large integer, and solves the discrete logarithm problem, far more efficiently than the best known classical methods. This puts encryption methods like RSA, DSA, ECC, Diffie-Hellman, ElGamal, and similar schemes at risk. A classical computer would still need possibly thousands of years to crack such encryption, but a sufficiently large quantum computer running Shor’s algorithm could potentially crack it in minutes or hours.
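
To make the "easy one way, hard the other" idea concrete, here is a tiny illustrative sketch. The primes below are toy-sized examples chosen for readability, not real key material:

```python
# Multiplying two primes is trivial; recovering them from the product is not.
p, q = 104729, 1299709          # small example primes, far too small for real use
n = p * q                       # the "easy" direction: instant, even for huge primes
print(n)

# The "hard" direction is recovering p and q from n alone. For the 2048-bit
# moduli used by RSA this is infeasible on classical hardware, but Shor's
# algorithm on a large quantum computer would solve it in polynomial time.
```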

Still, symmetric encryption is not completely safe. Grover’s algorithm (published in 1996) gives quantum computers a quadratic speedup on unstructured search problems. Applied to cryptography, it speeds up brute-force key searches against symmetric ciphers (AES, ARIA, 3DES) and preimage searches against hash functions such as those in the Secure Hash Standard (SHS), effectively halving their security level rather than breaking them outright.
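
A rough way to picture that halving, as a quick sketch rather than a formal security estimate:

```python
# Grover's algorithm needs only about sqrt(N) steps to search N possibilities,
# so an n-bit key offers roughly n/2 bits of security against a quantum attacker.
for name, key_bits in [("AES-128", 128), ("AES-192", 192), ("AES-256", 256)]:
    print(f"{name}: ~2^{key_bits} classical guesses, ~2^{key_bits // 2} with Grover")
```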

Which current algorithms are most vulnerable?

The diagram below shows which traditional algorithms are most severely compromised (circled) and what they are used for currently.

PQC diagram

Which algorithms should you be using and when?

There are now three main algorithms approved by NIST, one for key exchange and two for certificates and digital signatures. These methods are designed to be resistant to quantum-based attacks, and can be used in conjunction with the existing, less vulnerable algorithms (the ones not circled in the diagram above).

For post-quantum key exchange (confidentiality), use Module-Lattice-Based Key-Encapsulation Mechanism (ML-KEM).

For post-quantum certificates/digital signatures (authentication & integrity), use the Module-Lattice-Based Digital Signature Algorithm (ML-DSA) or the Stateless Hash-Based Digital Signature Algorithm (SLH-DSA). These are digital signature algorithms to replace RSA and elliptic curve technology.

We will go into more detail about the specifics of these algorithms in the ‘Mitigation’ section of this lesson, but for now, check out an example of the first of these algorithms in action!

PQC in action

Suppose that Whitfield Diffie and Martin Hellman are trying to send a secure message to each other across a public channel. Without proper encryption, it might go something like this:

A message without quantum-resistant encryption


Sending the message!

First, Martin types out his message to Whitfield and encrypts it using a key agreed with the classic Diffie-Hellman exchange. Then he sends it over a public channel for Whitfield to access.


As you can see, without using the new encryption standards, a bad actor with a quantum computer can make quick work of asymmetric encryption methods typically used to protect messages in transit. With the message decoded, this intruder has Martin and Whitfield's password and can access all of their sensitive information!

Now, instead, we will implement ML-KEM to prevent an intruder from being able to use the message like they did before!

Martin (the encapsulator/sender) is sending a secret message to Whitfield (the key generator and receiver). Establishing the shared secret key they will need starts with Whitfield generating a decapsulation key and an encapsulation key. He keeps the decapsulation key private and makes the encapsulation key available to Martin. Martin then uses Whitfield’s encapsulation key to generate one copy of a shared secret key along with an associated ciphertext. Martin then sends the ciphertext to Whitfield. Finally, Whitfield uses the ciphertext from Martin along with his private decapsulation key to compute another copy of the shared secret key.

Here's a diagram of how it works: ML-KEM diagram

And here is what the backend code might look like:
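
This is a minimal sketch assuming the open-source liboqs-python bindings (the `oqs` package); the mechanism name "ML-KEM-768" is an assumption that depends on your installed liboqs version (older builds expose it as "Kyber768"):

```python
# A sketch of the ML-KEM exchange using the liboqs-python bindings (`oqs`).
# Assumption: the mechanism name depends on the installed liboqs version.
import oqs

ALG = "ML-KEM-768"

# Whitfield (receiver) generates the key pair. The encapsulation (public) key
# is published for Martin; the decapsulation (private) key never leaves Whitfield.
with oqs.KeyEncapsulation(ALG) as whitfield:
    encapsulation_key = whitfield.generate_keypair()

    # Martin (sender) uses the encapsulation key to derive his copy of the
    # shared secret plus a ciphertext, which he sends over the public channel.
    with oqs.KeyEncapsulation(ALG) as martin:
        ciphertext, martin_secret = martin.encap_secret(encapsulation_key)

    # Whitfield decapsulates the ciphertext with his private key and recovers
    # the same shared secret, ready to key a symmetric cipher such as AES-256.
    whitfield_secret = whitfield.decap_secret(ciphertext)

    assert martin_secret == whitfield_secret
```

The shared secret both parties end up with is what actually protects the message: it keys a symmetric cipher, while ML-KEM only handles getting that key across the public channel safely.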


Post-quantum mitigation tactics

The NIST algorithms

The first algorithm, ML-KEM, is used when sending encrypted messages over insecure channels. KEM stands for key-encapsulation mechanism: a set of procedures for establishing a secret key that can later be used with symmetric-key cryptographic algorithms for secure communication (such as encryption and authentication). The reason this algorithm is so much more robust than others, and therefore more quantum-resistant, is the difficulty of the Module Learning with Errors problem, which underpins the mathematics of the algorithm.
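
To give a feel for that problem, here is a toy, deliberately insecure sketch of a plain Learning-with-Errors instance. ML-KEM actually uses a structured "module" variant over polynomial rings, and real parameters are vastly larger; this is only meant to show the shape of the problem:

```python
import numpy as np

# Toy Learning-with-Errors instance: b = A*s + e (mod q).
# Recovering the secret s from the public pair (A, b) is believed hard for
# classical and quantum computers alike once the parameters are large enough.
q, n = 97, 8                                 # tiny parameters, illustration only
rng = np.random.default_rng(0)
A = rng.integers(0, q, size=(n, n))          # public random matrix
s = rng.integers(0, q, size=n)               # secret vector
e = rng.integers(-2, 3, size=n)              # small "error" noise
b = (A @ s + e) % q                          # public value

print("published:", A.shape, b)              # only (A, b) ever leave the machine
```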

A cryptographic key itself is just a string of bits within a computer. A shared key is computed jointly by separate parties using a set of algorithms, and under the correct conditions both parties end up with the same key without exposing it. According to NIST, a key-encapsulation mechanism can be implemented in software, firmware, hardware, or any combination of the three.

Both of the other algorithms, ML-DSA and SLH-DSA, involve digital signatures and authentication of data in transit or at rest. Digital signatures function as a stamp that identifies who is sending and receiving information, reveals whether the information has been tampered with in any way, and authenticates the parties on each end. NIST calls this non-repudiation, since the signatory cannot deny the validity of their signature at a later time.

ML-DSA is a set of lattice-based algorithms for applications that require a digital signature, and is meant to generate and verify the signatures just described. SLH-DSA functions similarly but uses a stateless hash-based construction derived from SPHINCS+, the original submission to NIST that was standardized as SLH-DSA.
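
For a sense of how these are used in code, here is a hedged sketch of signing and verifying with ML-DSA, again assuming the liboqs-python bindings (`oqs`); the mechanism name "ML-DSA-65" depends on your liboqs version (older builds call it "Dilithium3"):

```python
import oqs

message = b"Sign me, quantum-safely"

# Signer generates a key pair; the private key stays inside the `signer` object.
with oqs.Signature("ML-DSA-65") as signer:
    public_key = signer.generate_keypair()
    signature = signer.sign(message)

# Anyone holding the public key can verify the signature and detect tampering.
with oqs.Signature("ML-DSA-65") as verifier:
    assert verifier.verify(message, signature, public_key)
```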

Now that you know more about what these algorithms do and how they work, let’s see what an organization can do to prepare for this cryptographic revolution.

Tips for organizations as you transition to PQC

  1. Organize and understand what parts of your network require and are using cryptography. The first step to ensuring that your information is secure is recognizing where it needs to be protected. For example, source code, software libraries, packages, and service endpoints.
  2. This is going to be a multi-year migration process, so plan ahead!
  3. Prioritize the changes based on internal risk: start with services that handle the most sensitive data, to prevent ‘harvest now, decrypt later’ threats. You can focus on things like authorization later on, since they are not as vulnerable until large-scale quantum computers become widely available.
  4. Upgrade to TLS 1.3 (Transport Layer Security protocol 1.3), since it is a prerequisite for post-quantum cryptographic methods. These are standards for protecting data in transit, such as internet communications (see the sketch after this list).
  5. Create reusable processes that will stand the test of time. Do not focus on patching up problems with a short-term mindset. Instead, you should be aiming to resolve problems with future complications in mind, and trying to create systems that will be robust against technological advancements and require as little maintenance as possible.
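
As a small illustration of point 4, here is how a client might insist on TLS 1.3 using only Python’s standard library; the host name is just a placeholder:

```python
import socket
import ssl

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_3   # refuse anything older than TLS 1.3

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Negotiated protocol:", tls.version())   # expect "TLSv1.3"
```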

Final thoughts

The transition to post-quantum cryptography is likely to be a long and arduous journey, which makes it all the more important to start as early as possible. Fortunately, there are now bona fide post-quantum algorithms, approved and well researched, that can be implemented in practice. See these links – 203, 204, and 205 – for more in-depth methodological analysis of FIPS 203-205 (the NIST algorithms discussed above).

There are also responsibilities that, as a user, the companies you rely on cannot handle for you. So stay vigilant! For example, keep your accounts up to date so they remain compatible with post-quantum cryptographic methods as they roll out. Make sure you follow the most recent guidance and continue to learn about quantum computing as it moves into the mainstream!

Quiz

Test your knowledge!


Which of the following correctly describes the decapsulation key from the diagram/example?

Keep learning

Continue your cybersecurity expertise by checking out some of our other lessons:

  • Here is a lesson about hardcoded secrets if you want to continue the secrets journey!
  • And this lesson discusses authentication and authorization, which are also important to consider with sensitive information being shared.

Congratulations

Thank you for completing this post-quantum cryptography lesson! Now you are prepared to protect our data even in the age of quantum computers.