Where Is the Research on Cryptographic Transition and Agility?
Mirror: this blog post is also available at COSIC.
Cryptographic agility has been an underrated subject in academia. Cryptanalysis is getting more sophisticated every day. Whenever a flaw is discovered, one might say we just need to replace the cipher, use a larger key, and so forth. In real-world settings, however, it is not that simple. In particular, if we someday build large-enough quantum computers (QC), much of today's Internet security will be thrown out the window.
- What is the plan for all the schemes securing our assets that rely on the integer factorization or discrete logarithm problems, such as RSA, ECDSA, and ECDH?
- The real question is: how do we adapt our software systems to cope with this cryptographic evolution (and future evolutions)?
Today at RWC’22, David Ott gave a talk with the title:
“Where Is the Research on Cryptographic Transition and Agility?”
I could not agree more with David in that regard; there is even a dedicated subsection in my thesis about cryptographic agility (see Section 5.2.2).
This is not an easy problem. For example, at RWC'20, ABN AMRO Bank reported that it took the organization more than a year to discover and replace their obsolete certificates based on SHA-1 (which is of course a no-go today). The process was painful, and that was just SHA-1. Quantum computers can potentially break almost all of today's public-key cryptography deployments. How will we get there?
Quantum computers break today's public-key crypto. Will we ever have sufficiently powerful quantum computers? That is a different story. What is important to know is that many of today's cipher suites would then be broken, thanks to Peter Shor and many other people after him. We also know that many scientists are working day and night on quantum physics and engineering to make quantum computers a reality. Whether quantum computers turn out to be a fantasy or not, if we can have new ciphers capable of withstanding post-quantum threats, then why not deploy them?
While looking into one of David Ott’s papers after his talk, I stumbled upon a table published by NIST. In summary:
- symmetric algorithms like AES and SHA-{2,3} will require larger keys and outputs, respectively, and
- RSA, elliptic-curve cryptography (ECDSA, ECDH, etc.), DSA, and others will no longer be secure.
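The split in NIST's table can be sketched with a back-of-the-envelope calculation: Grover's algorithm searches a k-bit keyspace in roughly 2^(k/2) steps, so symmetric primitives merely lose half of their security bits, whereas Shor's algorithm breaks RSA/ECC/DSA in polynomial time at any practical key size. A minimal illustration (the function name is mine, for illustration only):

```python
# Back-of-the-envelope post-quantum security estimate for symmetric keys.
# Grover's algorithm searches a k-bit keyspace in ~2^(k/2) steps, so a
# symmetric cipher retains roughly half of its classical security bits.
def grover_effective_bits(key_bits: int) -> int:
    return key_bits // 2

for key_bits in (128, 192, 256):
    print(f"AES-{key_bits}: ~{grover_effective_bits(key_bits)} post-quantum security bits")
# AES-128 drops to ~64 effective bits, hence the advice to prefer larger
# keys; RSA/ECDSA/ECDH, by contrast, fall to Shor's algorithm outright.
```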
NIST competition. NIST has taken these threats seriously and has announced a call for post-quantum cryptography (PQC) algorithm proposals for standardization in a 6-year-long selection process.
Urgency to many organizations. Companies and organizations also consider these threats significant. As David explained, the timeline for having quantum computers is unclear; we do not have a certain date. History has shown, however, that full cryptographic migrations, such as the 3DES-to-AES upgrade, take a decade. Therefore, the PQC migration is a complex process. Lastly, there is the possibility that a malicious party performs record-now-exploit-later attacks: capturing encrypted traffic today and decrypting it once quantum computers arrive.
The PQC algorithms come with various properties:
- key sizes,
- ciphertext sizes,
- signature sizes,
- communication requirements, and
- computational requirements.
This might initially seem unimportant; however, these schemes are used in settings ranging from high-end systems to embedded IoT devices. Therefore, a change in any of these aspects can have a ripple effect throughout the entire system stack.
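To make the size gap concrete, here is a rough comparison between some classical primitives and two NIST round-3 PQC candidates; the byte counts are taken from the respective specifications but are meant only to show the order of magnitude, not to endorse particular parameter sets:

```python
# Rough size comparison (in bytes) between classical and round-3
# post-quantum schemes, illustrating the order-of-magnitude gap that
# ripples into packet sizes, handshake latency, and embedded budgets.
sizes = {
    "X25519 public key":       32,
    "Kyber-768 public key":  1184,
    "Kyber-768 ciphertext":  1088,
    "Ed25519 signature":       64,
    "RSA-2048 signature":     256,
    "Dilithium2 signature":  2420,
}
for name, nbytes in sizes.items():
    print(f"{name:>22}: {nbytes:5d} bytes")
# A Dilithium2 signature is ~38x larger than an Ed25519 signature --
# a change each layer of the stack has to absorb.
```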
(My) concluding remarks. This is an extremely complicated challenge for corporations with complex, distributed software and networking stacks, and in general for the entire Internet. Security architectures should be as modular as possible, and there should be guidelines, policies, and the right tooling for graceful upgrades. It is easy to outline such remarks; however, executing and realizing these requirements requires a ton of research by practitioners and academics from various communities (software engineering, systems, security, cryptography, etc.). Otherwise, we will end up using completely broken/outdated primitives forever, such as 3DES as a cipher or COBOL as a programming language.
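To make the "modular architecture" point concrete, here is a minimal sketch of one common agility pattern (all names are mine, not from the talk): code refers to an algorithm through a configurable policy rather than hard-coding it, so deprecating a primitive becomes a registry/policy change instead of a codebase-wide audit.

```python
import hashlib

# A minimal cryptographic-agility sketch: callers name algorithms
# indirectly via a policy, so retiring one (e.g., SHA-1) means updating
# the registry/policy -- not hunting down every call site.
HASH_REGISTRY = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,
}
POLICY = {"default_hash": "sha256"}  # could be loaded from a config file

def digest(data: bytes, alg: str = "") -> bytes:
    name = alg or POLICY["default_hash"]
    if name not in HASH_REGISTRY:
        raise ValueError(f"algorithm {name!r} is not approved")
    return HASH_REGISTRY[name](data).digest()

# Upgrading the whole codebase to SHA-3 is now a one-line policy change:
POLICY["default_hash"] = "sha3_256"
```

The same indirection applies to certificates, key-exchange groups, and signature suites; the hard part in practice is discovering all the places where an algorithm was hard-coded in the first place.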
I recommend reading this paper by David Ott and his co-authors.