
Every year computers become more powerful, with ever greater computing capacity. Is that good or bad? Think twice before answering.
It depends. If global information security rests on the computational complexity of certain functions, then the fact that computers keep getting faster is very bad news.
In fact, a sword of Damocles hangs over our public-key encryption systems: RSA, DSA, ECDSA. Their security relies on the difficulty of certain mathematical problems currently considered intractable, such as factoring large integers or solving discrete logarithms. The big promise of quantum computing is that these problems will be solved rapidly. Is cryptography mortally wounded? Is there hope for the world? Will we still be able to communicate securely once the first quantum computers arrive? Let's take it step by step.
In computer science, the objective is to find the most efficient (that is, the fastest) algorithm to solve a given computational problem. To compare the efficiency of different algorithms, we need mechanisms for classifying computational problems according to the resources required to solve them. Such a classification must capture the inherent difficulty of the problem, regardless of any particular computing model. The resources measured may include time, storage space, number of processors, and so on. Time is usually the main concern and, sometimes, space as well.
Unfortunately, predicting an algorithm's exact execution time on every possible input is often difficult. In such situations, the time is approximated: we examine how the algorithm's execution time grows as the input size scales without bound.
To represent these execution times and make them easy to compare, big O notation is usually used: O(f(n)). According to this asymptotic notation, if a running time is O(f(n)), then for large enough n the running time is at most k·f(n) for some constant k, although it may, of course, grow more slowly. The function f(n) represents the worst-case running time. To get a clearer idea of efficiency, here are some common functions in asymptotic notation, ordered from slowest to fastest growing: O(1), O(log n), O(n), O(n log n), O(n^2), O(2^n), O(n!).
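To make this ordering tangible, here is a small, purely illustrative Python snippet that evaluates some of these functions for growing n; note how quickly the exponential term dwarfs the rest:

```python
import math

# Evaluate some common complexity functions to compare their growth.
functions = [
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
    ("O(2^n)",     lambda n: 2 ** n),
]

for n in (10, 20, 40):
    print(f"n = {n}")
    for name, f in functions:
        print(f"  {name:>10}: {f(n):,.0f}")
```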
The secret of future quantum computers' speed will lie in quantum algorithms, which are capable of tapping into qubit superposition. As noted, execution time, for both classical and quantum algorithms, is measured by the number of basic operations the algorithm uses. In the case of quantum computing, this efficiency can be measured using the quantum circuit model: a quantum circuit is a sequence of basic quantum operations called quantum gates, each applied to a small number of qubits.
The nemesis of cryptography is Shor's algorithm: a quantum algorithm exponentially faster at computing discrete logarithms and factoring large integers, and thereby capable of breaking RSA, DSA and ECDSA. Regarding the factoring challenge: given an integer n = p × q for some prime numbers p and q, our task is to determine p and q. The best classical algorithm known, the general number field sieve, runs in time exp(O((ln n)^(1/3) (ln ln n)^(2/3))), while Shor's quantum algorithm solves this problem substantially faster, in time O((log n)^3).
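To see where the quantum magic actually sits, here is a minimal Python sketch of the classical reduction Shor's algorithm performs: factoring n is reduced to finding the multiplicative order of a random a modulo n. In this toy version the order-finding step, the only part a quantum computer accelerates, is brute-forced classically, so it still takes exponential time:

```python
import math
import random

def order_of(a: int, n: int) -> int:
    """Order-finding: the subroutine Shor's algorithm runs on a quantum
    computer in polynomial time. Brute-forced here (exponential time)
    purely to illustrate the reduction."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n: int) -> int:
    """Classical part of Shor: reduce factoring n = p * q to order-finding."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                  # lucky guess: a already shares a factor
        r = order_of(a, n)            # <-- the quantum subroutine
        if r % 2 == 1:
            continue                  # odd order: try another a
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                  # trivial square root: try another a
        return math.gcd(y - 1, n)     # nontrivial factor of n

print(shor_reduction(15))             # prints 3 or 5
print(shor_reduction(3 * 11))         # prints 3 or 11
```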
Grover's algorithm is also well known. This algorithm can speed up searches over unordered data sets of n elements. Indeed, one of the most basic problems in computer science is unstructured search, which can be formalized as follows: given the ability to evaluate a function f: {0, 1}^n → {0, 1}, find x such that f(x) = 1, if such an x exists; otherwise, output "not found".
It is easy to see that, with no prior information about f, any classical algorithm that solves the unstructured search problem with certainty must evaluate f a total of N = 2^n times in the worst case. Even a randomized algorithm that succeeds, say, with probability 1/2 in the worst case still needs on the order of N evaluations. However, Grover's quantum algorithm solves this problem using O(N^(1/2)) evaluations of f in the worst case. As you can see, it is not as dramatically fast as Shor's algorithm, nor does it constitute such a serious threat, since it can be counteracted simply by doubling the key size.
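For intuition, the following sketch simulates Grover's iterations on a classical state vector. The simulation itself costs O(N) work per step; only real quantum hardware gets the O(√N) count of f-evaluations. About (π/4)·√N iterations concentrate nearly all the probability on the marked element:

```python
import math

def grover_search(n_qubits: int, marked: int) -> int:
    """Classically simulate Grover's algorithm on 2**n_qubits amplitudes."""
    N = 2 ** n_qubits
    amp = [1 / math.sqrt(N)] * N              # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(N))
    for _ in range(iterations):
        amp[marked] = -amp[marked]            # oracle: flip sign of the target
        mean = sum(amp) / N                   # diffusion: reflect about the mean
        amp = [2 * mean - a for a in amp]
    # "Measure": return the most probable outcome
    return max(range(N), key=lambda i: amp[i] ** 2)

print(grover_search(8, marked=42))            # ~13 iterations instead of ~256 tries
```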
For now, there seem to be no further algorithms in the quantum cryptanalyst's repertoire. So let's see how Shor and Grover would affect current cryptography if they could be executed today.
In the previous entry we saw that the first error-corrected quantum computers with several thousand logical qubits are not expected to be built for at least 10 years. What would be the state of classical cryptography in the face of quantum attacks?
- Asymmetric-key cryptography
Let's put these figures in context: breaking a 1024-bit RSA key would require a quantum computer with around 2,300 logical qubits and less than 4 hours of computing time. The matter is serious enough that the U.S. National Institute of Standards and Technology (NIST) has launched a project called Post-Quantum Cryptography to look for replacements for the current algorithms. The candidate selection process is expected to conclude within 5 or 6 years.
- Symmetric-key cryptography
The current standard for symmetric encryption is AES-GCM (Advanced Encryption Standard-Galois/Counter Mode), which supports three key sizes: 128, 192 or 256 bits. For a 128-bit key, a brute-force attack requires trying all possible values of the key, in other words, 2^128 combinations. Grover's algorithm can speed this search up quadratically, meaning it would require 2^64 operations. Even so, executing it on a quantum computer would require around 3,000 qubits and more than 10^12 years, an extremely long time. And even if such a quantum computer already existed, the solution would be as simple as doubling the key size, something that could be done in practice at any time.
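As a practical illustration of how cheap that countermeasure is, here is a minimal sketch using the third-party Python `cryptography` package (an assumption of this example, not something mentioned above): moving from a 128-bit to a 256-bit AES-GCM key is a one-parameter change.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Doubling the key size from 128 to 256 bits restores the pre-Grover
# security margin, and is a single-parameter change.
key = AESGCM.generate_key(bit_length=256)   # 256 instead of 128
aesgcm = AESGCM(key)
nonce = os.urandom(12)                      # 96-bit nonce; never reuse per key

ciphertext = aesgcm.encrypt(nonce, b"secret message", b"associated data")
plaintext = aesgcm.decrypt(nonce, ciphertext, b"associated data")
assert plaintext == b"secret message"
```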
Of course, someone could discover a quantum algorithm far more efficient than Grover's. In that case, AES-GCM would have to be replaced.
- Hash
Hash functions are used in all kinds of applications: password hashing in databases, the mathematical puzzles behind Bitcoin mining, building data structures such as Merkle trees, message digests for digital signatures, password-based key derivation functions, and so on. SHA-256 remains the most widely used algorithm today. Current hashing algorithms are not expected to be badly impacted by quantum computing: Grover's algorithm would at best reduce a preimage search on SHA-256 from 2^256 to 2^128 evaluations, still far beyond reach.
However, password hashing is at higher risk, because the space of user passwords is not very large. For example, given a 100-symbol alphabet, the set of all 10-character passwords contains only about 100^10 ≈ 2^66 elements. Using Grover's algorithm, the search shrinks to around 2^33 evaluations, so a hypothetical quantum computer might take just a few seconds. The shadow of the quantum threat is one more reason to move towards alternatives to passwords, such as the projects our ElevenPaths Innovation and Labs team is currently working on: CapaciCard, SmartPattern, PulseID or Mobile Connect.
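The arithmetic behind those password-space figures, as a quick worked example:

```python
import math

alphabet_size = 100          # symbols available per character
password_length = 10

keyspace = alphabet_size ** password_length      # 100^10 = 10^20 passwords
classical_bits = math.log2(keyspace)             # ~66.4 bits of search space
grover_bits = classical_bits / 2                 # Grover halves the exponent

print(f"classical brute force:   2^{classical_bits:.1f} evaluations")
print(f"with Grover's algorithm: 2^{grover_bits:.1f} evaluations")
```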
Another widespread application of hashes is proof-of-work systems, used by cryptocurrencies such as Bitcoin or Ethereum. To validate a block in the chain, miners must solve a mathematical puzzle that involves computing millions of hashes. Fortunately for blockchain, a quantum computer would need more than 10 minutes to solve the current challenges, so cryptocurrencies will remain safe, at least on that front (though not on the elliptic-curve cryptography front).
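To picture the puzzle miners solve, here is a toy proof-of-work sketch, heavily simplified compared with Bitcoin's real block format: find a nonce whose hash falls below a target. Grover's algorithm would cut the expected number of hash evaluations only quadratically.

```python
import hashlib
from itertools import count

def mine(block_header: bytes, difficulty_bits: int) -> int:
    """Toy proof-of-work: find a nonce so that SHA-256(header || nonce)
    starts with `difficulty_bits` zero bits."""
    target = 2 ** (256 - difficulty_bits)
    for nonce in count():
        digest = hashlib.sha256(block_header + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

nonce = mine(b"block data", difficulty_bits=20)   # ~2^20 hashes on average
print(f"found nonce {nonce}")
```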
*Summary table of estimates for breaking classical cryptography with quantum algorithms, and potential countermeasures (Source: Quantum Computing: Progress and Prospects)*
It can be said that modern cryptography was born in the mid-70s, with algorithms such as DES for symmetric encryption and Diffie-Hellman for key establishment. Since then, there have been a handful of changeovers from one widely established algorithm to another: from DES and Triple-DES to AES; from MD5 and SHA-1 to the SHA-2 family; from RSA key transport and finite-field Diffie-Hellman to elliptic-curve Diffie-Hellman key exchange; and from RSA and DSA certificates to ECDSA certificates.
Some of these changeovers have been successful: AES can be found almost everywhere, and modern communication protocols mainly use ECDH key exchange. However, the changeovers involving public key infrastructure have met with uneven success: browser vendors and Certification Authorities went through a lengthy, repeatedly delayed transition from SHA-1 to SHA-2 in certificates; the changeover to elliptic-curve certificates has been even slower; and despite it all, most certificates issued for the web still use RSA.
In the medium term, we will likely be forced to go through a new changeover: towards post-quantum public-key cryptography.
First of all, it is important to spell out that 'the end of RSA, DSA and ECDSA' and 'the end of cryptography' are two different things. There is cryptographic life beyond RSA, DSA and ECDSA. Since the mid-80s, there have been cryptographic algorithms based on the difficulty of mathematical problems other than integer factorization and the discrete logarithm. The three best-studied alternatives are:
- Hash-based cryptography: as its name suggests, it uses secure hash functions that resist quantum algorithms. The disadvantage is that it generates relatively long signatures, which limits its use cases (see the sketch after this list). The Leighton-Micali Signature Scheme (LMSS) is one of the strongest candidates to replace RSA and ECDSA.
- Code-based cryptography: coding theory is the branch of mathematics that studies how to encode information. Some codes are very hard to decode, often requiring exponential time even for quantum computers. The best-studied such cryptosystem is McEliece, another strong candidate for key exchange.
- Lattice-based cryptography: probably the most active research field in post-quantum cryptography. A lattice is a discrete set of points in space with the property that the sum of two lattice points is also a lattice point. A hard problem on lattices is the Shortest Vector Problem: finding the shortest nonzero vector in a given lattice. All known classical algorithms for solving it require time that grows exponentially with the lattice dimension, and it is believed that the same will hold for quantum algorithms. Several cryptosystems based on the Shortest Vector Problem already exist.
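To make the first of these alternatives tangible, here is a minimal sketch of a Lamport one-time signature, the simplest hash-based scheme and a conceptual ancestor of LMSS. Note the signature size, 256 secrets of 32 bytes each (8 KB), which illustrates the length drawback mentioned above:

```python
import hashlib
import secrets

H = lambda b: hashlib.sha256(b).digest()

def keygen():
    """256 pairs of random secrets; the public key is their hashes."""
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    pk = [(H(s0), H(s1)) for s0, s1 in sk]
    return sk, pk

def sign(message: bytes, sk):
    """Reveal one secret per message-digest bit. One-time use only!"""
    digest = int.from_bytes(H(message), "big")
    return [sk[i][(digest >> i) & 1] for i in range(256)]

def verify(message: bytes, sig, pk) -> bool:
    digest = int.from_bytes(H(message), "big")
    return all(H(sig[i]) == pk[i][(digest >> i) & 1] for i in range(256))

sk, pk = keygen()
sig = sign(b"hello post-quantum world", sk)
assert verify(b"hello post-quantum world", sig, pk)
assert not verify(b"tampered", sig, pk)
```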
Consequently, the great difficulty is not a lack of alternatives. The painful problem will be the transition to one of the alternatives described above:
- First, post-quantum algorithms must be selected and standardized by institutions such as NIST.
- Then, the standards must be incorporated into the cryptographic libraries used by the most popular programming languages, cryptographic chips and hardware modules.
- Afterwards, they must be integrated into cryptographic standards and protocols such as PKCS#1, TLS, IPsec, etc.
- After that, all vendors must incorporate these standards and protocols into their products, from hardware manufacturers to software developers.
- Once all software and hardware products have been updated, everything must be done over again: reissue all certificates, re-encrypt all stored information, re-sign and redistribute code, get rid of all the old copies, and so on.
How long will this process take? Judging by previous migrations, such as the one from SHA-1 to SHA-2, and factoring in the additional complexity, no less than 20 years. And when are the first quantum computers capable of attacking RSA, DSA and ECDSA expected to be available? Not for at least another 15 years.
This is the current outlook. Let's hope the changeover process gains momentum. Nobody knows for certain how far off quantum computers are. Just in case, though, we had better be prepared.