Big Bad Quantum Computer Revisited

A recent DefenseNews article again put Shor’s algorithm front and center when writing about quantum computing.  Yet there is so much more to this field, and with Lockheed Martin working with D-Wave one would expect this to be an open secret in the defense sector.  At any rate, this is a good enough reason to finally publish the full text of my quantum computing article, which Hakin9 magazine asked me to write for their special issue earlier this year:

Who’s afraid of the big bad Quantum Computer?

“Be afraid; be very afraid, as the next fundamental transition in computing technology will obliterate all your encryption protection.”

If there is any awareness of quantum computing in the wider IT community, then odds are it is this phobia that is driving it.  Peter Shor probably didn’t realize that he was about to pigeonhole the entire research field when he published his work on what is now the best-known quantum algorithm. But once the news spread that he had uncovered a method that could potentially break RSA encryption, the fear factor carried it far and wide. Undoubtedly, if it wasn’t for the press coverage that this news received, quantum information technology research would still be widely considered just another academic curiosity.

So how realistic is this fear, and is breaking codes the only thing a quantum computer is good for?  This article is an attempt to separate fact from fiction.

First let’s review how key exchange protocols that underlie most modern public key encryption schemes accomplish their task. A good analogy that illustrates the key attribute that quantum computing jeopardizes is shown in the following diagram (image courtesy of Wikipedia):

[Figure: Diffie-Hellman key exchange illustrated with paint colors]

Let’s assume we want to establish a common secret color shared by two individuals, Alice and Bob – in this example not a primary color, but one that can be produced as a mix of three others. The scheme assumes that there exists a common first paint component that our odd couple has already agreed on. The next component is a secret private color. This color is not shared with anybody. What happens next is the stroke of genius, the secret sauce that makes public key exchange possible. In our toy example it corresponds to mixing the secret private color with the public one. As everybody has probably known since kindergarten, it is easy to mix colors, but not so easy – try practically impossible – to unmix them. From a physics standpoint the underlying issue is that entropy increases massively when the colors are mixed. Nature drives the process towards the mixed state, but makes it very costly to reverse the mixing. Hence, in thermodynamics, such processes are called “irreversible”.

This descent into the physics of our toy example may seem a rather pointless digression, but we will see later that in the context of quantum information processing, this will actually become very relevant.

But first let’s get back to Alice and Bob.  They can now publicly exchange their mixed colors, safe in the knowledge that there are myriad ways to arrive at this particular shade of paint, and that nobody has much of a chance of guessing their particular components.

Since in the world of this example nobody has any concept of chromatics, a potential eavesdropper – even one who knows the common color – would still be unable to discern the secret ones, as the publicly exchanged color shades cannot be unmixed.

In the final step, Alice and Bob each recover a common color by adding their private secret component to the mix they received. This is a secret that they now share to the exclusion of everybody else.

So how does this relate to the actual public key exchange protocol?  We get there by substituting numbers for the colors, say x for the common component and y for Alice’s secret one. The mixing of colors corresponds to a mathematical function G(x,y).  In this simplified illustration the private secret numbers are picked from the set of prime numbers and the function G is simply a multiplication, exploiting the fact that integer factorization of large numbers is a very costly process (the actual Diffie-Hellman protocol instead uses modular exponentiation as the one-way “mixing” function, but the principle is the same). The next diagram depicts the exact same process, just mapped onto numbers in this way.

[Figure: the same key exchange, mapped onto numbers]
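For readers who prefer code to diagrams, here is a minimal sketch of the exchange in Python. It uses modular exponentiation (the actual Diffie-Hellman mixing function) rather than plain multiplication, and the toy parameters are chosen for illustration only – far too small for real-world security:

```python
import secrets

# Toy Diffie-Hellman sketch. The prime p and generator g are public,
# playing the role of the "common color". Illustrative parameters only.
p = 2**64 - 59   # a prime, but far too small for real security
g = 5            # public generator

a = secrets.randbelow(p - 2) + 1   # Alice's private "color"
b = secrets.randbelow(p - 2) + 1   # Bob's private "color"

A = pow(g, a, p)   # Alice's public mix: easy to compute forward...
B = pow(g, b, p)   # ...but recovering b from B is the discrete log problem

# Each party mixes their own secret into the other's public value:
shared_alice = pow(B, a, p)
shared_bob   = pow(A, b, p)
assert shared_alice == shared_bob   # both arrive at the same shared secret
```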

If this simple method is used with sufficiently large prime numbers, then the shared key is indeed quite safe, as there is no known efficient classical algorithm that allows for reasonably fast integer factorization.  Of course “reasonably fast” is a very fuzzy term, so let’s be a bit more specific: there is no known classical algorithm whose running time scales polynomially with the number of digits of the integer. For instance, an effort to crack a 232-digit number (RSA-768) that concluded in 2009 took the combined CPU power of hundreds of machines (Intel Core2 equivalents) over two years to accomplish.

And this is where the quantum computing bogeyman comes into the picture, along with the aforementioned Peter Shor.  This affable MIT researcher formulated a quantum algorithm almost twenty years ago that can factorize integers in polynomial time on the, as of yet, elusive quantum hardware. So what difference would that actually make?  The following graph puts this into perspective:

[Figure: scaling comparison of quantum versus classical factoring]
Scaling of Shor’s algorithm, roughly $z^3$ (red curve), versus the best known classical approach, the general number field sieve, which scales as $\exp\left( \frac{4}{3^{2/3}} \, z^{1/3} (\ln z)^{2/3} \right)$ (green curve). Here z stands for the logarithm of the size of the integer, i.e. essentially its number of digits. The classical curve appears almost vertical on this scale because the number of necessary steps (y-axis) grows explosively with the size of the integer. Shor’s algorithm, in comparison, shows a fairly well behaved slope with increasing integer sizes, making it, in principle, a practical method for factoring large numbers.
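To make the difference concrete, the following sketch evaluates both growth rates for a few common key sizes (the step counts are unitless and merely illustrate the scaling behavior):

```python
import math

def shor_steps(bits):
    # Shor's algorithm: polynomial, roughly z^3 with z the bit length
    return bits ** 3

def gnfs_steps(bits):
    # General number field sieve: exp((64/9)^(1/3) * z^(1/3) * (ln z)^(2/3))
    # with z = ln N -- sub-exponential, but vastly super-polynomial
    z = bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * z ** (1 / 3) * math.log(z) ** (2 / 3))

for bits in (256, 768, 2048):
    print(f"{bits:5d} bits: Shor ~ {shor_steps(bits):.1e} steps, "
          f"classical ~ {gnfs_steps(bits):.1e} steps")
```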

And that is why common encryption schemes such as RSA could not protect against a deciphering attack if a suitable quantum computer were to be utilized.  So now that commercial quantum computing devices such as the D-Wave One are on the market, where does that leave our cryptographic security?

First off: not all quantum computers are created equal. There are universal gate-based ones, which are theoretically probably the best understood – a textbook on the matter will usually introduce the subject from this vantage point.  But there are also quantum simulators, topological designs and adiabatic machines (I will forgo quantum cellular automata in this article).  The only commercially available machine, the D-Wave One, belongs to the last category but is not a universal machine, in that it cannot simulate an arbitrary Hamiltonian (the term for the energy function that governs a quantum system).  Essentially this machine is a super-fast and accurate solver for just one class of problems: finding the minimum of the kind of energy function that was first written down to describe solid state magnets, in what is now called the Ising model.
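For the curious, this is roughly what such an Ising energy function looks like. The sketch below brute-forces the ground state of a tiny instance with made-up couplings and fields – the very search that D-Wave’s hardware performs natively:

```python
import itertools
import random

# Toy Ising model: E(s) = -sum_{i<j} J_ij s_i s_j - sum_i h_i s_i,
# with spins s_i in {-1, +1}. Couplings J and fields h are made up here.
n = 8
J = {(i, j): random.choice([-1.0, 1.0]) for i in range(n) for j in range(i + 1, n)}
h = [random.uniform(-1, 1) for _ in range(n)]

def energy(s):
    coupling = sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())
    field = sum(hi * si for hi, si in zip(h, s))
    return -coupling - field

# Brute force over all 2^n spin configurations -- feasible only for tiny n,
# which is precisely why dedicated hardware for this search is interesting.
ground_state = min(itertools.product([-1, 1], repeat=n), key=energy)
print(ground_state, energy(ground_state))
```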

But fear not: the D-Wave machine is not suitable for Shor’s algorithm.  The latter requires a gate-programmable device (or a universal adiabatic machine) that provides plenty of qubits.  The D-Wave One falls short on both counts: it has a special purpose adiabatic quantum chip with 128 qubits. Even if the architecture were compatible with Shor’s algorithm, the number of qubits falls far short. If N is the number we want to factorize, the algorithm’s input register has to hold values up to about the square of that number, i.e. for an n-bit integer a bit more than 2n qubits, plus another n or so for the output register.  Since the integers we are interested in are pretty large, this is far outside anything that can be realized at this point.  For instance, for the 768-bit RSA-768 challenge mentioned earlier, well over 2,000 qubits would be required.
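To make this more tangible, here is a hedged sketch of the classical scaffolding around Shor’s algorithm: factoring N reduces to finding the period r of a^x mod N, and it is only that period-finding step that the quantum hardware accelerates. Below, the quantum subroutine is replaced by a naive classical loop, which of course scales exponentially:

```python
import math
import random

def find_period(a, N):
    # Stand-in for the quantum heart of Shor's algorithm: the smallest r > 0
    # with a^r = 1 (mod N). This classical loop scales exponentially with the
    # size of N; the quantum Fourier transform finds r in polynomial time.
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_factor(N):
    # Classical reduction: factoring N reduces to order finding.
    while True:
        a = random.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:
            return g                      # lucky guess already shares a factor
        r = find_period(a, N)
        if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
            return math.gcd(pow(a, r // 2, N) - 1, N)

print(shor_factor(15))   # prints 3 or 5
```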

So you may wonder: what good is this vanguard of the coming quantum revolution if it can’t even handle the most famous quantum algorithm? To answer this, let’s step back and look at what motivated the research into quantum computing to begin with. It wasn’t the hunt for new, more powerful algorithms but rather the insight, first formulated by Richard Feynman, that quantum mechanical systems cannot be efficiently simulated on classical hardware.  This is, of course, a serious impediment, as our entire science-driven civilization depends on exploiting quantum mechanical effects.  I am not even referring to the obvious culprits, such as semiconductor-based electronics and laser technology, but to the more mundane chemical industry.  Everybody will probably recall the Styrofoam models of orbitals and of simple molecules such as benzene (C6H6):

[Figure: representations of the benzene molecule and its orbitals]

As the graphic illustrates, we know that sp2 hybrid orbitals facilitate the binding with the hydrogen atoms, and that a delocalized π electron cloud is formed from the overlapping pz orbitals.  Yet these insights are inferred (and nowadays, thanks to scanning electron microscopy, also measured) – they do not flow from an exact solution of the Schrödinger equation that governs the physics of this kind of molecule.

Granted, multi-body problems don’t have exact solutions in the classical realm either, but the corresponding equations are well behaved when it comes to numerical simulation.  The Schrödinger equation that rules quantum mechanical systems, on the other hand, is not. Simple scenarios are still within reach for classical computing, but not so larger molecules (i.e. the kind that biological processes typically employ). Things get even worse when one wants to go further and model electrodynamics at the quantum level: quantum field theories require a summation over an infinite number of interaction paths – something that will quickly bring any classical computer to its knees. Not a quantum computer, though.
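A quick back-of-the-envelope calculation shows why classical simulation hits this wall: merely storing the state vector of n interacting two-level systems takes memory exponential in n (assuming 16 bytes per complex amplitude):

```python
def state_vector_bytes(n):
    # n two-level systems span a 2^n dimensional Hilbert space; storing one
    # complex amplitude per dimension at 16 bytes each gives 16 * 2^n bytes.
    return 16 * 2 ** n

for n in (10, 30, 50):
    print(f"n = {n:3d}: {state_vector_bytes(n):.3e} bytes")
# n = 30 already needs ~17 GB, n = 50 about 18 petabytes, and well before
# n = 300 the byte count exceeds the number of atoms in the observable universe.
```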

This summer a paper was published (in the June 1st issue of Science) showing that for this new breed of machine a polynomial scaling of these notorious calculations is indeed possible. (As for string theory simulations, the jury is still out on that – but it has been suggested that it should perhaps be considered an indication of an unphysical theory if a particular flavor of string theory cannot be efficiently simulated on a quantum computer.)

Quantum computing therefore has the potential to usher in a new era of chemical and nano-scale engineering, putting an end to the still common practice of blindly testing thousands of substances for pharmaceutical purposes, and finally realizing the vision of designing smart drugs that specifically match targeted receptor proteins.

Of course, even if you can model protein structures, you still need to know which isomer is actually the biologically relevant one. Fortunately, a new technology deploying electron holography is expected to unlock a cornucopia of protein structure data.  But this data will be of limited use if you cannot understand how these proteins fold, and the latter is going to be key to understanding the function of a protein within a living organism.

Unfortunately, simulating protein folding has been shown to be an NP-hard problem. Quantum computing once again comes to the rescue, allowing for a polynomial speed-up of these kinds of calculations.  It is not an exaggeration to expect that in the not too distant future life-saving drug development will be facilitated this way – and the first papers using D-Wave’s machine for exactly this have already been published.

This is just one tiny sliver of the fields that quantum computing will impact.  Just as with the unexpected applications that ever-increasing conventional computing power enabled, it is safe to say that we, in all likelihood, cannot fully anticipate how this technological revolution will affect our lives.  But we can certainly identify some more areas that will immediately benefit from it: artificial intelligence, graph theory, operational research (and its business applications), database design etc. One could easily file another article on each of these topics while only scratching the surface, so the following observations have to be understood as extremely compressed.

It shouldn’t come as a surprise that quantum computing will prove fruitful for artificial intelligence.  After all, another major strand that arguably ignited the entire research field was contemplation of the nature of the human mind.  The prominent mathematician and physicist Roger Penrose, for instance, argued vehemently that the human mind cannot be understood as a classical computer, i.e. he is convinced (almost religious in his certainty) that a Turing machine cannot, even in principle, emulate a human mind.  Since it is not very practical to try to put a human brain into a state of controlled quantum superposition, the next best thing is to think this through for a computer.  This is exactly the kind of thought experiment that David Deutsch discussed in his landmark paper on the topic.  (It was also the first time that a quantum algorithm was introduced – not a very useful one, but it demonstrated that the imagined machine can do some things better than a classical Turing machine.)
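Deutsch’s algorithm is in fact simple enough to simulate with plain linear algebra. The sketch below (using numpy) decides with a single oracle query whether a function f: {0,1} → {0,1} is constant or balanced, a task for which a classical machine needs two queries:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def deutsch(f):
    # Start in |0>|1>; index convention: state[2*q0 + q1]
    state = np.zeros(4)
    state[0b01] = 1.0
    state = np.kron(H, H) @ state              # superposition over both inputs
    # Oracle U_f maps |x, y> to |x, y XOR f(x)>
    U = np.zeros((4, 4))
    for x in (0, 1):
        for y in (0, 1):
            U[(x << 1) | (y ^ f(x)), (x << 1) | y] = 1.0
    state = U @ state
    state = np.kron(H, np.eye(2)) @ state      # interference on the first qubit
    prob_one = state[0b10] ** 2 + state[0b11] ** 2
    return "balanced" if prob_one > 0.5 else "constant"

print(deutsch(lambda x: 0))   # constant function -> "constant"
print(deutsch(lambda x: x))   # balanced function -> "balanced"
```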

So it is only fitting that one of the first demonstrations of D-Wave’s technology concerned the training of an artificial neural net.  This particular application maps nicely onto the structure of their system, as the training is already expressed mathematically as the search for the global minimum of an energy function that depends on several free parameters.  To the extent that an optimization problem can be recast in this way, it becomes a potential candidate to benefit from D-Wave’s quantum computer.  There are many applicable use cases for this in operational research (i.e. logistics, supply chain etc.) and business intelligence.
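As a toy illustration of what “recasting an optimization problem as an energy function” means, here is a tiny QUBO (quadratic unconstrained binary optimization) instance of the general kind an annealer minimizes, solved by brute force in lieu of quantum hardware (the matrix Q is made up):

```python
import itertools

# QUBO: minimize x^T Q x over binary vectors x. The (made-up) matrix Q
# encodes the optimization problem; an annealer searches for the minimizing
# x in hardware, while here we simply brute-force it.
Q = [
    [-3,  2,  0],
    [ 0, -2,  4],
    [ 0,  0, -1],
]

def qubo_energy(x):
    return sum(Q[i][j] * x[i] * x[j]
               for i in range(len(x)) for j in range(len(x)))

best = min(itertools.product([0, 1], repeat=len(Q)), key=qubo_energy)
print(best, qubo_energy(best))   # -> (1, 0, 1) with energy -4
```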

While this is all very exciting, a skeptic will rightfully point out that just knowing that a certain tool can help with a task does not tell us how well it will stack up against conventional methods.  Given the price tag of $10 million, it had better be good.  There are unfortunately not a lot of benchmarks available, but a brute-force search implemented to find some obscure numbers from graph theory (Ramsey numbers) gives an indication that this machine can substitute for considerable conventional computing horsepower, i.e. about 55 MIPS, or the equivalent of a cluster of more than 300 of Intel’s fastest commercially available chips.

Another fascinating aspect, which will factor into the all-important TCO (total cost of ownership) considerations, is that a quantum computer actually requires far less energy to achieve this kind of performance (and its energy consumption varies only minimally under load).  Earlier I described a particular architecture as adiabatic, and it is this term that explains the counterintuitive energy characteristic.  The word originated in thermodynamics and describes a process that progresses without heat exchange: throughout most of the quantum processing there is no heat-producing entropy increase.  At first glance, the huge cooling apparatus that accompanies a quantum computer seems to belie this assertion, but the reason for this considerable cooling technology is not a required continuous protection of the machine from overheating (as in conventional data centers), but the fact that most quantum computing implementations require an environment considerably colder than even the coldest temperature that can be found anywhere in space (the surface of Pluto would be outright balmy in comparison).

Amazingly, commercially available helium cooling systems can these days readily achieve such temperatures close to absolute zero.  After the initial cool-down, the entire remaining cooling effort is employed only to counteract the thermal flow that even the best high-vacuum insulated environments will experience.  The quantum system itself dissipates only a minimal amount of heat when the final result of an algorithm is read out.  That is why the system pulls just 15 kW in total.  This is considerably less than what our hypothetical 300-CPU cluster would consume under load: at more than 100 W per node, that is upwards of 30 kW, more than double D-Wave’s power consumption.  And the best part: the cooling system, and hence the power consumption, will remain the same for each new iteration of chips. D-Wave recently introduced its new 512-qubit Vesuvius chip, and has so far steadily followed its own version of Moore’s law, doubling integration about every 15 months.

So although D-Wave’s currently available quantum computing technology can implement neither Shor’s algorithm nor the second most famous one, Grover’s search over an unstructured list, the capabilities it delivers are nothing to scoff at. With heavyweights like IBM pouring considerable R&D resources into this technology, fully universal quantum processors will hit the market much earlier than most IT analysts (such as Gartner) currently project. Recently IBM demoed a universal 4-qubit chip (interestingly using the same superconducting foundry approach as D-Wave). If they were to also manage a doubling of their integration density every 18 months, then we’d be looking at 256-qubit chips within about nine years.

While current RSA implementations are not yet in jeopardy, this key exchange protocol is slowly reaching its end of life.  So how best to mitigate future quantum computing attacks on the key exchange?  The most straightforward approach is simply to use a different “color-mixing” function than integer multiplication, i.e. a function that even a quantum computer cannot unravel within a polynomial time frame. This is an active field of research, but so far no consensus on a suitable post-quantum key exchange function has emerged. At least it is well established that most current symmetric crypto (ciphers and hash functions) can be considered safe from the looming threat.

As to key exchange, the ultimate solution may also be provided by quantum mechanics itself, in the form of quantum cryptography, which in principle allows a key to be transferred in such a manner that any eavesdropping is detectable. To prove that this technology can be scaled up to global, intercontinental communication, the current record holder for the longest distance of quantum teleportation, the Chinese physicist Juan Yin, plans to repeat the feat in space, opening up the prospect of ultra-secure communication around the world.  Welcome to the future.
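A minimal simulation conveys the idea behind such a quantum key distribution scheme. This sketch follows the well-known BB84 protocol, with the quantum channel replaced by its statistical behavior:

```python
import random

n = 1000
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # "+": rectilinear, "x": diagonal
bob_bases   = [random.choice("+x") for _ in range(n)]

def measure(bit, prep_basis, meas_basis):
    # Matching basis: the bit comes through intact.
    # Mismatched basis: quantum mechanics yields a 50/50 random outcome.
    return bit if prep_basis == meas_basis else random.randint(0, 1)

bob_bits = [measure(b, pa, pb)
            for b, pa, pb in zip(alice_bits, alice_bases, bob_bases)]

# Alice and Bob publicly compare bases (never bits) and keep matching rounds.
sifted_alice = [b for b, pa, pb in zip(alice_bits, alice_bases, bob_bases) if pa == pb]
sifted_bob   = [b for b, pa, pb in zip(bob_bits,  alice_bases, bob_bases) if pa == pb]
assert sifted_alice == sifted_bob   # with no eavesdropper the sifted keys agree

# An eavesdropper measuring in random bases would disturb about 25% of the
# sifted bits, which Alice and Bob can detect by sacrificing a sample of them.
```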

4 thoughts on “Big Bad Quantum Computer Revisited”

  1. The article reminded me of the “Arnold Arnold” scare in the 1980s – that a claimed mathematical advance would wipe out encryption techniques. See here
    http://alturl.com/52spu
    (In case that disappears use google.com/search?hl=en&q=”arnold arnold” factorization)

    The idea that encryption techniques would become useless overnight is scary both for banking and finance. I wonder if any such organisation has this risk on their radar?

    1. Shor’s algorithm’s notoriety is certainly due to its impact on encryption, and I would expect an IT security expert worth his salt to be aware of this – e.g. Hakin9 is not a science periodical but covers computing security issues exclusively.

      On the other hand most successful security exploits are due to social engineering. We humans are unfortunately so easily fooled that a brute force cryptographic attack is usually not at all required.
