
Quantum Computing Taxonomy

The dodo and analog computers are best pals in the museum aisles of the extinct. Some argue this spares us the trouble of figuring out in what way they might be Turing complete.

Quantum computers are not created equal. In this, they also differ drastically from their common classical counterparts. Since analog computers went the way of the dodo, our current digital devices can all be modeled as Turing machines. This means that from a computational standpoint they are equivalent and cannot reach into the realm of what is now referred to as hyper-computation. David Deutsch's seminal paper that jump-started modern quantum computing research followed this lead by considering a Turing machine under quantum state superposition.

This line of thought was extremely fruitful (e.g. Shor's algorithm) and came to overshadow another approach more in line with Feynman's original idea: a quantum simulator, i.e. a system that is quantum mechanical in nature but can be configured to simulate various scenarios.

This dichotomy came back to haunt D-Wave, and contributed to the harsh rejection by many researchers in the field when the company introduced its impressive machines, which were firmly in the mold of a configurable quantum simulator.

Despite a very different hardware realization, the D-Wave simulator is very much in the same league as the notable architecture developed at NIST (don't get confused by the use of the word crystal in the linked article; what's meant is a grid). Both accomplished scalable devices that solve a generalized Ising model.
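For readers unfamiliar with the term: both machines are, at heart, programmable realizations of the generalized Ising model. In its standard textbook form (my notation, not taken from either group's papers) the energy function to be minimized reads

\[ E(s_1, \dots, s_n) = -\sum_{i<j} J_{ij}\, s_i s_j - \sum_i h_i\, s_i, \qquad s_i \in \{-1, +1\} \]

Programming such a simulator amounts to dialing in the couplings \(J_{ij}\) and the local fields \(h_i\); the device then settles into (or near) the lowest-energy spin configuration.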

Interestingly, the NIST researchers didn't have to face the kind of blow-back that D-Wave had to endure, which just goes to show that sometimes no marketing is the better marketing. These quantum simulator devices are in a sense mature technology and have proven to be scalable. D-Wave's Rose's law is a nice illustration of that (albeit a somewhat exaggerated one). The company managed to double the qubit count on its chips about every 15 months for the last 10 years. On the other hand, the holy grail of quantum computing, a truly universal machine in the mold of Deutsch's Turing model, is harder to come by. There have been impressive, albeit small-scale, demonstrations of Grover's and Shor's algorithms, but with the latter's record achievement standing at the factorization of just 21, it is clear that there is still a long road ahead.

Intriguingly, adiabatic quantum computing, the concept that is so successfully exploited by D-Wave and NIST, also allows for universal computing: an alternative factorization algorithm has been demonstrated to crack this task for the number 143. Yet it is unclear if universal adiabatic quantum computing can achieve the quantum speed-up that is theoretically guaranteed when implementing Shor's algorithm.
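The underlying encoding is easy to convey classically: the factors of N are the global minimum of a cost function such as (N - pq)², which an adiabatic machine represents as the ground state energy of a Hamiltonian. Here is a toy brute-force sketch of that cost landscape (my own illustration of the encoding, not the actual experiment):

```python
# Toy classical illustration of the adiabatic factoring encoding (my own
# sketch, not the actual NMR experiment): the factors of N minimize the
# cost function (N - p*q)**2, which an adiabatic machine encodes as the
# ground state energy of a Hamiltonian.

N = 143

def cost(p, q):
    return (N - p * q) ** 2

# Brute-force the cost landscape over odd candidates (an even number
# cannot divide an odd N); the unique zero sits at the factors.
best = min((cost(p, q), p, q)
           for p in range(3, 13, 2)
           for q in range(p, N // p + 1, 2))
print(best)  # (0, 11, 13): cost vanishes exactly at 11 * 13 = 143
```

On an actual adiabatic device the same cost function is compiled into qubit couplings, and the minimum is found by slow Hamiltonian evolution rather than by enumeration.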

On the other hand, it is a certainty that the experimental set-up used in the record-breaking experiment will not scale. (There are only so many hydrogen nuclei in molecules of 1-bromo-2-chlorobenzene.) There is a solid theoretical framework indicating that universal adiabatic quantum computing may indeed be equivalent to gate-based QC, but then there are also some tentative numerical simulations indicating that this may not pan out (comments and wagers on this issue are most welcome in the comment section).

Universal gate-based quantum computing relies on quantum error correction to achieve scalability, and it is clearly an uphill battle against decoherence. This motivated the topological quantum computing approach, which requires anyons (Majorana quasi-particles) to implement a regime where an ingrained quantum symmetry counteracts decoherence. As the latter have only very recently been experimentally shown to exist, this concept is the furthest from realization.

So, without any claim to completeness, and with the caveat that it is difficult to make predictions, especially about the future, here is my Gartner-inspired quantum computing taxonomy graphic (caveat: it uses the SVG format and may not display in older browsers):

Click on the graph for a larger image. (If you use Google Chrome, the browser sometimes seems to render embedded SVG only after going directly to the image file; please let me know in the comment section if you see any strange SVG effects.) Where a box carries more than one label, the links are only on the text rather than the entire box. Here is how to read the graph: the axes are fairly straightforward; the more mature and scalable the technology, the better. Maturity is measured in estimated time to market. The numbers are pure conjecture on my part (please feel free to submit your own estimates in the comment section). The following legend explains the color and icon choices:

Hardware                         Architecture
Ion Trap                         Adiabatic Ising (Quantum Simulator)
Photonic (laser warning icon)    Adiabatic Universal
SQUIDs                           Gate Based
Other Solid State                Topological

Most architectures can be realized with a variety of very different quantum systems; the latter constitute the hardware of a quantum computer and are listed on the left. The most universal systems (theoretically guaranteed to provide quantum speed-ups) are able to implement quantum circuits such as Shor's algorithm and are depicted in green. The quantum simulator family is kept in blue, and the universal adiabatic architecture in yellow.

Please let me know if there is an effort that you feel should be placed on this chart. I believe there are many that I have missed. To me this is just a starting point and a first draft. I plan to maintain this chart in one way or another to keep track of the various QC efforts, and I will be happy to incorporate input.

After writing this blog entry, a short Google search revealed that R.R. Tucci beat me to the punch in using the taxonomy headline (by a mere four years). So here's a shout-out to his classification of quantum computing software for the common PC (and Mac).

Physics and Math – a Complicated Love Affair

Usually I restrict myself to physics on this blog.  Not because I don't love and adore pure math, but mostly because news from this field leaves most readers in about the same state as captured in this comic.

Take the recent proof of the ABC conjecture, for instance.  By all accounts, Shinichi Mochizuki,  the mathematician who came up with it, is brilliant.  Unfortunately, that means that even experts close to his field have a  hard time following and validating the proof that spans several arcane areas and four pre-print papers.

But I make an exception for the recent news about a new way to solve linear equations (proposed by Prasad Raghavendra). Everybody learned in school how to solve them via Gaussian elimination, which might be the most ubiquitous math skill after simple arithmetic. Some algorithms have been developed to improve on this method, but it is still extremely cool that there is a new algorithm for such a fundamental mathematical problem with such widespread applications.

And because this concerns an algorithm, there is even pseudo and real code to play with.

What is especially interesting about this new method is that it employs randomness (a resource that has been shown many times to allow for more efficient algorithms). But there is a catch: this approach is only applicable to linear equation systems over finite fields (so applying it to QC simulation won't work). Fields are commutative rings (i.e. sets equipped with two binary operations, addition and multiplication) in which every non-zero element has a multiplicative inverse; the integers mod 4, for example, form a ring but not a field, since 2 has no multiplicative inverse there.

This finite field constraint may again seem to push this onto a more esoteric plane, but fortunately these structured sets are anything but. This becomes clear when looking at the simplest finite field, which consists of just the two members 0 and 1: take the logical operations XOR and AND as addition and multiplication - et voila - if you've been exposed to bit-wise coding you're intimately familiar with the smallest finite field GF(2). GF stands for Galois Field, another name for the same thing. (The number denotes how many elements the field has, i.e. its cardinality.)

XOR (addition)          AND (multiplication)
Input    Output         Input    Output
0 0      0              0 0      0
0 1      1              0 1      0
1 0      1              1 0      0
1 1      0              1 1      1
These familiar logic gate operations turn a bit set into the smallest finite field GF(2).
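Tying this back to the linear equation story above: with XOR as addition and AND as multiplication, ordinary Gaussian elimination works verbatim over GF(2). Here is a minimal sketch (my own illustration, plain elimination rather than Raghavendra's randomized method):

```python
# Gaussian elimination over GF(2): addition is XOR, multiplication is AND.
# Solves A x = b for x in {0,1}^n, assuming the system has a unique
# solution (i.e. A has full rank); otherwise the pivot search fails.
# (A minimal sketch for illustration, not Raghavendra's algorithm.)

def solve_gf2(A, b):
    n = len(A)
    # Work on the augmented matrix [A | b].
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        # Find a pivot row with a 1 in this column and swap it into place.
        pivot = next(r for r in range(col, n) if M[r][col] == 1)
        M[col], M[pivot] = M[pivot], M[col]
        # Clear the 1s in this column from every other row via XOR.
        for r in range(n):
            if r != col and M[r][col] == 1:
                M[r] = [a ^ p for a, p in zip(M[r], M[col])]
    return [M[r][n] for r in range(n)]

# Example: x0^x1^x2 = 0, x1^x2 = 1, x0^x1 = 1  ->  x = [1, 0, 1]
A = [[1, 1, 1],
     [0, 1, 1],
     [1, 1, 0]]
b = [0, 1, 1]
print(solve_gf2(A, b))
```

Note that over the rationals the elimination step would require dividing by the pivot; over GF(2) the only non-zero pivot is 1, so no division is ever needed.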

It shouldn't come as a surprise that finite fields find application in coding theory and cryptography. There are also two connections to the main topic of this blog. The obvious one is that any randomized algorithm can benefit from the precursors of all quantum information devices, i.e. quantum noise generators (you can buy one here; if you need true randomness and cannot afford one of these, then this service might be for you).

The second connection is more subtle and interesting. Just as for any other calculation device, large universal gate-based quantum computers can only become feasible if there are robust error correction mechanisms in place to deal with inevitable random noise. One of the earliest quantum error correction schemes was published by the veritable Peter Shor. It turns out the basis for the "stabilizer codes" that fix the error on a single qubit is again the Pauli operators, which in turn map to GF(4). This can be generalized to GF(q) for non-binary systems. So it seems this new algorithm could very well find application in this area. A somewhat ironic twist if a randomized algorithm should turn out to be an efficient tool for correcting random qubit errors.
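For the curious, the mapping is easy to state (a standard construction from the coding-theory literature, sketched here from memory rather than from Shor's paper): identify I, X, Z, Y with the four elements 0, 1, ω, ω² of GF(4); multiplying Paulis, with phases ignored, then corresponds to adding in GF(4):

```python
# Sketch of the Pauli <-> GF(4) correspondence behind stabilizer codes
# (a standard construction, reproduced from memory): encode each
# single-qubit Pauli as a bit pair (x, z). Multiplying Paulis, phases
# ignored, is then bitwise XOR, i.e. addition in GF(4), where
# I ~ 0, X ~ 1, Z ~ w, Y ~ w^2 (w a primitive element, w^2 = w + 1).

PAULI_TO_BITS = {'I': (0, 0), 'X': (1, 0), 'Z': (0, 1), 'Y': (1, 1)}
BITS_TO_PAULI = {v: k for k, v in PAULI_TO_BITS.items()}

def pauli_product(a, b):
    """Product of two Paulis up to a global phase: XOR the bit pairs."""
    (ax, az), (bx, bz) = PAULI_TO_BITS[a], PAULI_TO_BITS[b]
    return BITS_TO_PAULI[(ax ^ bx, az ^ bz)]

print(pauli_product('X', 'Z'))  # -> Y, matching 1 + w = w^2 in GF(4)
print(pauli_product('Y', 'Y'))  # -> I, every element is its own negative
```

Errors on n qubits thus become vectors over GF(4), which is exactly the setting where a fast linear system solver over finite fields could earn its keep.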