Quantum Computing Taxonomy

The dodo and analog computers are best pals in the museum aisles of the extinct. Some argue this spares us the trouble of figuring out in what way the latter might be Turing complete.

Quantum computers are not created equal. In this, they also differ drastically from their common classical counterparts. Since analog computers went the way of the dodo, our current digital devices can all be modeled as Turing machines. This means that from a computational standpoint they are equivalent and cannot venture into the realm of what is now referred to as hyper-computation. David Deutsch's seminal paper that jump-started modern quantum computing research followed this lead by considering a Turing machine under quantum state superposition.

This line of thought proved extremely fruitful (e.g. Shor's algorithm) and came to overshadow another approach more in line with Feynman's original idea: the quantum simulator, i.e. a system that is quantum mechanical in nature but can be configured to simulate various scenarios.

This dichotomy came back to haunt D-Wave and contributed to the harsh rejection by many researchers in the field when the company introduced its impressive machines, which are firmly in the mold of a configurable quantum simulator.

Despite a very different hardware realization, the D-Wave simulator is very much in the same league as the notable architecture developed at NIST (don't be confused by the use of the word "crystal" in the linked article; what's meant is a grid). Both are scalable devices built to solve a generalized Ising model.
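For readers who haven't run into it, the Ising model amounts to minimizing an energy E(s) = Σ J_ij·s_i·s_j + Σ h_i·s_i over spins s_i = ±1, with the couplings J and fields h encoding the problem. Below is a minimal Python sketch of my own (it has nothing to do with either machine's actual programming interface, and the toy instance is made up) that brute-forces a tiny example, just to make the optimization problem concrete:

```python
import itertools

def ising_energy(spins, J, h):
    """Energy of one spin configuration under couplings J and local fields h."""
    energy = sum(h[i] * s for i, s in enumerate(spins))
    energy += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return energy

def brute_force_ground_state(n, J, h):
    """Exhaustively check all 2^n spin configurations -- illustration only."""
    return min(
        ((spins, ising_energy(spins, J, h))
         for spins in itertools.product((-1, 1), repeat=n)),
        key=lambda pair: pair[1],
    )

# Toy instance: three anti-ferromagnetically coupled spins on a triangle,
# a frustrated problem where not all couplings can be satisfied at once.
J = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
h = [0.0, 0.0, 0.0]
print(brute_force_ground_state(3, J, h))
```

The point of the hardware, of course, is that brute force becomes hopeless beyond a few dozen spins, whereas the physical system relaxes toward low-energy configurations on its own.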

Interestingly, the NIST researchers didn't have to face the kind of blow-back that D-Wave had to endure, which just goes to show that sometimes no marketing is the better marketing. These quantum simulator devices are in a sense mature technology and have proven to be scalable. D-Wave's Rose's law is a nice illustration of that (albeit a somewhat exaggerated one): the company has managed to double the qubit count on its chips about every 15 months for the last 10 years. On the other hand, the holy grail of quantum computing, a truly universal machine in the mold of Deutsch's Turing model, is harder to come by. There have been impressive, albeit small-scale, demonstrations of Grover's and Shor's algorithms, but with the latter's record achievement standing at the factorization of just 21, it is clear that there is still a long road ahead.
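To put that doubling rate in perspective, here is a quick back-of-the-envelope projection (the starting qubit count and time horizon below are purely illustrative assumptions, not D-Wave figures):

```python
def rose_law_qubits(initial_qubits, months, doubling_period_months=15):
    """Naive 'Rose's law' projection: the qubit count doubles every period."""
    return initial_qubits * 2 ** (months / doubling_period_months)

# Illustrative numbers only: ten years at a 15-month doubling period means
# 120 / 15 = 8 doublings, i.e. a 256-fold increase.
print(rose_law_qubits(128, 120))  # 128 qubits -> 32768.0
```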

Intriguingly, adiabatic quantum computing, the concept so successfully exploited by D-Wave and NIST, also allows for universal computing: an alternative factorization algorithm has been demonstrated to crack this task for the number 143. Yet it is unclear whether universal adiabatic quantum computing can achieve the quantum speed-up that is theoretically guaranteed when implementing Shor's algorithm.

On the other hand, it is a certainty that the experimental set-up used in the record-breaking experiment will not scale (there are only so many hydrogen nuclei in molecules of 1-bromo-2-chlorobenzene). There is a solid theoretical framework indicating that universal adiabatic quantum computing may indeed be equivalent to gate-based QC, but there are also some tentative numerical simulations indicating that this may not pan out (comments and wagers on this issue are most welcome in the comment section).

Universal gate-based quantum computing relies on quantum error correction to achieve scalability, and it is clearly an uphill battle against decoherence. This motivated the topological quantum computing approach, which requires anyons (Majorana quasi-particles) to implement a regime where an ingrained quantum symmetry counteracts decoherence. As the latter have only very recently been shown to exist experimentally, this concept is the furthest from realization.

So without any claim to completeness, and with the caveat that it is difficult to make predictions, especially about the future, here is my Gartner-inspired quantum computing taxonomy graphic (note: it uses the SVG format and may not display in older browsers):

Click on the graph for a larger image. (If you use Google Chrome, the browser sometimes seems to render embedded SVG only after going directly to the image file; please let me know in the comment section if you see any strange SVG effects.) If there is more than one label, the links are only on the text rather than the entire box. Here is how to read the graph: the axes are fairly straightforward, the more mature and scalable the technology the better. Maturity is measured in estimated time to market. The numbers are pure conjecture on my part (please feel free to submit your own estimates in the comment section). The following legend explains the color and icon choices:

Hardware                      | Architecture
Ion Trap                      | Adiabatic Ising (Quantum Simulator)
Photonic (laser warning icon) | Adiabatic Universal
SQUIDs                        | Gate Based
Other Solid State             | Topological

Most architectures can be realized with a variety of very different quantum systems; the latter constitute the hardware and are listed on the left. The most universal systems (theoretically guaranteed to provide quantum speed-ups) are able to implement quantum circuits such as Shor's algorithm and are depicted in green. The quantum simulator family is shown in blue, and the universal adiabatic architecture in yellow.

Please let me know if there is an effort that you feel should be placed on this chart. I believe there are many that I have missed penciling in. To me this is just a starting point and a first draft. I plan to maintain this chart in one way or another to keep track of the various QC efforts, and I will be happy to incorporate input.

After writing this blog entry, a quick Google search revealed that R.R. Tucci beat me to the punch in using the taxonomy headline (by a mere four years). So here's a shout-out to his classification of quantum computing software for the common PC (and Mac).

8 thoughts on “Quantum Computing Taxonomy”

  1. I like the graphs, especially the hardware-architecture one. It is a good summary of potential QCs!

    1. Thank you! I hope arranging the various approaches along these two dimensions can clarify the development somewhat. I am sure there are still quite a few QC efforts out there that I am not aware of and that should be on the graph.

      I take it the graph displays properly in your browser? I find that Google Chrome sometimes doesn't render the icons for the simulation and photonics (i.e. the laser warning). SVG has been around for a long time now, but just as with MathML, it seems browser support is still anything but solid.

  2. My comment here may be a bit funny as my knowledge of QC is very basic, but I am curious about the physical and materials science aspects of QC and what its parts (the hard drive? CPU? etc.) would look like in practice. I saw a picture of a chip (by D-Wave) in the Wikipedia entry for QC and did not understand how it works. Are you aware of any books or articles discussing the material aspects of a QC?

    P.S.: The graphs look good in Safari and IE.

    1. My intent with this blog is to reach beyond the QC “in-crowd”, so by all means, all pertinent questions are more than welcome (I would have answered earlier if my day job and family hadn't kept me so busy).

      There is one defining peculiarity of QCs that makes any hardware fundamentally different from normal computers, and that is the fickle nature of the qubits: a hard drive for them is out of the question because of the no-cloning theorem. So in essence a QC will always be set up following instructions stored as a conventional data file (that is also how the D-Wave One currently operates). At best you could have a quantum network that allows for the prepping of qubits in one place, with the real thing then quantum-teleported to be processed by a QC. In that manner you can also have a kind of qubit bus architecture, something that led UCSB to claim a “von Neumann” QC architecture. I find this to be a rather poor choice of words, as it really just adds to the confusion about how fundamentally different qubits are.
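      To make this concrete, here is a toy sketch of my own (plain Python and JSON, not D-Wave's actual programming interface) of what "instructions stored as a conventional data file" means: everything that configures an annealer-style machine is ordinary classical data, while the quantum state itself can never be filed away like this.

      ```python
      import json

      # Hypothetical problem specification for an annealer-style machine: the
      # couplings and fields that define the Ising problem are ordinary classical
      # data that can be saved, copied and versioned like any other file.
      problem = {
          "h": {"0": 0.0, "1": -0.5, "2": 0.0},        # local fields per qubit
          "J": {"0,1": 1.0, "1,2": 1.0, "0,2": 1.0},   # pairwise couplings
          "num_reads": 100,                            # how many anneal runs to request
      }

      with open("ising_problem.json", "w") as f:
          json.dump(problem, f, indent=2)

      # The qubits' quantum state, by contrast, cannot be stored or copied this
      # way -- that is exactly what the no-cloning theorem forbids.
      ```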

    2. As for QC hardware books I must admit I don’t know of any that really approach the subject from this angle.

      The hardware resources being experimented with differ greatly: trapped ions were, to my knowledge, the first, then there is photonics, and then the SQUID technology, which pioneered an entirely new material class for industrial-scale chip lithography.

      And then of course there is the search for 2D materials that could yield Majorana quasi-particles.

      To understand the D-Wave technology in particular, a good starting point would probably be to read about Josephson junctions (they are my favorite quantum system).

  3. Overall this is a very good article. However, since SQUIDs are solid-state devices, I am not sure why SQUIDs and Solid State are classified as two categories (think about the case of the white horse and the horse). Maybe Solid State should be changed to Other Solid State.

    1. Good point, I corrected this. There are quite a few other ideas out there, e.g. quantum dots, that should be broken out separately as they mature. Right now, with regard to topological quantum computing, I think it is too early to ascertain which approach will make the cut to reliably provide anyon quasi-particles. Yet it will quite certainly be solid state.
