Quantum computers are not created equal. In this they differ drastically from their common classical counterparts. Since analog computers went the way of the dodo, our current digital devices can all be modeled as Turing machines. This means that from a computational standpoint they are equivalent and cannot perform in the realm of what is now referred to as hyper-computation.
David Deutsch's seminal paper that jump-started modern quantum computing research followed this lead by considering a Turing machine under quantum state superposition. This line of thought proved extremely fruitful (e.g. Shor's algorithm) and began to overshadow another approach more in line with Feynman's original idea: a quantum simulator, i.e. a system that is quantum mechanical in nature but can be configured to simulate various scenarios.
This dichotomy came back to haunt D-Wave, and contributed to the harsh rejection by many researchers in the field when the company introduced its impressive machines, which were firmly in the mold of a configurable quantum simulator. Despite a very different hardware realization, the D-Wave simulator is very much in the same league as the notable architecture developed at NIST (don't be confused by the use of the word "crystal" in the linked article; what's meant is a grid). Both accomplished scalable devices that solve a generalized Ising equation (which follows from the notoriously difficult to simulate solid state model of the same name).
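To make concrete what these machines solve: the generalized Ising problem asks for the spin configuration minimizing an energy of the form H(s) = -Σ J_ij s_i s_j - Σ h_i s_i. A brute-force classical sketch (with couplings I made up for illustration) shows why this gets hard quickly - the search space doubles with every added spin:

```python
from itertools import product

def ising_energy(spins, J, h):
    """Energy of a spin configuration under couplings J and local fields h."""
    e = -sum(J[i, j] * spins[i] * spins[j] for (i, j) in J)
    e -= sum(h[i] * spins[i] for i in range(len(spins)))
    return e

def ground_state(n, J, h):
    """Exhaustive search over all 2^n configurations -- fine for tiny n,
    hopeless for the hundreds of qubits on a quantum annealer."""
    return min(product([-1, 1], repeat=n), key=lambda s: ising_energy(s, J, h))

# A toy, frustrated 4-spin instance (couplings are hypothetical):
J = {(0, 1): 1.0, (1, 2): -0.5, (2, 3): 1.0, (0, 3): 0.5}
h = {0: 0.1, 1: 0.0, 2: -0.2, 3: 0.0}
print(ground_state(4, J, h))  # → (1, 1, -1, -1)
```

Note the frustration in the toy instance: no assignment can satisfy all four bonds at once, which is exactly the kind of structure that makes these energy landscapes hard to minimize classically.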
Interestingly, the NIST researchers didn't have to face the kind of blow-back that D-Wave had to endure, which just goes to show that sometimes no marketing is the better marketing.
These quantum simulator devices are in a sense mature technology and have proven to be scalable - D-Wave's Rose's law is a nice illustration of that (albeit somewhat exaggerated). The company managed to double the qubit count on their chips about every 15 months for the last 10 years.
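Doubling every 15 months over 10 years works out to a factor of 2^(120/15) = 2^8 = 256. A throwaway snippet to project qubit counts under that assumption (the starting chip size is hypothetical):

```python
def rose_law(initial_qubits, months, doubling_period=15):
    """Projected qubit count if doubling continues every doubling_period months."""
    return initial_qubits * 2 ** (months / doubling_period)

# e.g. a hypothetical 128-qubit chip, projected ten years (120 months) out:
print(rose_law(128, 120))  # → 32768.0
```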
On the other hand, the holy grail of quantum computing, a truly universal machine in the mould of Deutsch's Turing model, is harder to come by. There have been impressive, albeit small-scale, demonstrations of Grover's and Shor's algorithms, but with the latter's record achievement standing at the factorization of just 21, it is clear that there is still a long road ahead. Intriguingly, adiabatic quantum computing, the concept so successfully exploited by the D-Wave company and NIST, also allows for universal computing: an alternative factorization algorithm has been demonstrated to crack this task for the number 143. Yet it is unclear if universal adiabatic quantum computing can achieve the quantum speed-up that is theoretically guaranteed when implementing Shor's algorithm. On the other hand, it is a certainty that the experimental set-up used in the record-breaking experiment will not scale (there are only so many hydrogen nuclei in molecules of 1-bromo-2-chlorobenzene). There is a solid theoretical framework indicating that universal adiabatic quantum computing may indeed be equivalent to gate-based QC, but then there are also some tentative numerical simulations that indicate this may not pan out (comments and wagers on this issue are most welcome in the comment section).
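To put these factoring records in perspective: the quantum core of Shor's algorithm only finds the multiplicative order r of some a modulo N; the factors then fall out of purely classical gcd arithmetic. For numbers this small the order can be brute-forced, so the whole pipeline can be sketched classically (a toy illustration of the post-processing, emphatically not the quantum algorithm itself):

```python
from math import gcd

def multiplicative_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n) -- brute-forced here;
    finding this period is the step a quantum computer speeds up."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Classical post-processing of Shor's algorithm: factors of n from the order of a."""
    if gcd(a, n) != 1:
        return gcd(a, n), n // gcd(a, n)  # lucky guess already shares a factor
    r = multiplicative_order(a, n)
    if r % 2 == 1:
        return None  # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root: retry with another a
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_classical(21, 2))  # → (7, 3)
```

A laptop dispatches 21 (and 143, for that matter) in microseconds this way, which underscores how far quantum hardware still has to go before the asymptotic speed-up matters.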
Universal gate-based quantum computing relies on quantum error correction to achieve scalability, and it is clearly an uphill battle against decoherence. This motivated the topological quantum computing approach, which requires anyons (Majorana quasi-particles) to implement a regime where an ingrained quantum symmetry counteracts decoherence. As the latter have only very recently been experimentally shown to exist, this concept is the furthest from realization.
So without any claim to completeness, and with the caveat that it is difficult to make predictions, especially about the future, here is my Gartner-inspired quantum computing taxonomy graphic (caveat: it uses the SVG format and may not display in older browsers):
Click on the graph for a larger image. (Also, if you use Google Chrome, the browser sometimes seems to render embedded SVG only after navigating directly to the image file; please let me know in the comment section if you see any strange SVG effects.)
If a box carries more than one label, the links are attached to the text only rather than to the entire box.
Here is how to read the graph: The axes are fairly straightforward - the more mature and scalable the technology, the better. Maturity is measured in estimated time to market. The numbers are pure conjecture on my part (please feel free to submit your own estimates in the comment section). The following legend explains the color and icon choices:
Please let me know if there is an effort that you feel should be placed on this chart; I believe there are many that I missed penciling in. To me this is just a starting point and first draft. I plan to maintain this chart in one way or another to keep track of the various diverse QC efforts, and I will be happy to incorporate input.
After writing this blog entry a short Google search revealed that R.R. Tucci beat me to the punch in using the taxonomy headline (by a mere four years). So here's a shout-out to his classification of quantum computing software for the common PC (and Mac).