If headlines and news articles were all you had to go by when trying to form an opinion about quantum computing, you’d end up with one enormous migraine. For many years now, the coverage has been a constant barrage of conflicting story lines.
For reasons known only to them, science news authors seem to have collectively decided to ignore the fact that there are many competing approaches to quantum computing. This apparent inability to differentiate between architectures and computational models is a constant source of confusion, compounded by the challenge of explaining the conceptual oddities of quantum computing, such as entanglement.
For instance, most authors, even if they already know it is wrong, run with the simplest trope about quantum computing, one that has been repeated ad nauseam: the pretense that these machines can execute every possible calculation within their input scope in parallel. It is hard to imagine a misconception better designed to set up a goalpost that no man-made machine could ever reach. Scott Aaronson is so incensed by this nonsense that it even inspired the title of his new book. It is truly a sorry state of affairs when even Nature apparently cannot find an author who doesn’t fall for it. Elizabeth Gibney’s recent online piece on quantum computing was yet another case in point. It starts off promising, as the subtitle is spot on:
After a 30-year struggle to harness quantum weirdness for computing, physicists finally have their goal in reach.
But then the reader’s mind is again poisoned with this nonsense:
Where a classical computer has to try each combination in turn, a quantum computer could process all those combinations simultaneously — in effect, carrying out calculations on every possible set of input data in parallel.
Part of the problem is that there is no other easy concept a news author can quickly turn to when trying to offer up an explanation that a casual reader can understand while having their mind blown. (‘Wow, every possible combination at the same time!’ It’s like double rainbow all over again.)
Here’s my attempt to remedy this situation: a simple example to illustrate the extended capabilities of quantum computing versus classical machines. The latter are very fast, but when solving a problem such as finding the lowest number in an unordered list, they have to take one stab at it at a time. It is like attacking an abstract problem space the way ancient mariners had to fathom the depth of the sea. (Gauging the depth with a rope in this manner is the original meaning of the word ‘fathom’.)
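To make the fathoming metaphor concrete, here is a minimal Python sketch (my own illustration, not drawn from any quantum computing library) of the classical approach: the machine probes one entry at a time, and it must touch every entry before it can be certain it has found the minimum.

```python
def find_minimum(depths):
    # Classical 'fathoming': probe one entry at a time.
    # For an unordered list, every entry must be inspected once
    # before the minimum is certain -- n probes for n entries.
    lowest = float("inf")
    probes = 0
    for depth in depths:
        probes += 1
        if depth < lowest:
            lowest = depth
    return lowest, probes

seafloor = [42, 17, 93, 8, 56, 31]
minimum, probes = find_minimum(seafloor)
print(f"Lowest point: {minimum}, found after {probes} probes")
```

Parallelizing helps only linearly: ten mariners with ten ropes still probe just one point each per sounding.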
You may argue that having several guys fathoming at the same time will give you a ‘parallelizing’ speed-up, but you would have to be a Luddite to the core to convince yourself that this could ever measure up to echolocation. Just like the latter can perceive data from a larger patch of seafloor, quantum computing can leverage more than just local point data. But this comes at a price: The signal that comes back is not easy to interpret. It depends on the original set-up of the probing signal, and requires subsequent processing.
Like an echolocation system, a quantum computer doesn’t magically probe the entire configuration space. It ‘sees’ more, but it doesn’t provide this information in an immediately useful format.
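To see what ‘not immediately useful’ means, here is a toy numpy simulation (my own sketch, not any vendor’s API). A uniform superposition covers the whole configuration space, but a measurement returns only a single random sample; it takes carefully arranged interference to concentrate probability on the answer.

```python
import numpy as np

n_states = 8  # a 3-qubit register spans 8 basis states

# A uniform superposition assigns equal amplitude to every basis
# state: the register 'covers' the whole configuration space.
amplitudes = np.ones(n_states) / np.sqrt(n_states)

# But measuring returns just ONE state, sampled at random --
# the raw 'echo' is not an immediately useful answer.
probabilities = np.abs(amplitudes) ** 2
outcome = np.random.choice(n_states, p=probabilities)
print(f"Measured basis state: {outcome}")

# A quantum algorithm must first interfere the amplitudes so that
# probability concentrates on states encoding the answer; that
# set-up and post-processing is where the hard work lies.
```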
The real challenge is to construct the process in a way that allows you to actually get the answer to the computational problem you are trying to solve. This is devilishly difficult, which is why there are so few quantum algorithms in existence. There are no simple rules to follow: creating one requires, first and foremost, inspiration, and is as much art as science. That is why, when I learned how Shor’s algorithm worked, I was awed by the inordinate creativity it must have taken to think it up.
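To give a flavor of that creativity, here is a toy Python sketch of Shor’s overall recipe (again my own illustration). The stroke of genius was reducing factoring to finding the period r of a^x mod N, which quantum interference can extract efficiently; below, that quantum subroutine is replaced by a classical brute-force stand-in.

```python
from math import gcd

def find_period(a, N):
    # Stand-in for the quantum heart of Shor's algorithm: on a
    # quantum computer, interference extracts the period r of
    # f(x) = a^x mod N efficiently; here we brute-force it.
    x, value = 1, a % N
    while value != 1:
        x += 1
        value = (value * a) % N
    return x

def shor_factor(N, a=2):
    # Classical scaffolding: reduce factoring to period finding,
    # then recover the factors via greatest common divisors.
    r = find_period(a, N)
    if r % 2 != 0:
        return None  # odd period: retry with a different base a
    half_power = pow(a, r // 2, N)
    if half_power == N - 1:
        return None  # trivial square root: retry with another a
    return gcd(half_power - 1, N), gcd(half_power + 1, N)

print(shor_factor(21))  # -> (7, 3), the 'school-level' example
```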
Regardless, if this were the only problem with Elizabeth Gibney’s article, that would just be par for the course. Yet, while reporting on Google’s efforts to build their own quantum computing chip, she manages to not even mention the other quantum computer Google is involved with, despite D-Wave having published in Nature in 2011 and, just last year, in Nature Communications.
Maybe if she hadn’t completely ignored D-Wave, she might have thought to ask Martinis the most pressing question of all: what kind of chip will he build for Google? Everything indicates that it is yet another quantum annealer, but the quotes in the article make it sound as if he were talking about gate computing:
“It is still possible that nature just won’t allow it to work, but I think we have a decent chance.”
Obviously he cannot possibly be referring to quantum annealing in this context, since that clearly works just fine with fairly large numbers of qubits (as shown in the above-mentioned Nature publication).
The current state of news reporting on quantum computing is beyond frustrating. There is a very real and fascinating race underway for the realization of the first commercially useful universal quantum computer. Will it be adiabatic or the gate model? Are quantum cellular automata still in the running?
But of course in order to report on this, you must first know about these differences. Apparently, when it comes to science news reporting, this is just too much to expect.
The Nature article also contains this little piece of information:
… the best quantum computers in the world are barely able to do school-level problems such as finding the prime factors of the number 21. (Answer: 3 and 7.)
I guess the fact that the answer is provided gives us a hint as to what level of sophistication the author expects from her audience, which in turn must be terribly confused to see a headline such as “New largest number factored on a quantum device is 56,153”.
This is of course not done with Shor’s algorithm but via adiabatic computing (and also involves some sleight of hand, as the algorithm only works for a certain class of numbers and not all integers).
Nevertheless, adiabatic computing seems to have the upper hand when it comes to scaling the problem scope with a limited number of qubits. But the gate model also made some major news last month. Simon’s algorithm, the field’s guinea pig (one of the first algorithms you learn when being introduced to quantum computing), has been demonstrated to provide the theoretically predicted quantum speed-up. This is huge news that was immediately translated into the rather misleading headline “Simon’s algorithm run on quantum computer for the first time—faster than on standard computer”.
Faster in this case means fewer processing iterations rather than less elapsed time, but regardless, having this theoretical prediction confirmed using the fairly recent one-way technique clearly bolsters the case that gate computing can deliver the goods.
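Simon’s problem makes the iteration count tangible. The task is to find a hidden bit string s given an oracle f with f(x) = f(x XOR s): classically you must fish for a collision, which takes on the order of 2^(n/2) oracle queries, while Simon’s quantum algorithm needs only about n. Here is a toy Python sketch (my own, with a made-up oracle) of the classical side of that comparison.

```python
def make_simon_oracle(s, n):
    # Toy Simon oracle: f(x) == f(y) exactly when y == x XOR s.
    # Each pair {x, x XOR s} is mapped to a common label.
    return lambda x: min(x, x ^ s)

def classical_find_s(f, n):
    # Classically, s is only revealed by a collision f(x) == f(y),
    # which takes on the order of 2**(n/2) queries to stumble upon.
    # Simon's quantum algorithm needs only about n oracle queries.
    seen = {}
    for queries, x in enumerate(range(2 ** n), start=1):
        label = f(x)
        if label in seen:
            return seen[label] ^ x, queries
        seen[label] = x

n, s = 6, 0b101101
found, queries = classical_find_s(make_simon_oracle(s, n), n)
print(f"s = {found:0{n}b}, found after {queries} classical queries")
```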
No doubt, the race between the architectures to deliver the first commercial-grade universal quantum computer is on. It is still wide open, and makes for a compelling story. Now, if we could only get somebody to properly report on it.
Your article was about 10^infinity times better than the Nature article.
May have to print that one out and hang it on the wall 🙂
Topical: “Sufficiently advanced incompetence is indistinguishable from malice” — Grey’s Law.
Perhaps you have to be that reporter, Henning…
Appreciate the sentiment, but I wish these journalists were paying more attention. Through networking I get to talk to plenty of researchers in the field, but I am pretty sure nobody will return a cold call from me. If, on the other hand, you can say that you write for Nature, I am pretty sure that’ll make a bit of a difference.
Geordie,
Are D-Wave and the Martinis group sharing knowledge and working together on a daily basis? No need to disclose the subject matter of course, but whether or not you are really working together would be a relatively easy and discreet way to indicate what approach they may be taking. 🙂 Thanks.
Hi Henning: Here is another feather in D-Wave’s cap! Thanks:
http://www.forbes.com/sites/alexknapp/2015/01/29/quantum-computing-company-d-wave-raises-29-million/
Wow! Another round of funding, that really says something. I think the critics are the ones having doubts now! To me, all the support and development at D-Wave goes to show that they are on the right track in taking the engineering approach to making quantum computing a reality! I don’t know why some scientists just don’t get that.
By the way Henning, do you have a separate page for sending you news tips? Not sure if it’s OK to post this way 🙂
Ramsey, no worries, you can just post them to the last open comment thread. (BTW, since I heard this news from various sides and just haven’t found time to blog about it yet, I will forgo my usual h/t attribution on that one.)
Hi Henning: Off-topic! Here is a new article about those “primordial gravitational waves” that suggests that the authors goofed!: http://www.newscientist.com/article/dn26883-leak-suggests-big-bang-find-was-a-dusty-mistake.html