Networkworld.com has an instructive article on the state of quantum computing. While the headline is somewhat sensationalist (though not as much as mine), the article turned out to be well written and solidly sourced. It does a good job of highlighting what D-Wave has on offer while contrasting it with the universal quantum computing approach.
Dr. Suzanne Gildert from D-Wave is quoted extensively, and I think she succeeds in conveying why their product is relevant. It is also instructive that she envisions quantum computing as a kind of co-processing unit for cloud service providers. This very much meshes with my own expectations of where we will see the first wave of commercial applications of quantum computers (no pun intended).
Where the article appears somewhat disjointed is in not making clear that the argument it presents against the rapid availability of quantum computing hardly applies to the approach that D-Wave is taking. This is another testament to how considerably the waters have been muddied as to what quantum computing actually means. Maybe D-Wave would be better served by consistently calling its method “Natural Quantum Computing”, as positioned by Dr. Gildert.
The gist of all the pessimism voiced in the article is the ever-present struggle to preserve the coherence of qubits. While this is the obvious Achilles’ heel of any quantum system, the critical voices quoted in the article seem oddly out of step with the current state of the art. The relatively new approach of cluster-state quantum computation significantly mitigates the decoherence problem. When it was first published in 2005, it prompted David Deutsch to shorten his estimate for the realization of the first universal quantum computer from many decades to within the decade.
These recent research trends in cluster-state (measurement-based) quantum computation were nowhere reflected in the article, whereas ultimate naysayer Artur Ekert of Oxford University was quoted as gleefully stating that “the best outcome of our research in this field would be to discover that we cannot build a quantum computer for some very fundamental reason, then maybe we would learn something new and something profound about the laws of nature.”
It makes you wonder whether Ekert and his fellow faculty member Deutsch, who co-authored a couple of older papers, are still on the same page – or on speaking terms, for that matter.
Update: I should have realized that IT journalism subscribes to the same “he said, she said” style of reporting that makes mainstream media so useless in the US. Ekert’s quote was probably taken out of context just to provide some semblance of faux balance in Networkworld’s article.
I don’t think Ekert is a naysayer, nor do I think he expects quantum mechanics to be refuted.
I think he believes the consensus view within the quantum information community, which is that the possibility of quantum computing in principle is implied by the standard non-relativistic quantum mechanics we have known since the 1930s. Therefore, (a) it should be fairly uncontroversial that quantum computers are possible in principle even if they are very hard to engineer in practice, and (b) if that view is wrong, it would overturn a huge amount of physics that we have assumed for nearly a century, and so would be a revolutionary scientific breakthrough.
In this sense (b) is the “best outcome,” but he is probably saying that without believing that (b) is very likely. For that reason, I wouldn’t call him a naysayer.
I don’t claim to speak for Ekert specifically, but the quoted line is a very common one in our field (almost a cliché), which is why I feel I can speculate about what he meant by it.
Good point – I should know better than to trust lazy journalism. I will update the story accordingly.