Monthly Archives: September 2011

Nascent industry killed from birthing pain?

Networkworld has an instructive article on the state of quantum computing. While the headline is somewhat sensationalist (not as much as mine, though), the article turned out to be well written and solidly sourced. It does a good job of highlighting what D-Wave has on offer while contrasting it with the universal quantum computing approach.

Dr. Suzanne Gildert from D-Wave is quoted extensively and, I think, succeeds in conveying why their product is relevant. It is also instructive that she envisions quantum computing as a kind of co-processing unit for cloud service providers. This very much meshes with my own expectation of where we will see the first wave of commercial quantum computing applications (no pun intended).

Where the article appears somewhat disjointed is in not making clear that the argument it presents against the rapid availability of quantum computing hardly applies to the approach that D-Wave is taking. Another testament to the fact that the waters have been considerably muddied as to what quantum computing actually means. Maybe D-Wave would be better served by consistently calling its method “Natural Quantum Computing”, as positioned by Dr. Gildert.

The gist of all the pessimism voiced in the article is the ever-present struggle to preserve the coherence of qubits. While this is the obvious Achilles' heel of any quantum system, the critical voices quoted in the article seem oddly out of step with the current state of the art. The relatively new approach of cluster-state quantum computation significantly mitigates the decoherence problem. When it was first published in 2005, it prompted David Deutsch to shorten his expected timeline for the realization of the first universal quantum computer from many decades to within the decade.
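To make it a bit more concrete why the cluster-state (a.k.a. measurement-based or one-way) approach changes the decoherence picture, here is a toy NumPy sketch of its elementary step - my own illustration, not anything taken from the article: the entanglement is prepared up front, and the computation itself is then driven entirely by single-qubit measurements, with the measurement outcome only dictating a known Pauli correction.

```python
# Toy sketch (my own illustration, not from the article): the elementary step of
# measurement-based / cluster-state quantum computation, simulated with NumPy.
# A logical qubit |psi> is entangled with a |+> ancilla via CZ; measuring the
# first qubit in a rotated basis leaves H * Rz(-theta) |psi> on the ancilla,
# up to a Pauli-X correction determined by the measurement outcome.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
CZ = np.diag([1, 1, 1, -1])

def Rz(theta):
    return np.diag([1, np.exp(1j * theta)])

def mbqc_step(psi, theta, outcome):
    """Entangle |psi> with |+> via CZ, measure qubit 1 in the basis
    (|0> +/- e^{i*theta}|1>)/sqrt(2), and return the normalized state
    left on qubit 2 for the given outcome (0 -> '+', 1 -> '-')."""
    plus = np.array([1, 1]) / np.sqrt(2)
    state = CZ @ np.kron(psi, plus)            # entangle input with the ancilla
    sign = 1 if outcome == 0 else -1
    bra = np.conj(np.array([1, sign * np.exp(1j * theta)])) / np.sqrt(2)
    out = np.kron(bra, np.eye(2)) @ state      # project qubit 1 onto the outcome
    return out / np.linalg.norm(out)

# Check against the textbook prediction X^s * H * Rz(-theta) |psi> for a random input.
rng = np.random.default_rng(0)
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)
theta = 0.7

for s in (0, 1):
    got = mbqc_step(psi, theta, s)
    expected = np.linalg.matrix_power(X, s) @ H @ Rz(-theta) @ psi
    expected /= np.linalg.norm(expected)
    overlap = abs(np.vdot(expected, got))      # compare up to global phase
    print(f"outcome {s}: overlap with X^s H Rz(-theta)|psi> = {overlap:.6f}")
```

Both outcomes should report an overlap of 1, i.e. single-qubit measurements on a pre-entangled resource are enough to enact the desired rotation; the entangled state only has to survive until it is measured, which is part of why this approach is said to mitigate the decoherence problem.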

These recent research trends in cluster-state/measurement-based quantum computation were nowhere reflected in the article, whereas ultimate naysayer Artur Ekert from Oxford University was quoted as gleefully stating that “the best outcome of our research in this field would be to discover that we cannot build a quantum computer for some very fundamental reason, then maybe we would learn something new and something profound about the laws of nature.”

Makes you wonder if Ekert and his fellow faculty member Deutsch, who co-authored a couple of older papers together, are still on the same page - or on speaking terms, for that matter.

Update: I should have realized that IT journalism subscribes to the same "he said - she said" sort of reporting that makes mainstream media so useless in the US. Ekert's quote is probably taken out of context just to provide some semblance of faux balance in Networkworld's article.

You keep using that word. I do not think it means what you think it means.

UCSB physicists claim to have implemented a von Neumann architecture for quantum computing.

Far be it from me to denigrate what they have achieved, but I have to take issue with the marketing.

They keep using that "von Neumann" word. I do not think it means what they think it means. In a strict sense, a quantum computer cannot possibly implement a von Neumann architecture. Peter Hines, a theoretical computer science researcher, wrote a nice paper about just that question not too long ago.

I think this is an example of a somewhat careless appropriation of a term that has a well-defined meaning in computer science.

To my mind this is rather problematic, because the sloppy adoption of classical CS terms does nothing to illuminate what makes quantum computing fundamentally different and is destined to just add confusion.


The UCSB researchers talk about a "Quantum von Neumann" architecture in their original paper. The "Quantum" unfortunately got dropped in many news releases.