Monthly Archives: January 2013

Quantum Computing Hype Cycle and a Bet of my Own

The year 2013 started turbulently for the quantum computing field, with a valiant effort by long-time skeptic and distinguished experimentalist Michel I. Dyakonov to relegate it to the status of a pathological science akin to cold fusion (he does not use the term in his paper but later stated: “The terms ‘failed, pathological’ are not mine, but the general sense is correct.”).

Scott Aaronson took on this paper in his unique style (it’s a long read but well worth it). There really isn’t much to add to his arguments, but there is another angle that intrigues my inner “armchair psychologist”:  What exactly is it about this field that so provokes some physicists?  Is it that …

  • … Computer Scientists of all people are committing Quantum Mechanics?
  • … these pesky IT nerds have the audacity to actually take the axioms of Quantum Mechanics so seriously as to regard them as a resource for computational engineering?
  • … this rabble of academics is breeding papers at a rabbit’s pace, so that no one can possibly keep up and read them all?
  • … quantum information science turned the EPR paradox on its head and transformed it into something potentially very useful?
  • … this novel science sucks up all sorts of grant money?

The answer is probably all of the above, to some extent.  But this still doesn’t feel quite right.  It seems to me the animosity goes deeper.  Fortunately, Kingsley Jones (whom I greatly admire) blogged about similar sentiments, and he is much more clear-eyed about what is causing them.

It seems to me that the crux of this discomfort stems from the fact that many physicists have long harbored unease with Quantum Mechanics’ intractabilities, which were plastered over with the Copenhagen Interpretation (which caused all sorts of unintended side effects).  It’s really a misnomer; it should have been called the ostrich interpretation, as its mantra was to ignore the inconsistencies and just shut up and calculate.  It is the distinct merit of Quantum Information science to have dragged this skeleton out of the closet and made it dance.

The quantum information scientists are agnostic on the various interpretations, and even joke about it.  Obviously, if you believe there is a truth to be found, there can be only one, but you first need to acknowledge the cognitive dissonance if there’s to be any chance of making progress on this front. (My favorite QM interpretation has been suggested by Ulrich Mohrhoff, and I have yet to find the inspiration to blog about it in a manner that does it justice – ironically, where he thinks of it as an endpoint, I regard it as allowing for a fresh start).

Meanwhile, in the here and now, the first commercial quantum computing device, the D‑Wave One, has to overcome its own challenges (or be relegated to a computing curiosity akin to analog neural VLSI).  2013 will be the year to prove its merits in comparison to conventional hardware. I’ve been in touch with a distinguished academic in the field (not Scott A.) who is convinced that optimization on a single conventional CPU will always outperform the D-Wave machines – even on the next generation chip. So I proposed a bet, albeit not a monetary one: I will gladly ship a gallon of maple syrup to him if he is proven right and our dark horse Canadian trailblazer doesn’t make the finish line. The results should be unambiguous and will be based on published research, but just in case there should be any disagreement, we will settle on Scott Aaronson as a mutually acceptable arbiter.  Scott is blissfully unaware of this, but as he is also the betting kind (the really big ones), I hope he’d be so kind as to help us sort this out if need be. After all, I figure, he will be following the D-Wave performance tests and will by then have formed an informed opinion on the matter.

The year 2013 started off with plenty of QIS drama and may very well turn out to be the crucial one to determine whether the field has crossed the Rubicon.  It’s going to be a fun one.

Peak Copper – No More

An LED lamp suspended from the hair-thin nanotube wires that also supply it with power.

This has been reported all over the Web, but it is just too good to pass up, especially since the year in the Quantum Computing world started on a somewhat more contentious note (more about this in the next blog post).

This news item, on the other hand, deserves to be the first of the new year and is entirely positive:  I was expecting carbon nanotubes to eventually become the material of choice for electric wiring, but I didn’t expect it to happen this soon.  The video embedded below makes the compelling case that a research team at Rice University not only managed to produce wires superior to any metal wire, but at the same time developed a production process that can be readily scaled up.


Copper price development over the last ten years.

Being able to produce this kind of wire at a competitive price will go a long way toward ameliorating one of humanity’s key resource problems: peak copper (a term coined after the more publicized peak oil prognosis). And this resource constraint is anything but theoretical.  Copper prices have increased so much over the last ten years that copper theft has become a serious global problem, one that often endangers lives when critical infrastructure is destroyed.

These new carbon nanotube wires have the potential to substitute for copper wiring in cars, airplanes, and microchips, as well as residential wiring, to name just a few uses.  If the wires are as good as they are made to look in the video, they will be superior to copper wires to such an extent that it will simply be a matter of price for them to be adopted.

This is the kind of science news I like to hear at the beginning of a new year.