Septimana Mirabilis – Major Quantum Information Technology Breakthroughs

Update 4: The award for the funniest photo commentary on this imbroglio goes to Robert Tucci.

Update 3: Congratulations to D-Wave for their recent sale of the D-Wave Two machine to the non-profit Space Research Association – to be used collaboratively by Google and NASA. (h/t Geordie Rose)

Update 2: Scott Aaronson finally weighs in, and as Robert Tucci predicted in the comments, he resumed his sceptical stance.

Update: Link to Catherine McGeoch and Cong Wang’s paper.

D-Wave Cooper-pair states in real space. Now the company that derived its name from this wavefunction makes some waves of its own.

What a week for Quantum Information Science. D-Wave made major news with the announcement of the first peer-reviewed paper to conclusively demonstrate that their machine can drastically outperform conventional hardware. It’s hardly a contest: for the class of optimization problems that the D-Wave machines are designed for, the algorithms executed on the conventional chip didn’t even come close. The D-Wave machine solved some of the tested problems about 3600 times faster than the best conventional algorithm. (I’ll leave it to Gizmodo to not mince words.)
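
For readers unfamiliar with the problem class: the D-Wave hardware minimizes an Ising-type objective (equivalently a QUBO). The toy sketch below, with made-up fields and couplings, is only meant to illustrate what that class of optimization problems looks like; the actual benchmark instances and the chip’s connectivity are described in McGeoch and Wang’s paper.

```python
# Minimal sketch of an Ising objective of the kind annealing hardware minimizes:
#   E(s) = sum_i h_i * s_i + sum_{i<j} J_ij * s_i * s_j,   with s_i in {-1, +1}
# The fields and couplings below are made up for illustration; they are not
# the benchmark instances from the McGeoch/Wang paper.
from itertools import product

h = {0: 0.5, 1: -0.3, 2: 0.2}        # local fields (hypothetical)
J = {(0, 1): -1.0, (1, 2): 0.7}      # pairwise couplings (hypothetical)

def energy(spins):
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

# Exhaustive search is only feasible for toy sizes -- it is exactly this kind
# of search that becomes intractable classically as the problem grows.
best = min(product([-1, 1], repeat=3), key=energy)
print(best, energy(best))
```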

Apparently, my back-of-the-envelope calculation from last year, based on the D-Wave One’s performance on a brute-force calculation of Ramsey numbers, wasn’t completely off. Back then I calculated that the 128-qubit chip performed at the level of about 300 Intel i7 Hex CPU cores (the current test ran on the next-generation 512-qubit chip). So I am now quite confident in my ongoing bet.
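
As a rough consistency check (my own simplifying assumptions, not figures from the paper): if the reported ~3600x factor is taken against a single optimized CPU core, and the classical solver were to scale linearly with core count, break-even lands in the “thousands of cores” range discussed below.

```python
# Back-of-the-envelope consistency check. Assumptions (mine, not the paper's):
# the ~3600x speedup is relative to a single optimized CPU core, and the
# classical solver parallelizes perfectly across cores.
speedup_vs_single_core = 3600
cores_per_cpu = 6                                   # an Intel i7 "Hex" part has six cores

breakeven_cores = speedup_vs_single_core            # perfect linear scaling
breakeven_cpus = breakeven_cores / cores_per_cpu    # ~600 hex-core CPUs

print(f"break-even at ~{breakeven_cores} cores (~{breakeven_cpus:.0f} hex-core CPUs)")
```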

If conventional hardware requires thousands of conventional cores to beat the current D-Wave machine, then the company has certainly entered a stage where its offering becomes attractive to a wider market. Of course, other factors will weigh in when considering total cost of ownership. The biggest hurdle in this regard will be software: to date, any problem you want to tackle the D-Wave way requires dedicated coding for this machine, and at first these skills will be rare and expensive to procure. On the other hand, there are other cost factors working in D-Wave’s favor. Although I haven’t seen power consumption numbers, the adiabatic nature of the chip’s architecture suggests that it will require far less wattage than a massive server farm or conventional supercomputer. Ironically, while the latter operate at normal ambient temperature, they will always require far more cooling effort to keep them at that temperature than the D-Wave chips in their deep-freeze vacuum.

That the current trajectory of our supercomputer power consumption is on an unsustainable path should be obvious by simply glancing at this chart.

Despite these efforts, there are hard efficiency limits for conventional CMOS transistors. (Click the image for the original pingdom.com article.)

D-Wave matures just at the right time to drive a paradigm change, and I hope they will pursue this opportunity aggressively.

But wait, there’s more. This week was remarkable in unveiling yet another major breakthrough for Quantum Information Technology: at Los Alamos National Labs, an Internet-scalable quantum cryptographic network has been operating without a hitch for the last two years. Now there’s an example of research that will “directly benefit the American people” (something that should please Congressman Lamar Smith, the current controversial chairman of the House of Representatives Committee on Science).

Why it took two years for this news to be published is anybody’s guess. Did somebody just flip a switch and then forget about it? More likely, this research was considered classified for some time.

Certainly this also suggests a technology whose time has come. Governmental and enterprise networks have been compromised at increasing rates, even prompting inflammatory talk of ongoing cyber warfare. And while there have been commercial quantum encryption devices on the market for quite some time now, these have been limited to point-to-point connections. Having a protocol that allows the seamless integration of quantum cryptography into the existing network stack raises this to an entirely different level. It is of course no panacea against security breaches, and has been criticized as providing a superfluous illusion of security, since social engineering attacks clearly demonstrate that human users are the weakest link. Nevertheless, I maintain that it has the potential to relegate brute-force attacks to history’s dustbin.

The new quantum protocol uses a typical “hub-and-spoke” topology as illustrated in the following figure and explained in more detail in the original paper.

From “Network-Centric Quantum Communications with Application to Critical Infrastructure Protection” – network topology.
The NQC topology maps well onto those widely encountered in optical fiber networks, and permits a hierarchical trust architecture for a “hub” to act as the trusted authority (TA, “Trent”) to facilitate quantum authenticated key exchange.
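
To make the trusted-authority idea concrete, here is a deliberately simplified sketch of a hub-and-spoke key relay. It is my own illustration of the general trusted-hub concept, not the actual NQC protocol: each spoke shares a fresh QKD-derived pad only with the hub (“Trent”), and Trent hands two spokes a common session key by one-time-padding it under their respective pads. The names and pad sizes below are hypothetical.

```python
# Simplified hub-and-spoke key relay with a trusted authority ("Trent").
# This illustrates the general trusted-hub concept only; it is NOT the actual
# NQC protocol from the Los Alamos paper.
import os

def xor(a: bytes, b: bytes) -> bytes:
    """One-time-pad style XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Pads that QKD would establish between each spoke and the hub (hypothetical).
pad_alice_trent = os.urandom(32)
pad_bob_trent   = os.urandom(32)

# Trent picks a session key for Alice and Bob and relays it under each pad.
session_key = os.urandom(32)
to_alice = xor(session_key, pad_alice_trent)
to_bob   = xor(session_key, pad_bob_trent)

# Each spoke recovers the same session key using only its own pad with Trent.
assert xor(to_alice, pad_alice_trent) == xor(to_bob, pad_bob_trent) == session_key
```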

Another key aspect is the quantum resource employed in the network:

The photonic phase-based qubits typically used in optical fiber QC require interferometric stability and inevitably necessitate bulky and expensive hardware. Instead, for NQC we use polarization qubits, allowing the QC transmitters – referred to as QKarDs – to be miniaturized and fabricated using integrated photonics methods [12]. This opens the door to a manufacturing process with its attendant economy of scale, and ultimately much lower-cost QC hardware.
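
For a sense of what polarization qubits buy you at the protocol level, here is a toy, classically simulated BB84-style sifting step. It is purely illustrative of the textbook idea; the QKarD hardware and the exact protocol Los Alamos uses are described in the paper.

```python
# Toy, classically simulated BB84-style key sifting with polarization qubits
# ('+' = rectilinear basis, 'x' = diagonal basis). Illustrative only; not the
# QKarD implementation from the paper. No eavesdropper or channel noise here.
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

# If Bob measures in the wrong basis, his outcome is random.
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: keep only the positions where the bases happened to match.
sifted = [(a, b) for a, b, ab, bb in zip(alice_bits, bob_bits, alice_bases, bob_bases)
          if ab == bb]
assert all(a == b for a, b in sifted)   # matching bases agree when there is no Eve
print(f"{len(sifted)} sifted key bits out of {n} transmitted qubits")
```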

It will be interesting to observe how quickly this technology will be commercialized, and whether US export restrictions on strong cryptography will hamper its global adoption.

14 thoughts on “Septimana Mirabilis – Major Quantum Information Technology Breakthroughs”

  1. I don’t think Scottie is too happy about the news. If there were justice…or at least more good taste like his…in the world, all that CIA money would be going to him and his friends, instead of being funneled into D-wave.
    Number of times Scott uses the term “D-wave” in his book “Quantum computing since Democritus, but skipping D-wave”
    Zero

    1. Given that he has presumably achieved his career goal as a tenured MIT prof, and is a freshly minted father, I don’t think Scott’s happiness will be much of a function of D-Wave’s performance.

      The expert I have my bet with still sounds rather confident, so I expect his hand-optimized algorithms may be doing better than what has been reported so far. At any rate, Scott will certainly weigh in on his blog eventually (Scott doesn’t know this, but we count on him to be the arbiter of our bet if there is some disagreement; hopefully he’ll humor us if it comes to that).

      Not mentioning D-Wave in his book is odd; the controversy alone makes for a good story. Since he has come around, it seems to me he has been fairly agnostic. Maybe something to stick into a second edition, once all the dust has settled?

      1. Henning, one reason why D-Wave doesn’t appear in my book is simple: because I gave the lectures that turned into the book in 2006, before D-Wave was on anyone’s radar! Having said that, when revising and updating the manuscript last year, I did toy with the idea of adding something about D-Wave. But I decided against, because it just wasn’t relevant to anything I was talking about there. The book is about how quantum computing fits into the broader quest to understand the limits of the knowable, and how quantum mechanics changes (and doesn’t change) notions of computation, proof, etc. that one might have considered a priori. As such, the book doesn’t even really talk about “standard” QC implementation proposals (ion traps, etc), or even about the standard quantum algorithms and how they work, let alone about D-Wave! There are dozens of other books that do discuss those topics, which is wonderful for me, because it meant I felt freer to go off in a different direction.

        1. No worries, I enjoy the book immensely with or without that chapter. Didn’t even think about this until Robert brought it up. It’s certainly a story that needs to be told at some point, but as it is still playing out there’s no hurry.

    2. What I said didn’t convey my meaning too clearly. I should have said:

      I don’t think Scottie is too happy about the news. HE MUST BE THINKING: If there were justice…or at least more good taste like his…in the world, all that CIA money would be going to him and his friends, instead of being funneled into D-wave.
      Number of times Scott uses the term “D-wave” in his book “Quantum computing since Democritus, but skipping D-wave”
      Zero

      1. Theoretical computer science is much cheaper than building hardware, all it takes is a dude and a pencil.

  2. I think it would be a mistake to take Scott as the ultimate arbiter of good taste in quantum computation. He was already pretty wrong about D-wave.

    1. Well, he also showed his capacity to reconsider. Anyhow, this won’t be about taste but about fairly interpreting the rules that we set out (probably won’t be necessary anyway – either he has an algorithm that beats the D-Wave machine or he doesn’t).

  3. I’m a big fan of d-wave, don’t get me wrong, but it’s not clear to me that this benchmarking is particularly relevant one way or the other.

    I’m not saying d-wave isn’t the next leap forward, but this particular publication does not prove (or disprove) anything, except that chip A (which is designed to solve this problem specifically) is faster than chip B (which is designed to solve any type of problem).

    What would be interesting would be for someone to design ASIC chips (not just GPU) which are also utilized alongside carefully optimized software specifically to tackle this subset of problems. Only once d-wave can prove that they can ‘out-scale’ that system (both in terms of silicon and power) can we assume they’ve done something concretely relevant.

    Apparently Matthias Troyer has been working in this space and has been giving talks. Scott will blog about it soon. I’m definitely waiting on his results.

    Let me re-iterate though, I’m very hopeful that Geordie and Friends have done it. But we all need to temper our optimism a little bit. Getting ahead of ourselves just leads to providing fuel to those who want to bring the team down.

    Also, I wanted to bring up a small point. The arXiv paper studying the 128 qubit machine concluded that D-Wave hasn’t built anything that fundamentally alters the algorithmic complexity, but rather just reduces the size of the constants.

    http://arxiv.org/pdf/1304.4595v1.pdf

    “While quantum mechanics is not expected to turn the exponential scaling into a polynomial one, the constants c and a can be smaller on quantum devices, potentially giving substantial speedup over classical algorithms.”

    Has D-Wave claimed otherwise? If all they have done is reduce the constants, then I can see where a carefully optimized system might be able to challenge what they’re doing, even if it doesn’t involve quantum behavior.
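
    To put numbers on that point (the constants below are arbitrary, purely for illustration, not measured data): shrinking c and a shifts the curve and yields a growing speedup, but both runtimes stay exponential.

    ```python
    # With runtimes of the form c * a**n, smaller c and a give a large speedup,
    # yet both curves remain exponential. Constants are arbitrary illustrations.
    def classical(n):            # hypothetical classical solver: c = 1.0, a = 1.30
        return 1.0 * 1.30 ** n

    def quantum(n):              # hypothetical annealer: smaller c = 0.1, a = 1.15
        return 0.1 * 1.15 ** n

    for n in (64, 128, 256, 512):
        # The speedup ratio grows with n, but quantum(n) itself still blows up.
        print(n, classical(n) / quantum(n))
    ```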

    1. all excellent points by blazespinnaker. D-wave still has a lot to prove (like any newfangled company). And gate level QCs may still outshine D-wave in the future. But for now, D-wave is a cool, unique device and we better learn as much physics from it as we can.

  4. That the current trajectory of our supercomputer power consumption is on an unsustainable path should be obvious by simply glancing at this chart.

    I disagree. Your data shows that the power consumption of an arbitrarily large machine roughly doubled in one decade. Budgets have presumably at least doubled for supercomputing resources in the same period. Performance also increased by ~1,000x! Only when you force the data to fit a quadratic curve does it become “unsustainable”. Also, I don’t believe the #1 supercomputer ever used 0 kW.

    1. Yes, the 0 kW is silly (but hey, it ain’t my chart) – maybe there was an unknown super steam-punk Babbage machine released in 2004 that we are not aware of?

      The unsustainable aspect I try to stress (and apparently didn’t do a very good job with) refers to the energy consumption. Yes, the improvements are as impressive as ever when it comes to things governed by Moore’s law, but leakage currents are not going anywhere, and in these days of big data the hunger for processing power easily outpaces what conventional super-computing can deliver with a reasonable CO2 footprint.

Comments are closed.