Update 4: The award for the funniest photo commentary on this imbroglio goes to Robert Tucci.
Update: Link to Catherine McGeoch and Cong Wang’s paper.
What a week for Quantum Information Science. D-Wave made major news with the announcement of the first peer-reviewed paper to conclusively demonstrate that their machine can drastically outperform conventional hardware. It's hardly a contest. For the class of optimization problems that the D-Wave machines are designed for, the algorithms executed on the conventional chip didn't even come close. The D-Wave machine solved some of the tested problems about 3600 times faster than the best conventional algorithm. (I'll leave it to gizmodo to not mince words).
Apparently, my back-of-the-envelope calculation from last year, which was based on the D-Wave One's performance on a brute-force calculation of Ramsey numbers, wasn't completely off. Back then I calculated that the 128 qubit chip performed at the level of about 300 Intel i7 Hex CPU cores (the current test ran on the next generation 512 qubit chip). So, I am now quite confident in my ongoing bet.
If conventional hardware requires thousands of conventional cores to beat the current D-Wave machine, then the company has certainly entered a stage where its offering becomes attractive to a wider market. Of course, other factors will weigh in when considering total cost of ownership. The biggest hurdle in this regard will be software, as to date any problem you want to tackle the D-Wave way requires dedicated coding for this machine. At first these skills will be rare and expensive to procure. On the other hand, there are other cost factors working in D-Wave's favor: Although I haven't seen power consumption numbers, the adiabatic nature of the chip's architecture suggests that it will require far less wattage than a massive server farm or conventional supercomputer. Ironically, while the latter operate at normal ambient temperature, they will always require far more cooling effort to keep them there than the D-Wave chips do in their deep-freeze vacuum.
That the current trajectory of our supercomputer power consumption is on an unsustainable path should be obvious by simply glancing at this chart.
D-Wave matures just at the right time to drive a paradigm change, and I hope they will pursue this opportunity aggressively.
But wait, there’s more. This week was remarkable in unveiling yet another major breakthrough for Quantum Information technology: At Los Alamos National Labs, an Internet-scalable quantum cryptographic network has been operating without a hitch for the last two years. Now there’s an example of research that will “directly benefit the American people” (something that should please Congressman Lamar Smith, the current controversial chairman of the House of Representatives Committee on Science).
Why it took two years for this news to be published is anybody’s guess. Did somebody just flip a switch and then forget about it? More likely, this research was considered classified for some time.
Certainly this also suggests a technology whose time has come. Governmental and enterprise networks have been compromised at increasing rates, even causing inflammatory talk of ongoing cyber warfare. And while there have been commercial quantum encryption devices on the market for quite some time now, these have been limited to point-to-point connections. Having a protocol that allows the seamless integration of quantum cryptography into the existing network stack raises this to an entirely different level. This is of course no panacea against security breaches, and has been criticized as providing superfluous security illusions, since social engineering attacks clearly demonstrate that human users are the weakest link. Nevertheless, I maintain that it has the potential to relegate brute force attacks to history’s dustbin.
The new quantum protocol uses a typical “hub-and-spoke” topology as illustrated in the following figure and explained in more detail in the original paper.
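To give a rough feel for how a hub-and-spoke quantum network can distribute keys, here is a minimal toy sketch of trusted-node key relay, a standard technique in QKD networks: each spoke shares key material with the hub (in the real network this would come from a QKD session; here `os.urandom` stands in for it), and the hub forwards a session key from one spoke to another via one-time-pad XOR. All class and variable names are my own illustration, not anything from the Los Alamos paper.

```python
import os

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """One-time-pad combination of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

class TrustedHub:
    """Hub that shares a separate pad with each spoke node."""
    def __init__(self):
        self.pads = {}  # node name -> shared key material

    def enroll(self, node: str) -> bytes:
        # Stand-in for a QKD session: in a real network this pad would
        # be produced by the quantum channel, not by os.urandom.
        pad = os.urandom(32)
        self.pads[node] = pad
        return pad

    def relay(self, sender: str, receiver: str, ciphertext: bytes) -> bytes:
        # The hub removes the sender's pad and applies the receiver's.
        # It necessarily sees the plaintext session key in between,
        # which is why the hub must be a trusted node.
        session_key = xor_bytes(ciphertext, self.pads[sender])
        return xor_bytes(session_key, self.pads[receiver])

# Alice establishes a session key with Bob via the hub.
hub = TrustedHub()
alice_pad = hub.enroll("alice")
bob_pad = hub.enroll("bob")

session_key = os.urandom(32)
to_hub = xor_bytes(session_key, alice_pad)   # Alice -> hub
to_bob = hub.relay("alice", "bob", to_hub)   # hub -> Bob
bob_key = xor_bytes(to_bob, bob_pad)
assert bob_key == session_key
```

The trade-off this sketch makes visible: the spokes only need cheap transmitters pointed at the hub, but the hub itself must be physically secured, since it handles every relayed key in the clear.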
Another key aspect is the quantum resource employed in the network:
The photonic phase-based qubits typically used in optical fiber QC require interferometric stability and inevitably necessitate bulky and expensive hardware. Instead, for NQC we use polarization qubits, allowing the QC transmitters – referred to as QKarDs – to be miniaturized and fabricated using integrated photonics methods. This opens the door to a manufacturing process with its attendant economy of scale, and ultimately much lower-cost QC hardware.
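For readers unfamiliar with how polarization qubits yield a shared secret key, the following is a toy simulation of BB84-style basis sifting, the textbook protocol underlying most polarization-encoded QKD (the paper's actual protocol details may differ). Alice encodes random bits in randomly chosen polarization bases (rectilinear `+` or diagonal `x`), Bob measures in his own random bases, and both keep only the positions where the bases happened to match; no noise or eavesdropper is modeled.

```python
import random

def bb84_sift(n_pulses: int, seed: int = 0) -> list:
    """Toy BB84 sifting: returns the bits surviving basis comparison."""
    rng = random.Random(seed)
    alice_bits  = [rng.randint(0, 1) for _ in range(n_pulses)]
    alice_bases = [rng.choice("+x") for _ in range(n_pulses)]
    bob_bases   = [rng.choice("+x") for _ in range(n_pulses)]

    sifted = []
    for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases):
        if a_basis == b_basis:
            # Matching basis: Bob's measurement deterministically
            # recovers Alice's bit in this idealized model.
            sifted.append(bit)
        # Mismatched basis: the outcome is random, so it is discarded.
    return sifted

key = bb84_sift(1000)
# On average, roughly half of the pulses survive sifting.
print(len(key))
```

In a real system, a sample of the sifted bits is then compared publicly to estimate the error rate, since an eavesdropper measuring in the wrong basis unavoidably introduces detectable errors.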
It will be interesting to observe how quickly this technology will be commercialized, and whether US export restrictions on strong cryptography will hamper its global adoption.