Update: Thanks to everybody who keeps pointing me to relevant news (Ramsey, Rolf, Sol and everybody else my overtired brain may not recall at this time).
There is no doubt that D-Wave is on a roll:
- They entered another multi-year agreement with Lockheed Martin and upgraded them to the new 2X system.
- The company signed the largest contract yet, to partner with Google, NASA and the Universities Space Research Association over the span of seven years.
- The Los Alamos national laboratory is picking up a D-Wave machine.
And then there’s the countdown to what is billed as a D-Wave related watershed announcement from Google coming Dec 8th. Could this be an early Christmas present to D-Wave investors?
~~~~~~
Back in the day before he re-resigned as D-Wave’s chief critic, Scott Aaronson made a well-reasoned argument as to why he thought this academic, and at times vitriolic, scrutiny was warranted. He argued that a failure of D-Wave to deliver a quantum speed-up would set the field back, similar to the AI winter that was triggered by Marvin Minsky’s Perceptrons book.
Fortunately, quantum annealers are not perceptrons. For the latter, Minsky and Papert rigorously proved that a single-layer perceptron cannot learn anything that is not linearly separable, with XOR as the canonical counter-example. Ironically, at the time the book was published, multilayered perceptrons, a concept that is now fundamental to all deep learning algorithms, were already known, but in the ensuing backlash research funding for those also dried up completely. The term "perceptron" became toxic and all but vanished from the field for decades.
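A minimal numpy sketch (mine, not from the original debate) makes the Minsky–Papert point concrete: no single threshold unit can fit XOR, while one hidden layer solves it exactly.

```python
import numpy as np

def step(x):
    # Heaviside threshold: the perceptron's activation
    return (x > 0).astype(int)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 1, 1, 0])  # XOR

# Single layer: sweep a grid of weights and biases; no linear
# threshold unit classifies XOR better than 3 out of 4 points.
best = 0.0
for w1 in np.linspace(-2, 2, 9):
    for w2 in np.linspace(-2, 2, 9):
        for b in np.linspace(-2, 2, 9):
            acc = np.mean(step(X @ np.array([w1, w2]) + b) == y)
            best = max(best, acc)
print(best)  # 0.75 -- XOR is not linearly separable

# Two layers: hidden units compute OR and AND; the output fires
# when OR is true but AND is false, which is exactly XOR.
W1 = np.array([[1, 1], [1, 1]])
b1 = np.array([-0.5, -1.5])        # thresholds for OR, AND
h = step(X @ W1 + b1)
out = step(h @ np.array([1, -1]) - 0.5)
print(out)  # [0 1 1 0]
```

The grid search is just a compact stand-in for the impossibility proof: the four XOR constraints on any single unit's weights are mutually contradictory, so 0.75 accuracy is the ceiling.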
Could D-Wave be derailed by a proof that shows that quantum annealing could, under no circumstances, deliver a quantum speed-up? To me this seems very unlikely, not only because I expect that no such proof exists, but also because, even if this was the case, there will still be a practical speed-up to be had. If D-Wave manages to double their integration density at the same rapid clip as in the past, then their machines will eventually outperform any classical computing technology in terms of annealing performance. This article (h/t Sol) expands on this point.
So far there is no sign that D-Wave will slow its manic pace. The company recently released its latest chip generation, featuring quantum annealing with an impressive 1000+ qubits (in practice, the usable number will be smaller, as qubits will be consumed by problem encoding and software ECC). This was followed by a detailed test under the leadership of Catherine McGeoch, and it will be interesting to see what Daniel Lidar, and other researchers with access to D‑Wave machines, will find.
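To illustrate what "problem encoding" means here: an annealer minimizes an Ising objective, so a problem first has to be recast as spins, one (or more) qubits per variable. A toy sketch of my own, using number partitioning and brute force in place of actual annealing (a real machine would spend additional qubits embedding the couplings into its hardware graph):

```python
from itertools import product

# Hypothetical toy instance: split these numbers into two groups
# with equal sums. Each number claims at least one qubit.
numbers = [4, 7, 1, 5, 3, 2]

# Ising encoding: spin s_i = +/-1 assigns number a_i to a side;
# the energy (sum_i s_i * a_i)^2 is zero iff the split is perfect.
def energy(spins):
    return sum(s * a for s, a in zip(spins, numbers)) ** 2

# Exhaustively search the 2^6 spin configurations for the ground state
# (this is the search an annealer performs in hardware).
best = min(product([-1, 1], repeat=len(numbers)), key=energy)
side_a = [a for s, a in zip(best, numbers) if s == 1]
side_b = [a for s, a in zip(best, numbers) if s == -1]
print(side_a, side_b, energy(best))
```

The ground state here has energy zero (both sides sum to 11), which is why qubit counts quoted for annealers overstate the size of problem instance they can actually hold.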
My expectation has been from the get-go that D-Wave will accelerate the development of this emerging industry, and attract more money to the field. It seems to me that this is now playing out.
Intel recently (and finally as Robert Tucci points out) entered the fray with a $50M investment. While this is peanuts for a company of Intel’s size, it’s an acknowledgement that they can’t leave the hardware game to Google, IBM or start-ups such as Rigetti.
On the software side, there’s a cottage industry of software start-ups hitching their wagons to the D-Wave engine. Many of these are still in stealth mode, or early stage such as QC Ware, while others are already starting to receive some well-deserved attention.
- QxBranch now has a presence in Hong Kong, in addition to Australia and their Washington, DC HQ, clearly on the path to world domination.
- 1QBit was recently named 2015 Tech Pioneer by the World Economic Forum (kudos to Andrew, Landon and their team!).
Then there are also smaller vendors of established software and services that already have a sophisticated understanding of the need to be quantum ready. The latter is something I expect to see much more of in the coming years as the QC hardware race heats up.
The latest big name entry into the quantum computing arena was Alibaba, but at this time it is not clear what this Chinese initiative will focus on. Microsoft, on the other hand, seems to be a known quantity and will not get aboard the D‑Wave train, but will focus exclusively on quantum gate computing.
Other start-ups, like our artiste-qb.net, straddle the various QC hardware approaches. In our case, this comes “out-of-the-box”, because our core technology, Quantum Bayesian Networks, as developed by Robert Tucci, is an ideal tool to abstract from the underlying architecture. Another start-up that is similarly architecture agnostic is Cambridge QC. The recent news of this company brings to mind that sometimes reality rather quickly imitates satire. While short of the $1B seed round of this April Fool’s spoof, the influx of $50M from the Chile-based Grupo Arcano is an enormous amount for a QC software firm that, as far as I know, holds no patents.
Some astoundingly big bets are now being placed in this field.