The Business Case for D-Wave

A tried and tested success formula for lazy journalism is the build-up and tear-down pattern.

The hype comes before the fall. In the context of information technology, Gartner trademarked the aptly named term “hype cycle”. Every technology starts out in obscurity, but some take off spectacularly, promising the moon but, with the notable exception of the Apollo program, falling a bit short of it. Subsequently, disillusionment sets in, sometimes going as far as vilification of the technology or product. Eventually sentiment hits rock bottom, and a more balanced and realistic view is adopted as the technology goes mainstream.

Even Web technology followed this pattern to some extent, as was clearly mirrored by the dot-com stock bubble. At the height of the exuberance, the web was credited with ushering in a New Economy that would unleash unprecedented productivity growth. By now it has, of course, vastly improved our lives and economies; it just didn’t happen quite as rapidly and profitably as the market originally anticipated.

D‑Wave’s technology will inevitably be subjected to the same roller coaster ride. When you make it to the cover of Time magazine, the spring for the tear-down reaction has been set, waiting for a trigger. The latter came in the form of the testing performed by Matthias Troyer et al. While all of this is to be expected from journalism, I was a bit surprised to see that one of the sites I link to in my blogroll followed this pattern as well. When D‑Wave was widely shunned by academia, Robert R. Tucci wrote quite positively about them, but he now seems to have given up all hope in their architecture in reaction to this one single paper. He makes the typical case against investing in D-Wave that I have seen argued many times by academics invested in the gate model of quantum computing.

The field came to prominence due to the theoretically well-established potential of gate-based quantum computers to outperform classical machines. And it was Shor’s algorithm that captured the public’s imagination (and the NSA’s attention). Factoring is widely considered to be an NP-intermediate problem, and Shor’s algorithm could clearly crack our current encryption schemes if we had gate-based quantum computers with thousands of qubits. Unfortunately, this is still sci-fi, and so the best that has been accomplished on this architecture so far is the factorization of 21. The quantum speed-up would be there if we had the hardware, but at this point it remains purely academic, with no practical relevance whatsoever.

There is little doubt in my mind that a useful gate based quantum computer will be eventually built, just like, for instance, a space elevator. In both cases it is not a matter of ‘if’ but just a matter of ‘when’.

I’d wager we won’t see either within the next ten years.

Incidentally, it has been reported that a space elevator was considered as a Google Labs project, but subsequently thrown out because it would require too many fundamental technological breakthroughs to become a reality. On the other hand, Google snapped up a D-Wave machine.

So is this just a case of acquiring trophy hardware, as some critics on Scott’s blog contended, i.e. nothing more than a marketing gimmick? Or have they been snookered? Maybe they, too, have a gambling addiction problem, like the D-Wave investors as imagined on the qbnets blog?

Of course none of this is the case. It just makes business sense. And this is readily apparent as soon as you let go of the obsession over quantum speed-up.

Let’s just imagine for a second that there was a company with a computing technology that was neither transistor nor semiconductor based. Let’s further assume that within ten years it managed to mature this technology so rapidly that it caught up to current CPUs in terms of raw performance, and that this was all done with chip structures that are orders of magnitude larger than what current conventional hardware deploys. Also, this new technology does not suffer from leakage currents introduced via accidental quantum tunneling, but is actually designed around this effect and puts it to use. Imagine that all this was accomplished with a fraction of the R&D sums spent on conventional chip architectures, and that, since the power consumption scales radically differently from current computers, putting another chip into the box comes nowhere near doubling the energy consumed by the system.

A technology like this sounds a lot like the kind IBM just announced it will focus its research on in trying to find a way to the post-silicon future.

So our 'hypothetical company' sounds pretty impressive, doesn't it? You’d think that a company like Google, with its enormous computational needs, would be very interested in test-driving an early prototype of such a technology. And since all of the above applies to D‑Wave, this is indeed exactly what Google is doing.

Quantum speed-up is simply an added bonus. To thrive, D‑Wave only needs to provide a practical performance advantage per kWh. The $10M up-front cost, on the other hand, is a non-issue. The machines are currently assembled like cars before the advent of the Ford Model T. Most of the effort goes into the cooling apparatus and the interface with the chip, and there will clearly be plenty of opportunity to bring down manufacturing cost once production is revved up.

The chip itself can be mass-produced using adapted and refined lithographic processes borrowed from the semiconductor industry; hence the cost basis for a wafer of D‑Wave chips will not be that different from that of the chip in your laptop.

Just recently, D‑Wave’s CEO mentioned an IPO for the first time in a public talk (h/t Rolf D). Chances are, the early D-Wave investors will be laughing at the naysayers all the way to the bank long before a gate-based quantum computer factors 42.


A book I regularly have to read to our three-year-old Luna. So far she has refrained from also requesting a gate-based quantum computer.

Posted in D-Wave, Quantum Computing | 3 Comments

Fusion and Other News – Memory Hole Rescue

Another post on D-Wave is in my blog queue, but with all this attention on quantum computing, my other favorite BC-based high-tech start-up doesn't get enough of my time. I haven't written anything on energy and fusion for quite a while, despite some dramatic recent news (h/t Theo) regarding another dark horse fusion contender.

Fortunately, there is another excellent blog out there that is solely focused on fusion technology and the various concepts in the field. The Polywell is covered in depth, but General Fusion also gets its due for its innovative technology.

Another focus of mine, the trouble with contemporary theoretical physics, also keeps falling through the cracks. From my past posts one may get the impression that I am yet another String apostate, but I don't really have any trouble with String Theory as such; my problem is with uncritical confirmation bias. Unfortunately, the latter cuts across all fields, as is nicely demonstrated in this recent post of hers.

Posted in Popular Science | 3 Comments

Just in Case You Want to Simulate an Ising Spin Glass Model

This is just a short update to include a link to the solver code developed by Matthias Troyer et al. that I discussed in my last blog post.

The advantage of conventional open source code is that everybody can play with it. To the extent that the code faithfully emulates quantum annealing of an Ising spin model, this will be a net benefit to D-Wave as it allows programmers to perform early testing of algorithmic concepts for their machine (similar to the emulator that is part of D-Wave's evolving coding environment).

Matthias Troyer attached the source code to this earlier paper, and for easier download I put it into this zip file. As I am hard-pressed for time these days, I haven't gotten to work with it yet, but I confirmed that it compiles on my OS X machine ("make an_ss_ge_fi" produced a proper executable).
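If you want a feel for the kind of problem such a solver tackles before diving into the real code, below is a minimal sketch of my own: plain classical simulated annealing on a random ±1-coupled Ising spin glass. To be clear, this is not Troyer's an_ss_ge_fi program (which is far more elaborate and heavily optimized); all names and parameters in the snippet are made up purely for illustration.

    # Toy sketch (not the an_ss_ge_fi solver): classical simulated annealing
    # on a random +/-1-coupled Ising spin glass.
    import math
    import random

    def random_ising(n, seed=0):
        """Random +/-1 couplings on a complete graph; nbrs[i] lists (j, J_ij)."""
        rng = random.Random(seed)
        nbrs = [[] for _ in range(n)]
        for i in range(n):
            for j in range(i + 1, n):
                J = rng.choice([-1, 1])
                nbrs[i].append((j, J))
                nbrs[j].append((i, J))
        return nbrs

    def energy(spins, nbrs):
        # Each coupling is stored twice, hence the factor 1/2.
        return 0.5 * sum(J * spins[i] * spins[j]
                         for i, lst in enumerate(nbrs) for j, J in lst)

    def anneal(nbrs, sweeps=2000, beta_max=3.0, seed=1):
        rng = random.Random(seed)
        n = len(nbrs)
        spins = [rng.choice([-1, 1]) for _ in range(n)]
        best_spins, best_e = list(spins), energy(spins, nbrs)
        for sweep in range(sweeps):
            beta = beta_max * (sweep + 1) / sweeps  # linear inverse-temperature ramp
            for i in range(n):
                local = sum(J * spins[j] for j, J in nbrs[i])
                dE = -2.0 * spins[i] * local        # energy change if spin i flips
                if dE <= 0 or rng.random() < math.exp(-beta * dE):
                    spins[i] = -spins[i]
            e = energy(spins, nbrs)
            if e < best_e:
                best_spins, best_e = list(spins), e
        return best_e, best_spins

    if __name__ == "__main__":
        nbrs = random_ising(32)
        print("lowest energy found:", anneal(nbrs)[0])

The toy only illustrates the rugged energy landscape these solvers explore; the published code is, of course, vastly faster and meant for serious benchmarking.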

Meanwhile, D-Wave continues to receive good press (h/t web 238) as the evidence for substantial quantum effects (entanglement and tunneling) keeps mounting, indicating that they can successfully keep decoherence at bay. (This older post on their corporate blog gives an excellent introduction to decoherence, illustrating how thermal noise gets in the way of quantum annealing.)

On the other hand, despite enormous theoretical efforts, it has not yet been decisively shown what quantum resources are required for the elusive quantum speed-up, but this recent paper in Nature (arxiv link) claims to have made substantial headway in isolating some of the necessary ingredients (h/t Theo).

While this is all quite fascinating in the academic sense, at this time I nevertheless regard quantum speed-up as overrated from a practical point of view. Once I manage to set some time aside I will try to explain in my next blog post why I don't think that this has any bearing on D-Wave's value proposition. There's no place in the multiverse for the snarky, dark movie script that Robert Tucci offered up the other day :-)

An example of a two-qubit contextuality graph.

Posted in D-Wave, Quantum Computing | 14 Comments

A Bet Lost, Despite Entanglement

And Why There May Still Be Some Free Cheese in My Future.

Occasionally I like to bet. And in Matthias Troyer I found somebody who took me up on it. I wrote about this bet a while ago, but back then I agreed that I wouldn't identify him as my betting pal until his paper was published. Now the paper has been out for a while, and it is high time to settle the first part of this bet.

The conditions were straightforward: can the D-Wave machine beat a single classical CPU? But of course we specified things a bit more precisely.

The benchmark is the time required to find the ground state with 99% probability, and not only the median over the problem instances is considered, but also the 90%, 95% and 99% quantiles. We agreed to base the bet on the 90% quantile, i.e. the test needs to run long enough to ensure that for 90% or more of the instances the ground state is found with 99% probability.
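To make the metric concrete, here is a small sketch of my own (not the evaluation script used in the paper) showing how such a number could be computed from per-instance success probabilities; the probabilities in the example are invented purely for illustration.

    # Sketch of the agreed-upon benchmark metric: time to see the ground
    # state at least once with 99% probability, then quantiles over instances.
    import math

    def time_to_99(run_time, p_success):
        """Total time needed to hit the ground state with 99% probability,
        given the per-run success probability p_success (0 < p_success < 1)."""
        if p_success >= 0.99:
            return run_time                  # a single run already suffices
        repeats = math.log(1.0 - 0.99) / math.log(1.0 - p_success)
        return run_time * repeats

    def quantile(values, q):
        """Simple nearest-rank empirical quantile."""
        s = sorted(values)
        k = min(len(s) - 1, max(0, math.ceil(q * len(s)) - 1))
        return s[k]

    # Invented per-instance success probabilities for a single annealing run:
    p_per_instance = [0.9, 0.5, 0.2, 0.05, 0.6, 0.35, 0.8, 0.1]
    tts = [time_to_99(run_time=1.0, p_success=p) for p in p_per_instance]
    print("median TTS:", quantile(tts, 0.50))
    print("90% quantile TTS (the one the bet is based on):", quantile(tts, 0.90))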

Assuming that Matthias gets to conduct his testing on both the current and the next chip generation of D-Wave, we agreed to make this a two-part bet, i.e. the same betting conditions apply to each.

Unfortunately, I have to concede the first round. The D-Wave One more or less tied the classical machine, although there were some problem instances where it did better. So the following jars of maple syrup will soon be shipped to Zürich:


Not the most exquisite or expensive maple syrup, but 100% pure, and Canadian. The same kind I use at home, and I figure the plastic jars will tolerate shipping much better than glass.

What I was obviously hoping for was a decisively clear performance advantage, but at this point that isn't the case, unless you compare the machine to off-the-shelf optimizer software, as was done in the test conducted by McGeoch et al.

This, despite the evidence for quantum entanglement in D-Wave's machines getting ever more compelling. A paper demonstrating eight-qubit entanglement has just been published in Physical Review X. Geordie blogged about it here, and it already generated some great press (h/t Pierre O.), probably the most objective mainstream article on D-Wave I've seen yet. It is a welcome change from the drivel the BBC put out on QC in the past.

So will I ever get some Raclette cheese in return for my maple syrup? The chances of winning the next part of my bet with Matthias hinge on the scaling behavior, as well as on the question of whether a class of hard problems can be identified on which quantum annealing manages to find the ground state significantly faster. For the generic randomly generated problem set, scaling alone does not seem to cut it (although more data will be needed to be sure). So I am counting on D-Wave's ingenuity, as well as on the bright minds who now get to work hands-on with the machine.

Nevertheless, Matthias is confident he'll win the bet even at 2000 qubits. He thinks D-Wave will have to improve much more than just the calibration to outperform a single classical CPU. On the other hand, when I had the pleasure of meeting him last year in Waterloo, he readily acknowledged that what the company has accomplished so far is impressive. After all, this is an architecture that was created within just ten years on a shoestring budget, compared to the multi-billion-dollar, decades-mature semiconductor industry.

Unfortunately, when his university, the venerable ETH Zürich (possibly the best engineering school on this planet), came out with this press release, it nevertheless (accidentally?) played into the old canard that D-Wave falsely claimed to have produced a universal quantum computer.

This puts into context the Chinese-whispers process depicted in the cartoon I put up in an earlier post. Unlike in the cartoon, where the press gets most of the blame, ever since I started paying attention to university press releases I have been compelled to notice that they are more often than not the true starting point of the distortions.

"The Science Newscycle" by Jorge Cham www.phdcomics.com


Posted in D-Wave, Quantum Computing | 13 Comments

D-Wave Withdrawal Relief

For anybody needing an immediate dose of D-Wave news, Wired has this long, well-researched article (Robert R. Tucci summarized it in visual form on his blog). It strikes a pretty objective tone, yet I find the uncritical acceptance of Scott Aaronson's definition of quantum productivity a bit odd. As a theorist, Scott is only interested in quantum speed-up. That kind of tunnel vision is not inappropriate for his line of work, just an occupational hazard that goes with the job, but it doesn't make for a complete picture.

Other than that, the article only has some typical minor problems with QM.

At this point, you don't really expect a journalist to get across how gate model quantum algorithms work, and the article actually does this better than most. But the following bit is rather revealing. The writer, Clive Thompson, describes visually inspecting the D-Wave chip:

Peering in closely, I can just make out the chips, each about 3 millimeters square. The niobium wire for each qubit is only 2 microns wide, but it’s 700 microns long. If you squint very closely you can spot one: a piece of the quantum world, visible to the naked eye.

SQUIDs for magnetometers don't have to be very small (photo approximately to scale, as indicated by the handwriting on this lab sample). This is because for this application you want to measure the magnetic flux enclosed by the loop.

An innocuous enough quote, and most physicists wouldn't find anything wrong with it either, but therein lies the rub: SQUIDs can be fairly large (see photo to the right).

Any superconducting coil can harbour a coherent quantum state, and such coils can be huge.
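The textbook fact behind this, stated here just as a reminder, is that the magnetic flux threading a superconducting loop is quantized in units of the flux quantum no matter how large the loop is:

    \Phi = n\,\Phi_0, \qquad \Phi_0 = \frac{h}{2e} \approx 2.07 \times 10^{-15}\,\mathrm{Wb}

which is also why a large pickup loop is desirable for a magnetometer: a bigger area intercepts more flux from the same field.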

The idea that quantum mechanics somehow only governs the microcosm has been with us since its inception, because that's what was experimentally accessible at the time, i.e. atomic spectra. But it is a completely outdated notion.

This is something I only fully came to grasp after reading Carver Mead's brilliant little book Collective Electrodynamics. In it, he makes a very compelling case that we are due for another paradigm change. To me, the latter means dusting off some of Schrödinger's original wave mechanics ideas. If we were to describe a simple quantum algorithm using that picture, there would be a much better chance of giving non-physicists an idea of how these computation schemes work.


Posted in D-Wave, Popular Science, Quantum Computing, Quantum Mechanics | 12 Comments

Breaking Science News on the Blogosphere?

Update Below

Wrong!

It has been my long-held belief that science media needs an additional corrective in the form of blogs, similar to the development we've seen in the political sphere. Now it seems the news that the BICEP results, which were heralded as Nobel prize worthy, may be wrong originated with this blog post.

Certainly big enough news to interrupt my blog hiatus.

Maybe some results really are too good to be true, and this may turn out to be this year's version of the faster-than-light neutrinos.

Update

As was to be expected, there is now some push-back against these claims, and the authors stand by their paper.

It also illustrates that science is a bit like sausage: sometimes you don't really want to know exactly what went into it. At least that's how I felt when I learned that the source of this controversy is the way data was scraped from a PDF copy. One could hardly make a better case for why we need a good Open Science infrastructure.

Regardless, my favorite physics blogger took on the results and put them into context.


Posted in Blogroll Rescue, Popular Science | 3 Comments

Time for Another Blogroll Memory Hole Rescue


The Canada Revenue Agency is the equivalent of the IRS down south. They owe me money and always make me work to get it.

Unlike in the US, tax returns in Canada are due by the end of April, but because of the Heartbleed bug, Revenue Canada had to take down electronic filing for a while, so the deadline has been extended a bit. It seems I may need the extra days, as life is keeping me extraordinarily busy. Saturday morning is usually my blogging time, but this weekend I had to look after my kids (my wife Sara was performing Beethoven's 9th with the Peterborough Symphony), and today my oldest daughter turned seven, filling the day with zoo visits and birthday cake.

At least the bug bought me some more time.

So in order not to completely abandon this blog, a couple of links to other outstanding science musings are in order. To that end I would like to highlight some posts by Sabine Hossenfelder, a blogging professor of theoretical physics currently teaching in Sweden. Her most recent post discusses some of the structural problems in academia, which in reality is nothing like the commonly held notion of a utopian ivory tower (the tower stands and becomes ever more compartmentalized, but there is nothing utopian about it).

Her post on the Problem of Now makes a nice primer for a long-planned future post of mine on Julian Barbour's The End of Time, because arguably he took "Einstein's Blunder" and ran with it as far as one can take it. The man's biography also ties back to the dilemma of academia, as it really doesn't allow much space for such deep, out-of-the-mainstream research programs.

Last but not least, I really enjoyed this rant.

And I probably should mention that Sabine also knows how to sing. It obviously takes a physicist to really convey the emotional impact of the agonizing, ongoing demise of SUSY.


Posted in Blogroll Rescue, Popular Science | 5 Comments

The Church of D-Wave

Are You a BDeliever?

Science and religion have a difficult relationship, and sometimes they combine in the most obscure manner, such as when Scientology was conceived. The latter seems to have lost a lot of its appeal and followers, but it appears that another new religion is poised to grab the mantle. That is, if one is willing to follow Scott Aaronson's rationale that believing in the achievability of significant speed-up with D-Wave's architecture is a matter of faith.

Ironically, Scott, who teaches computer science at MIT, made this comment about the same time that the MIT Technology Review named D-Wave to its Top 50 Smartest Companies list. An illustrious selection that any company would be delighted to be included in. The only quibble I have with this list is that it ranks Elon Musk's SpaceX ahead of D-Wave, my point being that quantum mechanics is harder than rocket science. After all, with the latter, everybody can decide whether your spacecraft made it into orbit or not (classical mechanics is so straightforward). On the other hand, we still have the ongoing high-profile battle over the question of how quantum D-Wave's machine actually is (ever since Schrödinger, uncertainty about what's in a box seems to be a constant in Quantum Mechanics).

Another paper buttresses the company's claims that there is substantial entanglement present on their chip. This prompted Prof. Vazirani, whom I experienced as a most delightful, soft-spoken academic when checking out his Quantum Computing MOOC, to come out swinging. The New York Times quotes him as saying:

“What I think is going on here is that they didn’t model the ‘noise’ correctly. (....) One should have a little more respect with the truth.”

In academic parlance these are fighting words.  And so the show goes on.

But I want to take a break from this for a moment and focus on another question: How did a startup like D-Wave get to this point? Time magazine front page material, coverage in the New York Times, being named in the same breath as SpaceX. From a business perspective this is nothing short of an amazing success story. And to me, the question of what makes for successful entrepreneurship is of no less interest than the science and technology.


Geordie got closer to having a shot at Olympic gold than most of us, having been an alternate on the Canadian wrestling team at the 1996 Olympic Games, so getting this one may have been bittersweet.

Flying into Vancouver, I imagined Geordie Rose to be a Steve Jobs-like character, about whom it was famously quipped that he was surrounded by his own reality distortion field, an invisible force that made others see the world as he did and buy into his vision. And although I never had the pleasure of meeting Steve Jobs, I think it is safe to say that Geordie is nothing like him. If I had to describe him in one word, I'd say he is quintessentially "Canadian", in terms of the positive attributes that we typically like to associate with our national character. (Full disclosure: Technically I am not Canadian yet, just a permanent resident.)

Given the amazing success that D-Wave has had, and the awards and accolades that he himself has received, I was impressed with his unassuming demeanor. It is hard to imagine Geordie ever parking his car in a handicap spot, as Jobs was fond of doing, to shave a couple of minutes off his commute.

D-Wave just moved to new, enlarged premises. In the old building Geordie occupied an interior office without windows. I naturally assumed that he would have upgraded that, so I was surprised to learn that his new workspace still doesn't have any windows. His explanation was simple: it allows him to be close to his team.

My takeaway is that visionaries cannot be pigeonholed, because when talking to Geordie it was quickly obvious that his focus and dedication to making his vision a reality are ironclad, and his excitement is infectious. So there is one key similarity to Steve Jobs after all, and then there is of course this, which goes without saying:

Great entrepreneurs never do it for the money.


Prof. Vazirani must have picked up on D-Wave's commitment to making quantum computing work, as the New York Times also quotes him as saying about D-Wave that “after talking with them I feel a lot better about them. They are working hard to prove quantum computing.”

That Geordie picked an approach so abhorred by theorists I attribute to yet another aspect that, in my mind, marks great entrepreneurship: an almost ruthless pragmatism. By focusing on the less proven approach of quantum annealing on a chip, he managed in just seven years to turn out an entirely new computing platform. Meanwhile, the advances in superconducting foundry know-how that his company ushered in will also benefit other approaches, such as the gate-based implementation that UCSB's John Martinis plans to scale up to 1000 qubits within five years.

To me, there is no doubt that the hurry to get something to market is a net benefit to the entire quantum computing field, as I expect it will attract more private capital. That is because quantum computing is no longer perceived as something nebulous, something that may only happen 25 years down the road.

Game changers polarize. So if we pay heed to Scott Aaronson's rhetoric, Geordie clearly has a leg up on Steve Jobs. Where the latter had a cult following, Geordie is on his way to having his own religion. Maybe that explains the following recent exchange on D-Wave's blog:

(Screenshot of the exchange on D-Wave's blog.)


(h/t Rolf D. and commenter Copenhagen for pointing me to material for this post.)

Posted in D-Wave | 22 Comments

Inflation not Over-Inflated after all?

Updated Below

Cosmology is quintessential popular science, but I have always regarded it as the most dismal field of physics, because there is no avenue for experiments to keep runaway speculation at bay. It's like trying to catch a perpetrator by staring at a multi-billion-year-old crime scene with the evidence all scattered. And of course, since it deals with the beginning of time, scientists may have a hard time divorcing themselves from philosophical or religious beliefs (e.g. for a long time, Einstein presumably regarded the big bang theory as an invention of the clergy).

Given that cosmic inflation was introduced in a rather ad hoc fashion to fix problems with the big bang explanation, I always felt rather lukewarm about it. It just seemed too much like a convenient quick fix. But I am certainly warming up to it, given that the new detailed observations of the cosmic microwave radiation fit the picture quite nicely. This radiation is essentially a convenient thermometer for the entire universe, as it can be regarded as thermal blackbody radiation. It is as close as we can get to the aforementioned primordial crime scene, taking advantage of the fact that the further we look into space, the earlier the events we observe. If the mainstream big bang theory is correct, then the evidence for it must be splattered all over the sky, encoded in this background radiation.

The author of this article (one of the finest pop science writers on the web) nicely explains why this data is such a treasure trove. I have little to add to his article other than the caveat that one should keep an open mind: the evidence may yet turn out to fit a completely different sequence of events (e.g. this one made some recent headlines, and it will be interesting to observe how such alternative models may be adapted to fit the newly released data).

And then there is, of course, the other raison d'être of this blog: pointing out when popular science writing gets the details wrong. The better outlets, such as the NYT, got it right when they wrote that this data offers the first direct evidence for gravitational waves as predicted by general relativity. And a layman can certainly relate to this simply by looking at the released pictures, which almost resemble ripples left in the sand by ocean waves.

Slight temperature fluctuations, indicated by variations in color, of the cosmic microwave background of a small patch of sky (as provided by the BICEP2 Collaboration).

But there are a lot of press releases and news blurbs that leave out the crucial word "direct" when mentioning gravitational waves, ignoring the excellent indirect evidence that earned a Nobel prize in 1993. The latter is based on one of the neatest astronomical observations I can think of, which used the precise signal of a pulsar in a binary system to measure the shrinking orbit of the two stars. The observed orbital decay precisely matches the theoretical prediction of how much energy the system should radiate away via gravitational waves.

Just as accelerated electrical charges will, under most circumstances, emit EM radiation, accelerated masses will send out gravitational waves, carrying away some of the kinetic energy of the system.
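For the idealized case of two point masses in a circular orbit, general relativity gives a compact expression for the radiated power (the actual Hulse-Taylor analysis uses the generalization to eccentric orbits):

    P = \frac{32}{5}\,\frac{G^4}{c^5}\,\frac{(m_1 m_2)^2\,(m_1 + m_2)}{a^5}

where m_1 and m_2 are the two masses, a is their orbital separation, G is the gravitational constant and c the speed of light.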

Of course, gravitational waves have the huge advantage of being the kind of physics that is accessible to direct measurement, and this new cosmological evidence lends credence to the persistent push for better gravitational wave detectors that will eventually measure these waves directly.

Update

It didn't take long for some prominent push-back to emerge, pointing to discrepancies between the BICEP2 data and previous data from the Planck and WMAP telescopes.

(h/t Sol Warda for prompting me to write this post) 


Posted in Popular Science | 3 Comments

The Science Newscycle

As life keeps me otherwise busy, I am again late in finishing my next blog post, but in the meantime this web comic nicely summarizes much of the news dynamics of science in general and quantum computing in particular (h/t my lovely wife Sara).

"The Science Newscycle" by Jorge Cham www.phdcomics.com


Posted in Popular Science, Quantum Computing | 2 Comments