The Google-Martinis Chip Will Perform Quantum Annealing

Ever since the news that John M. Martinis will join Google to develop a chip based on the work performed at UCSB, speculation has abounded as to what kind of quantum architecture this chip will implement. According to this report, it is now clear that it will be adiabatic quantum computing:

But examining the D-Wave results led to the Google partnership. D-Wave uses a process called quantum annealing. Annealing translates the problem into a set of peaks and valleys, and uses a property called quantum tunneling to drill through the hills to find the lowest valley. The approach limits the device to solving certain kinds of optimization problems rather than being a generalized computer, but it could also speed up progress toward a commercial machine. Martinis was intrigued by what might be possible if the group combined some of the annealing in the D-Wave machine with his own group's advances in error correction and coherence time.
"There are some indications they're not going to get a quantum speed up, and there are some indications they are. It's still kind of an open question, but it's definitely an interesting question," Martinis said. "Looking at that, we decided it would be really interesting to start another hardware approach, looking at the quantum annealer but basing it on our fabrication technology, where we have qubits with very long memory times."

This leads to the next question: Will this Google chip indeed be similarly restricted to implementing the Ising model, like D-Wave's, or will it strive for more universal adiabatic quantum computation? The latter has been theoretically shown to be computationally equivalent to gate-based QC. It seems odd to aim for just a marginal improvement of the existing architecture, as this article implies.
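For readers who want the distinction spelled out: the D-Wave chip anneals into the ground state of a classical Ising Hamiltonian with programmable biases and couplings (standard notation, added here for illustration):

$$H_{\mathrm{Ising}} \;=\; \sum_i h_i\, \sigma_z^{(i)} \;+\; \sum_{i<j} J_{ij}\, \sigma_z^{(i)} \sigma_z^{(j)}$$

Universal adiabatic QC would require more general (e.g. non-stoquastic) terms in the Hamiltonian, which is exactly what makes the question above more than a technicality.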

At any rate, D-Wave may retain the lead in qubit numbers for the foreseeable future if it sticks with no, or less costly, error correction schemes (leaving it to the coders to create their own). It will be interesting to eventually compare which approach offers more practical benefits.

Posted in D-Wave, Quantum Computing, Uncategorized | 11 Comments

About that Google Quantum Chip

In light of the recent news that John Martinis is joining Google, it is worthwhile to check out this Google talk from last year:

It is an hour-long talk, but very informative. John Martinis does an excellent job of explaining, in very simple terms, how hardware-based surface code error correction works.
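To give a sense of the numbers involved: the payoff of the surface code is that the logical error rate drops exponentially with the code distance $d$ once the physical error rate $p$ is below the threshold $p_{th}$ of roughly 1%. Schematically (a textbook scaling relation, not a formula taken from the talk):

$$p_L \;\propto\; \left(\frac{p}{p_{th}}\right)^{\lfloor (d+1)/2 \rfloor}$$

This is why the long coherence times of the Martinis qubits matter so much: every factor gained in $p$ is amplified exponentially at the logical level.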

Throughout the talk he uses the gate model formalism. Hence it is quite natural to assume that this is what the Google chip will aim for. This is reinforced by the fact that other publications, such as the IEEE's, have drawn a stark contrast between the Martinis approach and D-Wave's quantum annealing architecture. It is certainly how I interpreted the news as well.

But on second thought, and careful parsing of the press releases, the case is not as clear cut. For instance, Technology Review quotes Martinis in this fashion:

“We would like to rethink the design and make the qubits in a different way,” says Martinis of his effort to improve on D-Wave’s hardware. “We think there’s an opportunity in the way we build our qubits to improve the machine.”

This sounds more like Martinis wants to build a quantum annealing chip based on his logical, error-corrected qubits. From an engineering standpoint this would make sense, as it should be easier to achieve than a fully universal gate-based architecture, and it would address the key complaint I heard from developers programming the D-Wave chip, i.e. that they would really like to see error correction implemented on the chip.

On the other hand, in light of Martinis' presentation, I presume that he will regard such an architecture simply as another stepping stone towards universal quantum computation.

Posted in D-Wave, Quantum Computing | 6 Comments

News Roundup


As school starts, I should find my way back to a regular blogging schedule. I usually drive my kids to German Saturday school and then pass the time at a nearby Starbucks updating this blog.

Job and family demanded too much of my time this summer. The former has gotten very interesting, as I am documenting a bank stress-testing system, but the learning curve is steep. And while I just had a pleasant one-week vacation at a pristine northern lake, it very much lacked Wifi connectivity and was not conducive to blogging. Yet I had plenty of time to read up on material that will make for future posts.

Back home, my kids incidentally watched the Nova episode that features D-Wave and Geordie Rose, which prompted my mother-in-law to exclaim that she wants stock in this company. Her chance to act on this may come in the not too distant future. Recently, D-Wave's CEO hinted for the first time that there may be an IPO in the offing (h/t Rolf D).

Readers who follow the QC blogs have undoubtedly already learned about an interesting paper that supports D-Wave's approach, since Geordie highlighted it on the company's blog. The fact that Robert R. Tucci is looking for an experienced business partner to start a QC algorithm venture with may also already qualify as old news - Bob is mostly focused on the gate model, but is agnostic about the adiabatic approach, and certainly displays impressive grit and a track record of consistently turning out patents and papers.

When it comes to love and business, timing is everything. The US allows for software patent protection of up to 20 years. This is a sufficiently long time frame to bet on gate QC becoming a reality. But there is still a bit of a chicken-and-egg problem associated with this technology. After all, it is much more difficult (Geordie Rose would argue unrealistically so) than what D-Wave is doing. Shor's algorithm alone cannot justify the R&D expense necessary to develop and scale up the required hardware, but other commercially more interesting algorithms very well may. Yet you only invest in developing those if there is a chance that you'll eventually (within 20 years) have hardware to run them on. Currently, it still falls to academia to bridge the gap, e.g. with these Troyer et al. papers, which give hope that quantum chemistry could see tangible speed-ups from even modestly sized gate-based quantum computers.

While quantum computing will remain a main theme of this blog, I intend to also get back to some more biographical posts that reflect on how the history of physics has evolved. Like any human history, it is full of the oddest twists and turns, which are more often than not edited out of the mainstream narrative. And just to be clear, this is not to suggest some grand conspiracy, but just another expression of the over-simplification that afflicts most popular science writing. Writing for the lowest common denominator often makes for rather poor results, but as Sabine observes:

The “interested public” is perfectly able to deal with some technical vocabulary as long as it comes with an explanation.

In the same vein, the intricacy of how scientific discovery progresses deserves some limelight, as it illuminates the roads less traveled. It also makes for interesting thought experiments: imagining how physics might have developed if certain experiments or mathematics had been discovered earlier, or if a scientist's life hadn't been cut short.

My next post will deal in some such idle speculation.

Update: This just in, Google is setting out on its own (h/t bettinman), planning to put $8B into a proprietary QC hardware effort, which makes me wonder if the investment is meant to match IBM's $3B push to reach the post-silicon era. It is not yet clear what this will mean for Google's relationship with D-Wave.

Posted in D-Wave, Popular Science, Quantum Computing | 15 Comments

The Business Case for D-Wave

A tried and tested success formula for lazy journalism is the build-up and tear-down pattern.

The hype comes before the fall. In the context of information technology, Gartner trademarked the aptly named term "hype cycle". Every technology starts out in obscurity, but some take off stellarly, promising the moon but, with the notable exception of the Apollo program, falling a bit short of it. Subsequently, disillusionment sets in, sometimes going as far as vilification of the technology or product. Eventually sentiment hits rock bottom, and a more balanced and realistic view is adopted as the technology is mainstreamed.

Even Web technology followed this pattern to some extent, and this was clearly mirrored by the dot-com stock bubble. At the height of the exuberance, the web was credited with ushering in a New Economy that would unleash unprecedented productivity growth. By now it has, of course, vastly improved our lives and economies; it just didn't happen quite as rapidly and profitably as the market originally anticipated.

D-Wave's technology will inevitably be subjected to the same roller coaster ride. When you make it to the cover of Time magazine, the spring for the tear-down reaction has been set, waiting for a trigger. The latter came in the form of the testing performed by Matthias Troyer et al. While all of this is to be expected in the field of journalism, I was a bit surprised to see that one of the sites I link to in my blogroll followed the pattern as well. When D-Wave was widely shunned by academia, Robert R. Tucci wrote quite positively about the company, but now he seems to have given up all hope in its architecture in reaction to this single paper. He makes the typical case against investing in D-Wave that I've seen argued many times by academics invested in the gate model of quantum computing.

The field came to prominence due to the theoretically well-established potential of gate-based quantum computers to outperform classical machines. And it was Shor's algorithm that captured the public's imagination (and the NSA's attention). Factoring is widely considered to be an NP-intermediate problem, and Shor's algorithm could clearly crack our encryption schemes if we had gate-based QC with thousands of qubits. Unfortunately, this is still sci-fi, and so the best that has been accomplished on this architecture so far is the factorization of 21. The quantum speed-up would be there if we had the hardware, but alas, at this point it is the embodiment of something purely academic with no practical relevance whatsoever.
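To make the factoring of 21 concrete, the classical bookkeeping around the quantum period-finding step goes like this (a standard textbook run of Shor's recipe, not tied to any particular experiment): pick $a = 2$ and let the quantum part find the order $r$ of $a$ modulo 21:

$$2^6 = 64 \equiv 1 \pmod{21} \;\Rightarrow\; r = 6, \qquad \gcd(2^{r/2}-1,\,21) = 7, \quad \gcd(2^{r/2}+1,\,21) = 3,$$

recovering $21 = 3 \times 7$. Everything except finding $r$ is cheap classical arithmetic; the hard part is building a machine that finds $r$ for thousand-digit numbers.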

There is little doubt in my mind that a useful gate based quantum computer will be eventually built, just like, for instance, a space elevator. In both cases it is not a matter of ‘if’ but just a matter of ‘when’.

I’d wager we won’t see either within the next ten years.

Incidentally, it has been reported that a space elevator was considered as a Google Labs project, but subsequently thrown out because it requires too many fundamental technological breakthroughs to become a reality. On the other hand, Google snapped up a D-Wave machine.

So is this just a case of acquiring trophy hardware, as some critics on Scott's blog contended, i.e. nothing more than a marketing gimmick? Or have they been snookered? Maybe they, too, have a gambling addiction problem, like the D-Wave investors imagined on the qbnets blog?

Of course none of this is the case. It just makes business sense. And this is readily apparent as soon as you let go of the obsession over quantum speed-up.

Let's just imagine for a second that there was a company with a computing technology that was neither transistor nor semiconductor based. Let's further assume that within ten years it managed to mature this technology to the point of catching up to current CPUs in terms of raw performance, and that this was all done with chip structures that are orders of magnitude larger than what current conventional hardware needs to deploy. Also, this new technology does not suffer from loss currents introduced via accidental quantum tunneling, but is actually designed around this effect and utilizes it. Imagine that all this was done with a fraction of the R&D sums spent on conventional chip architectures, and that, since the power consumption scaling is radically different from current computers, putting another chip into the box will hardly double the energy consumed by the system.

A technology like this sounds almost like the kind that IBM just announced it will focus its research on, trying to find a way to the post-silicon future.

So our 'hypothetical company' sounds pretty impressive, doesn't it? You'd think that a company like Google, with its enormous computational needs, would be very interested in test-driving an early prototype of such a technology. And since all of the above applies to D-Wave, this is indeed exactly what Google is doing.

Quantum speed-up is simply an added bonus. To thrive, D-Wave only needs to provide a practical performance advantage per kWh. The $10M up-front cost, on the other hand, is a non-issue. The machines are currently assembled like cars before the advent of the Ford Model T. Most of the effort goes into the cooling apparatus and the interface with the chip, and there will clearly be plenty of opportunity to bring down manufacturing cost once production is revved up.

The chip itself can be mass-produced using adapted and refined lithographic processes borrowed from the semiconductor industry; hence the cost basis for a wafer of D-Wave chips will not be that different from that of the chip in your laptop.

Just recently, D-Wave's CEO mentioned an IPO for the first time in a public talk (h/t Rolf D). Chances are, the early D-Wave investors will be laughing at the naysayers all the way to the bank long before a gate-based quantum computer factors 42.

[image: moon]

A book I regularly have to read to our three-year-old Luna. So far she has refrained from also requesting a gate-based quantum computer.

Posted in D-Wave, Quantum Computing | 7 Comments

Fusion and Other News – Memory Hole Rescue

Another post on D-Wave is in my blog queue, but with all this attention on quantum computing, my other favorite BC-based high-tech start-up doesn't get enough of my time. I haven't written anything on energy and fusion for quite a while, despite some dramatic recent news (h/t Theo) with regard to another dark horse fusion contender.

Fortunately, there is another excellent blog out there that is solely focused on fusion technology and the various concepts in the field. The Polywell is covered in depth, but General Fusion also gets its due for its innovative technology.

Another focus of mine, the trouble with contemporary theoretical physics, also keeps falling through the cracks. From my past posts one may get the impression that I am just yet another String apostate, but I don't really have any trouble with String Theory as such, rather with uncritical confirmation bias. Unfortunately, the latter cuts across all fields, as Sabine nicely demonstrates in this recent post of hers.

Posted in Popular Science | 3 Comments

Just in Case You Want to Simulate an Ising Spin Glass Model

This is just a short update to include a link to the solver code developed by Matthias Troyer et al. that I discussed in my last blog post.

The advantage of conventional open source code is that everybody can play with it. To the extent that the code faithfully emulates quantum annealing of an Ising spin model, this will be a net benefit to D-Wave, as it allows programmers to perform early testing of algorithmic concepts for the machine (similar to the emulator that is part of D-Wave's evolving coding environment).

Matthias Troyer attached the source code to this earlier paper, and for easier download I put it into this zip file. As I am hard-pressed for time these days, I haven't had a chance to work with it yet, but I confirmed that it compiles on my OS X machine ("make an_ss_ge_fi" produced a proper executable).
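For anyone who'd rather start tinkering in a scripting language, here is a minimal toy sketch of classical simulated annealing on a random Ising spin glass. To be clear, this is my own illustrative code, not Troyer's solver (which implements simulated quantum annealing and is far more sophisticated):

```python
import numpy as np

rng = np.random.default_rng(42)

# Random Ising spin glass: couplings J_ij in {-1, +1} on a complete graph,
# spins s_i in {-1, +1}, energy E(s) = -1/2 s^T J s (J symmetric, zero diagonal).
n = 32
J = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), k=1)
J = J + J.T  # symmetrize

def energy(s):
    return -0.5 * s @ J @ s

def anneal(sweeps=2000, beta_min=0.1, beta_max=3.0):
    s = rng.choice([-1.0, 1.0], size=n)
    e = best_e = energy(s)
    # Cool down: raise the inverse temperature beta while doing Metropolis sweeps.
    for beta in np.linspace(beta_min, beta_max, sweeps):
        for i in range(n):
            dE = 2.0 * s[i] * (J[i] @ s)  # energy cost of flipping spin i
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                s[i] = -s[i]
                e += dE
                best_e = min(best_e, e)
    return best_e

print("lowest energy found:", anneal())
```

Swapping the thermal Metropolis rule for quantum-annealing dynamics is exactly where the real solver earns its keep.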

Meanwhile, D-Wave continues to receive good press (h/t web 238) as the evidence for substantial quantum effects (entanglement and tunneling) keeps mounting, indicating that they can successfully keep decoherence at bay. (This older post on their corporate blog gives an excellent introduction to decoherence, illustrating how thermal noise gets in the way of quantum annealing.)

On the other hand, despite enormous theoretical efforts, it has not yet been decisively shown what quantum resources are required for the elusive quantum speed-up, but this recent paper in Nature (arxiv link) claims to have made substantial headway in isolating some of the necessary ingredients (h/t Theo).

While this is all quite fascinating in the academic sense, at this time I nevertheless regard quantum speed-up as overrated from a practical point of view. Once I manage to set some time aside I will try to explain in my next blog post why I don't think that this has any bearing on D-Wave's value proposition. There's no place in the multiverse for the snarky, dark movie script that Robert Tucci offered up the other day :-)

An example of a two-qubit contextuality graph

Posted in D-Wave, Quantum Computing | 16 Comments

A Bet Lost, Despite Entanglement

And Why There Still May be Some Free Cheese in my Future.

Occasionally I like to bet, and in Matthias Troyer I found somebody who took me up on it. I wrote about this bet a while ago, but back then I agreed not to identify him as my betting pal until his paper was published. Now the paper has been out for a while, and it is high time to settle the first part of this bet.

The conditions were straightforward: can the D-Wave machine beat a single classical CPU? But of course we specified things a bit more precisely.

The benchmark used is the time to find the ground state with 99% probability, considering not only the median but also the 90%, 95% and 99% quantiles. We agreed on basing the bet on the 90% quantile, i.e. the test needs to run long enough to ensure that for 90% or more of the problem instances a ground state is found with 99% probability.
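For reference, the time-to-solution figure of merit used in this kind of benchmarking (my paraphrase of the standard definition, with $p(t_a)$ the success probability of a single annealing run of duration $t_a$) is:

$$\mathrm{TTS}(t_a) \;=\; t_a\,\frac{\ln(1-0.99)}{\ln\!\left(1-p(t_a)\right)}$$

i.e. the expected wall-clock time for repeated runs to find a ground state with 99% confidence; the quantiles are then taken over the ensemble of random problem instances.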

Assuming that Matthias gets to conduct his testing on both the current and the next chip generation of D-Wave, we agreed to make this a two-part bet, i.e. the same betting conditions apply to each.

Unfortunately, I have to concede the first round. The D-Wave One more or less tied the classical machine, although there were some problem instances where it did better. So the following jars of maple syrup will soon be shipped to Zürich:

[image: maple_syrup_debt]

Not the most exquisite or expensive maple syrup, but 100% pure, and Canadian. The same kind I use at home, and I figure the plastic jars will tolerate shipping much better than glass.

What I was obviously hoping for was a decisively clear performance advantage, but at this point that isn't the case, unless you compare it to off-the-shelf optimizer software, as was done in the test conducted by McGeoch et al.

This, despite the evidence for quantum entanglement in D-Wave's machines getting ever more compelling. A paper demonstrating eight-qubit entanglement has just been published in Physical Review X. Geordie blogged about it here, and it already generated some great press (h/t Pierre O.), probably the most objective mainstream article on D-Wave I've seen yet. It is a welcome change from the drivel the BBC put out on QC in the past.

So will I ever get some Raclette cheese in return for my maple syrup? The chances of winning the next part of my bet with Matthias hinge on the scaling behavior, as well as on the question of whether a class of hard problems can be identified where quantum annealing manages to find the ground state significantly faster. For the generic randomly generated problem set, scaling alone does not seem to cut it (although more data will be needed to be sure). So I am counting on D-Wave's ingenuity, as well as on the bright minds who now get to work hands-on with the machine.

Nevertheless, Matthias is confident he'll win the bet even at 2000 qubits. He thinks D-Wave will have to improve much more than just the calibration to outperform a single classical CPU. On the other hand, when I had the pleasure of meeting him last year in Waterloo, he readily acknowledged that what the company has accomplished so far is impressive. After all, this is an architecture that was created within just ten years on a shoestring budget, compared to the multibillion-dollar, decades-mature semiconductor industry.

Unfortunately, when his university, the venerable ETH Zürich (possibly the best engineering school on this planet), came out with this press release, it nevertheless (accidentally?) played into the old canard that D-Wave falsely claimed to have produced a universal quantum computer.

It puts into context the Chinese-whispers process depicted in the cartoon below, which I put up in an earlier post. Unlike in the cartoon, where the press gets most of the blame, ever since I started paying attention to university press releases I have been compelled to notice that they are more often than not the true starting point of the distortions.

"The Science Newscycle" by Jorge Cham www.phdcomics.com

 

 

 

 

 

Posted in D-Wave, Quantum Computing | 14 Comments

D-Wave Withdrawal Relief

For anybody needing an immediate dose of D-Wave news, Wired has this long, well-researched article (Robert R. Tucci summarized it in visual form on his blog). It strikes a pretty objective tone, yet I find the uncritical acceptance of Scott Aaronson's definition of quantum productivity a bit odd. As a theorist, Scott is only interested in quantum speed-up. That kind of tunnel vision is not inappropriate for his line of work, just an occupational hazard that goes with the job, but it doesn't make for a complete picture.

Other than that, the article only has some typical minor problems with QM.

At this point, you don't really expect a journalist to get across how gate model quantum algorithms work, and the article actually does this better than most. But the following bit is rather revealing. The writer, Clive Thompson, describes visually inspecting the D-Wave chip:

Peering in closely, I can just make out the chips, each about 3 millimeters square. The niobium wire for each qubit is only 2 microns wide, but it’s 700 microns long. If you squint very closely you can spot one: a piece of the quantum world, visible to the naked eye.

SQUIDs for magnetometers don't have to be very small (photo approximately to scale, as indicated by the handwriting on this lab sample). This is because for this application you want to measure the magnetic flux encompassed by the loop.

Innocuous enough quote, and most physicists wouldn't find anything wrong with it either, but therein lies the rub. SQUIDs can be fairly large (see photo to the right).

Any superconducting coil can harbour a coherent quantum state, and they can be huge.
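The underlying physics is flux quantization, and it holds regardless of the loop's size: the magnetic flux threading any superconducting loop comes in integer multiples of the flux quantum,

$$\Phi = n\,\Phi_0, \qquad \Phi_0 = \frac{h}{2e} \approx 2.07 \times 10^{-15}\,\mathrm{Wb}.$$

Nothing in this condition cares about the loop being microscopic.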

The idea that quantum mechanics somehow only governs the microcosm has been with us from its inception, because that's what was experimentally accessible at the time, i.e. atomic spectra. But it is a completely outdated notion.

This is something I only fully came to grasp after reading Carver Mead's brilliant little book Collective Electrodynamics. In it, he makes a very compelling case that we are due for another paradigm change. To me, the latter means dusting off some of Schrödinger's original wave mechanics ideas. If we were to describe a simple quantum algorithm using that picture, there would be a much better chance of giving non-physicists an idea of how these computation schemes work.

Posted in D-Wave, Popular Science, Quantum Computing, Quantum Mechanics | 12 Comments

Breaking Science News on the Blogosphere?

Update Below

Wrong!

It has been my long-held belief that science media needs an additional corrective in the form of blogs, similar to the development we've seen in the political sphere. Now it seems the news that the BICEP results, heralded as Nobel-prize-worthy, may be wrong originated with this blog post.

Certainly big enough news to interrupt my blog hiatus.

Maybe some results really are too good to be true, and this may turn out to be this year's version of the faster-than-light neutrinos.

Update

As was to be expected, there is now some push-back against these claims, and the authors stand by the paper.

It also illustrates that science is a bit like sausages: sometimes you don't really want to know exactly what went into it. At least that's how I felt when I learned that the source of this controversy is the way data was scraped from a PDF copy. One could hardly make a better case for why we need a good Open Science infrastructure.

Regardless, my favorite physics blogger took on the results and put them into context.

Posted in Blogroll Rescue, Popular Science | 3 Comments

Time for Another Blogroll Memory Hole Rescue

[image: CRA]

The Canada Revenue Agency is the equivalent of the IRS down south. They owe me money and always make me work to get it.

Unlike in the US, tax returns in Canada are due by the end of April, but because of the Heartbleed bug, Revenue Canada had to take down electronic filing for a while, so the deadline has been extended a bit. It seems I may need the extra days, as life is keeping me extraordinarily busy. Saturday morning is usually my blogging time, but this weekend I had to look after my kids (my wife Sara was performing Beethoven's 9th with the Peterborough Symphony), and today my oldest daughter turned seven, filling the day with zoo visits and birthday cake.

At least the bug bought me some more time.

So, in order not to completely abandon this blog, a couple of links to other outstanding science musings are in order. To that end, I would like to highlight some posts by Sabine Hossenfelder, a blogging professor of theoretical physics currently teaching in Sweden. Her most recent post discusses some of the structural problems in academia, which in reality is nothing like the commonly held notion of a utopian ivory tower (the tower stands, and becomes ever more compartmentalized, but there is nothing utopian about it).

Her post on the Problem of Now makes a nice primer for a long-planned future post of mine on Julian Barbour's End of Time, because arguably he took "Einstein's Blunder" and ran with it as far as one can take it. The man's biography also ties back to the dilemma of academia, which really doesn't allow much space for such deep, out-of-the-mainstream research programs.

Last but not least, I really enjoyed this rant.

And I probably should mention that Sabine also knows how to sing. It obviously takes a physicist to truly convey the emotional impact of the agonizing, ongoing demise of SUSY.

Posted in Blogroll Rescue, Popular Science | 5 Comments