Category Archives: Quantum Computing

The Year That Was <insert expletive of your choice>

Usually, I like to start a new year on an upbeat note, but this time I just cannot find the right fit. I considered revisiting technology that can clean water; lauding the effort of the Bill Gates foundation came to mind. But while I think this is a great step in the right direction, this water-reclaiming technology is still a bit too complex and expensive to become truly transformational and liberating.

At other times, groundbreaking progress in increasing the efficiency of solar energy would have qualified, the key being that this can be done comparatively cheaply. Alas, the unprecedented drop in the price of oil is not only killing off the fracking industry, but also the economics of alternative energy. For a planet that has had its fill of CO2, fossil fuel this cheap is nothing but an unmitigated disaster.

So while it was a banner year for quantum computing, in many respects 2014 was utterly dismal, seeing the return of religiously motivated genocide, open warfare in Europe, a resurgence of diseases that could have been eradicated by now, and a pandemic that caused knee-jerk hysterical reactions, teaching us how unprepared we are for these kinds of health emergencies. This year was so depressing it makes me want to wail along to my favorite science blogger’s song about it (but then again I’d completely ruin it).

And there is another reason not to let go of the past just yet: corrections.

With these corrections out of the way I will finally let go of 2014, but with the additional observation that in the world of quantum computing the new year started very much in the same vein as the old: generating positive business news for D-Wave, which just managed to raise another $29 million, while still not getting respect from some academic QC researchers.

I.H. Deutsch (please note, not the Deutsch but Ivan) states at the end of this interview:

  1. The D-Wave prototype is not a universal quantum computer.
  2. It is not digital, nor error-correcting, nor fault tolerant.
  3. It is a purely analog machine designed to solve a particular optimization problem.
  4. It is unclear if it qualifies as a quantum device.

No issues with points 1 through 3. But how many times do classical algos have to be ruled out before D-Wave is finally universally accepted as a quantum annealing machine? This is getting into climate-change-denial territory. It shouldn’t really be that hard to define what makes for quantum computation. So I guess we have found a new candidate for D-Wave chief critic, now that Scott Aaronson seems to have stepped down for good.

Then again, with a last name like Deutsch, you may have to step up your game to get some name recognition of your own in this field.  And there’s no doubt that controversy works.

So 2015 is shaping up to become yet another riveting year for QC news. And just in case you made the resolution that, this year, you will finally try to catch that rainbow, there’s some new tech for you.
SOURCE: chaosgiant.deviantart.com


Update: I almost forgot about this epic fail of popular science reporting at the tail end of 2014. For now I will leave it as an exercise for the reader to spot everything that is wrong with it. Of course, most of the blame belongs to PLoS ONE, which supposedly practices peer review.

The Race Towards Universal Quantum Computing – Lost in Confusion

If headlines and news articles were all you had to go by when trying to form an opinion about quantum computing, you’d end up with one enormous migraine. For many years now, they have created a constant barrage of conflicting story lines.


For reasons known only to them, science news authors seem to have collectively decided to ignore that there are many competing approaches to quantum computing. This apparent inability to differentiate between architectures and computational models makes for a constant source of confusion, which is then compounded by the challenge of explaining the conceptual oddities of quantum computing, such as entanglement.

For instance, most authors, even those who may already know better, run with the simplest trope about quantum computing, one that has been repeated ad nauseam: the pretense that these machines can execute every possible calculation within their input scope in parallel. It is hard to imagine a misconception better designed to put up a goalpost that no man-made machine could ever reach. Scott Aaronson is so incensed by this nonsense that it even inspired the title of his new book. It is a truly sorry state of affairs when even Nature apparently cannot find an author who doesn’t fall for it. Elizabeth Gibney’s recent online piece on quantum computing was yet another case in point. It starts off promising, as the subtitle is spot on:

After a 30-year struggle to harness quantum weirdness for computing, physicists finally have their goal in reach.

But then the reader’s mind is again poisoned with this nonsense:

Where a classical computer has to try each combination in turn, a quantum computer could process all those combinations simultaneously — in effect, carrying out calculations on every possible set of input data in parallel.

Part of the problem is that there are no other easy concepts a news author can quickly turn to when trying to offer up an explanation that a casual reader can understand while having his mind blown. (‘Wow, every possible combination at the same time!’ It’s like double rainbow all over again.)

Here’s my attempt to remedy this situation: a simple example to illustrate the extended capabilities of quantum computers versus classical machines. The latter are very fast, but when solving a complex puzzle, e.g. finding the lowest number in an unordered list, they have to take one stab at a time. It is like attacking an abstract problem space the way ancient mariners had to fathom the depth of the sea. (Gauging the depth with a rope in this manner is the original meaning of the word ‘fathom’.)
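To make the fathoming picture concrete, here is a minimal Python sketch (list values made up for illustration) of the classical one-probe-at-a-time approach:

    def fathom_minimum(depths):
        """Classical search: like a mariner with a sounding rope, we can
        only probe one spot of the problem space at a time."""
        best = depths[0]
        for d in depths[1:]:  # one 'fathoming' per element, no shortcuts
            if d < best:
                best = d
        return best

    print(fathom_minimum([7, 3, 9, 1, 5]))  # prints 1, after probing all five spots

However the list is arranged, a classical machine that knows nothing about its ordering has to touch every element once.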

You may argue that having several guys fathoming at the same time will give you a ‘parallelizing’ speed-up, but you would have to be a Luddite to the core to convince yourself that this could ever measure up to echolocation. Just as the latter can perceive data from a larger patch of seafloor, quantum computing can leverage more than just local point data. But this comes at a price: the signal that comes back is not easy to interpret. It depends on the original set-up of the probing signal and requires subsequent processing.

Like an echolocation system, a quantum computer doesn’t magically probe the entire configuration space. It ‘sees’ more, but it doesn’t provide this information in an immediately useful format.

The real challenge is to construct the process in a way that allows you to actually extract the answer to the computational problem you are trying to solve. This is devilishly difficult, which is why so few quantum algorithms exist. There are no simple rules to follow; creating one requires first and foremost inspiration, and it is as much art as science. That is why, when I learned how Shor’s algorithm works, I was profoundly astounded and awed by the inordinate creativity it must have taken to think it up.
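For a taste of what that creativity bought, here is a hedged Python toy of the classical scaffolding around Shor’s idea: factoring N reduces to finding the period r of f(x) = a^x mod N, and only the period finding is handed to the quantum computer. The sketch below finds the period by brute force instead, so it illustrates the reduction, not the speed-up:

    from math import gcd

    def shor_classical_part(N, a):
        """Find the period r of a^x mod N (by brute force here; this is
        the one step a quantum computer accelerates), then derive factors."""
        r = 1
        while pow(a, r, N) != 1:
            r += 1
        x = pow(a, r // 2, N)
        if r % 2 == 1 or x == N - 1:
            return None  # unlucky choice of a; pick another and retry
        return gcd(x - 1, N), gcd(x + 1, N)

    print(shor_classical_part(21, 2))  # (7, 3)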

Regardless, if this were the only problem with Elizabeth Gibney’s article, it would just be par for the course. Yet, while reporting on Google’s efforts to build their own quantum computing chip, she manages not even to mention the other quantum computer Google is involved with, and that despite D-Wave publishing in Nature in 2011 and, just last year, in Nature Communications.

Maybe if she hadn’t completely ignored D-Wave, she might have thought to ask Martinis the most pressing question of all: what kind of chip will he build for Google? Everything indicates that it is yet another quantum annealer, but the quotes in the article make it sound as if he were talking about gate computing:

“It is still possible that nature just won’t allow it to work, but I think we have a decent chance.”

Obviously he cannot possibly be referring to quantum annealing in this context, since that clearly works just fine with fairly large numbers of qubits (as shown in the above-mentioned Nature publication).

The current state of news reporting on quantum computing is beyond frustrating. There is a very real and fascinating race underway for the realization of the first commercially useful universal quantum computer. Will it be adiabatic or the gate model? Are quantum cellular automata still in the running?

But of course in order to report on this, you must first know about these differences. Apparently, when it comes to science news reporting, this is just too much to expect.

The Nature article also contains this little piece of information:

… the best quantum computers in the world are barely able to do school-level problems such as finding the prime factors of the number 21. (Answer: 3 and 7.)

I guess the fact that the answer is provided gives us a hint as to what level of sophistication the author expects from her audience, which in turn must be terribly confused to see a headline such as “New largest number factored on a quantum device is 56,153“.

This was of course not done with Shor’s algorithm but via adiabatic computing (and also involves some sleight of hand, as the algorithm only works for a certain class of numbers rather than for all integers).
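For the curious, here is a hedged Python sketch (all parameters illustrative) of the general idea behind casting factoring as a minimization problem: build a cost function whose ground state sits exactly at the factors, which is the kind of landscape an annealer searches. The published result additionally exploits number-specific simplifications; this brute-force scan only demonstrates the encoding:

    def factor_by_minimization(N, bits=8):
        """The cost (N - p*q)^2 is zero exactly when p*q == N, so the
        global minimum of this landscape encodes the factorization.
        An annealer would search the landscape; here we simply scan it."""
        best = None
        for p in range(3, 2 ** bits, 2):      # odd trial factors only
            for q in range(p, 2 ** bits, 2):
                cost = (N - p * q) ** 2
                if best is None or cost < best[0]:
                    best = (cost, p, q)
        return best

    print(factor_by_minimization(56153))  # (0, 233, 241): zero cost at the factors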

Nevertheless, adiabatic computing seems to have the upper hand when it comes to scaling the problem scope with a limited number of qubits. But the gate model also made some major news last month. Simon’s algorithm, a guinea pig of the field (one of the first you will learn when being introduced to it), has been demonstrated to provide the theoretically predicted quantum speed-up. This is huge news, which was immediately translated into the rather misleading headline “Simon’s algorithm run on quantum computer for the first time—faster than on standard computer”.

Faster in this case means fewer processing iterations rather than less elapsed time, but irrespective of that, having this theoretical prediction confirmed using the fairly recent one-way technique clearly bolsters the case that gate computing can deliver the goods.
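To see what ‘fewer iterations’ means in terms of query counts, consider Simon’s problem itself: a black-box function hiding a string s such that f(x) = f(x XOR s). Classically, recovering s amounts to hunting for a collision, which takes on the order of 2^(n/2) oracle queries, whereas Simon’s algorithm needs only about n quantum queries. Here is a hedged Python sketch of the classical side (the oracle construction is made up for illustration):

    import random

    def make_simon_oracle(n, s):
        """A random function on n-bit strings obeying Simon's promise
        f(x) == f(x ^ s), with one distinct output per {x, x ^ s} pair."""
        reps = [x for x in range(2 ** n) if x < (x ^ s)]
        outputs = random.sample(range(2 ** n), len(reps))
        f = {}
        for x, v in zip(reps, outputs):
            f[x] = f[x ^ s] = v
        return f

    def classical_find_s(f, n):
        """Query the oracle input by input until two inputs collide;
        any colliding pair differs by exactly the hidden string s."""
        seen = {}
        for x in range(2 ** n):
            if f[x] in seen:
                return x ^ seen[f[x]]
            seen[f[x]] = x

    n, s = 6, 0b101101
    print(bin(classical_find_s(make_simon_oracle(n, s), n)))  # 0b101101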

No doubt, the race between the  architectures to deliver the first commercial-grade universal quantum computer is on.  It is still wide open, and makes for a compelling story. Now, if we could only get somebody to properly report on it.


Progressing from the God Particle to the Gay Particle

… and other physics and QC news

The ‘god particle’, aka the Higgs boson, received a lot of attention. Not that this wasn’t warranted, but I can’t help suspecting that the justification of the CERN budget is partly to blame for the media frenzy. The gay particle, on the other hand, is no less spectacular, especially since its theoretical prediction predates that of the Higgs boson by far. Of course, what has been discovered is, yet again, not a real particle but ‘only’ a pseudo-particle, similar to the magnetic monopole that has been touted recently. And as usual, most pop-science write-ups fail entirely to remark on this rather fundamental aspect (apparently the journalists don’t want to bother their audience with such boring details). In case you want a more complete picture, this colloquium paper gives you an in-depth overview.

On the other hand, a pseudo-particle quantum excitation in a 2D superconductor is exactly what the doctor ordered for topological quantum computing, a field that has seen tremendous theoretical progress, generously sponsored by Microsoft. This research hinges entirely on employing these anyon pseudo-particles as a hardware resource, because they have the fantastic property of allowing for inherently decoherence-resistant qubits. It is as if theoretical computer scientists had started writing the first operating system in the roaring twenties of the last century, long before there was a computer or even a transistor, theorizing that a band gap in doped semiconductors should make it possible to build one. If this analogy were to hold, we would now be at the stage where a band gap has been demonstrated for the first time. So here’s to hoping this means we may see the first anyon-based qubit within the decade.

In the here and now of quantum computing, D-Wave merrily stays the course despite the recent Google bombshell news. It has been reported that they now have 12 machines operational, used in a hosted manner by their strategic partners (such as 1Qbit). They also continue to add staff from other superconducting outfits; recently, for example, Bill Blake left Cray to join the company as VP of R&D.

Last but not least, if you are interested in physics, you would have to live under a rock not to have heard the sensational news that numerical calculations purportedly proved that black holes cannot form and hence do not exist. Sabine Hossenfelder nicely deconstructs this. The long and short of it is that this argument has been going on for a long time, that the equations employed in this research have some counter-intuitive properties, and that the mass integral employed is not all that well motivated.

Einstein would have been happy had this panned out; after all, the research claims to succeed where he failed. But the critical reception of this numerical model has only just begun, and it may very well be torn apart like an unlucky astronaut in a strongly inhomogeneous gravitational field.

This concludes another quick round-up post. I am traveling this week and couldn’t make the time for a longer article, but I should find my way back to a more regular posting schedule next week.

What Defines a Quantum Computer?

Could run Minecraft, but you’d have to be comfortable with getting your blocks as binary strings.

Recently a friend of mine observed in an email discussion “I must admit I find it a little difficult to keep up with the various definitions of quantum computing.”

A healthy sign of enlightened confusion, because this already sets him apart from most people, who have yet to learn about this field and at best think that all quantum computers are more or less equivalent.

As computers became an integral part of people’s everyday lives, people essentially learned the truth of Turing completeness, even if they have never heard the term. By now, even a child exposed to various computing devices will quickly develop a sense that whatever one computer can do, another should be able to perform as well, with some allowance for the performance specs of the machine. Older, more limited machines may not be able to run current software for compatibility or memory scaling reasons, but there is no difference in principle that would prevent any computer from executing whatever has already been proven to work on another machine.

In the quantum computing domain, things are less clear-cut. In my earlier post, where I tried my hand at a quantum computing taxonomy, I focused on the maturity of the technology, less so on the underlying theoretical model. However, it is the dichotomy in the latter that has been driving the heated controversy over D-Wave’s quantumness.

When David Deutsch wrote his seminal paper, he followed in Turing’s footsteps, thinking through the consequences of putting a Turing machine into quantum superposition. This line of inquiry eventually gave rise to the popular gate model of quantum computing.

D-Wave, on the other hand, gambled on adiabatic quantum computing, and more specifically, an implementation of quantum annealing. In preparation for this post I sought to look up these terms in my copy of Nielsen and Chuang’s ‘Quantum Computation and Quantum Information’ textbook. To my surprise, neither term can be found in the index, and this is the 2010 anniversary edition. Now, this is not meant to knock the book; if you want to learn about the gate model, I don’t think you will find a better one. It just goes to show that neither the adiabatic nor the annealing approach was on the academic radar when the book was originally written – the first paper on adiabatic quantum computation (Farhi et al.) was published the same year as the first edition of this standard QIS textbook.

At the time it was not clear how the computational power of the adiabatic approach compared to the quantum gate model. Within a year, Vazirani et al. published a paper showing that Grover search can be implemented on this architecture with quantum speed-up. And although the notoriety of Shor’s algorithm overshadows Grover’s, the latter arguably has much more widespread technological potential. The Vazirani et al. paper also demonstrated that there will be problem instances that this QC model cannot solve efficiently, even though they can be tackled classically.

In 2004 a paper was submitted with a title that neatly sums it up: “Adiabatic Quantum Computation is Equivalent to Standard Quantum Computation” (Lloyd et al.)

If D-Wave had aimed for universal adiabatic quantum computation, maybe it would not have experienced quite as much academic push-back, but they pragmatically went after some lower-hanging fruit, i.e. quantum annealing. (Not that this stops MIT’s Seth Lloyd from claiming that the company uses his ideas when pitching his own QC venture.)

An adiabatic quantum computing algorithm encodes a problem into a cost function, in this case an energy function, which is then explored for its absolute minimum. For instance, if you try to solve the traveling salesman problem, your cost function would simply be the distance traveled for each itinerary. A simple classical gradient descent over this energy ‘landscape’ will quickly get stuck in a local minimum (for an analogy, think of balls rolling down a hilly landscape and collecting at some bottom close to where they started, and you get the idea). A truly quantum algorithm, on the other hand, can exploit ‘spooky’ quantum properties such as entanglement and the tunnel effect. In essence, it is as if our rolling balls could somehow sense that there is a deeper valley adjacent to their resting place and ‘tunnel through’ the barrier (hence the name). This gives these algorithms some spread-out look-ahead capability. But depending on your energy function, this may still not be enough.
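The difference is easy to demonstrate even with purely classical code. The hedged Python sketch below (landscape and parameters entirely made up) pits a greedy ‘rolling ball’ against simulated annealing, whose occasional uphill moves serve as a classical stand-in for the look-ahead that quantum tunneling provides:

    import math, random

    def energy(x):
        """Made-up 1-D landscape: local minima from the sine ripple,
        global minimum near x = 3.7 thanks to the parabolic term."""
        return math.sin(3 * x) + 0.1 * (x - 4) ** 2

    def greedy_descent(x, step=0.01):
        """Always move downhill; stops at the first local minimum."""
        while True:
            if energy(x - step) < energy(x):
                x -= step
            elif energy(x + step) < energy(x):
                x += step
            else:
                return x  # stuck: no downhill neighbor

    def anneal(x, steps=20000):
        """Occasionally accept uphill moves so the 'ball' can escape a
        shallow valley; the acceptance chance shrinks as t cools."""
        t = 2.0
        for _ in range(steps):
            cand = x + random.uniform(-0.5, 0.5)
            delta = energy(cand) - energy(x)
            if delta < 0 or random.random() < math.exp(-delta / t):
                x = cand
            t = max(0.01, t * 0.9995)
        return x

    print(greedy_descent(0.0))  # stuck in the valley nearest the start
    print(anneal(0.0))          # usually ends up near the global minimum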

The graph below illustrates this with a completely made-up cost function that, while entirely oversimplified, hopefully still captures the nature of the problem. To the extent that the look-ahead capabilities of an adiabatic algorithm remain locally limited, long flat stretches around a relative minimum (a ‘plain’ in the energy landscape) can still defeat it. I threw in some arbitrary bell curves as a stand-in for this local quantum ‘fuzziness’ (incidentally, ‘fuzziness’ is the more accurate translation of the term Heisenberg used for his famous relation).

To the left, this fuzzy width doesn’t stretch outside the bounds of the flat stretch (or rather, it is negligibly small outside any meaningful neighborhood of this local minimum).

On the other hand, further to the right there is good overlap between the absolute minimum and the local minimum closest to it (overlaid with the bell curve in green). This is where the algorithm will perform well.

D-Wave essentially performs such an algorithm, with the caveat that it does not allow completely arbitrary energy functions, but only those that can be shoe-horned into the Ising model.

This was a smart, pragmatic decision on their part, because the model was originally created to describe solid-state magnets imagined as little coupled elementary magnetic dipoles, and these map perfectly to the superconducting magnetic fluxes implemented on the chip.

In terms of complexity, even in a simple classical 2D toy model, the number of possible combinations is pretty staggering, as the video below nicely demonstrates. The corresponding energy function (the Hamiltonian in QM) is surprisingly versatile and can encode a large variety of problems.
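As a concrete illustration of that versatility, here is a hedged Python sketch with made-up couplings. The Ising energy is E = -Σ J_ij s_i s_j - Σ h_i s_i over spins s_i = ±1, and ‘solving’ an instance means finding the spin configuration of lowest energy. Four spins can still be brute-forced over all 16 configurations; the catch is that this count doubles with every added spin:

    import itertools

    def ising_energy(spins, J, h):
        """E = -sum_ij J_ij * s_i * s_j - sum_i h_i * s_i."""
        e = -sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
        e -= sum(hi * si for hi, si in zip(h, spins))
        return e

    # A tiny 4-spin instance with made-up couplings and fields:
    J = {(0, 1): 1.0, (1, 2): -0.5, (2, 3): 1.0, (0, 3): 0.7}
    h = [0.1, -0.2, 0.0, 0.3]

    ground = min(itertools.product([-1, 1], repeat=4),
                 key=lambda s: ising_energy(s, J, h))
    print(ground, ising_energy(ground, J, h))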


The Google-Martinis Chip Will Perform Quantum Annealing

Ever since the news that John M. Martinis will join Google to develop a chip based on the work performed at UCSB, speculation has abounded as to what kind of quantum architecture this chip will implement. According to this report, it is now clear that it will be adiabatic quantum computing:

But examining the D-Wave results led to the Google partnership. D-Wave uses a process called quantum annealing. Annealing translates the problem into a set of peaks and valleys, and uses a property called quantum tunneling to drill through the hills to find the lowest valley. The approach limits the device to solving certain kinds of optimization problems rather than being a generalized computer, but it could also speed up progress toward a commercial machine. Martinis was intrigued by what might be possible if the group combined some of the annealing in the D-Wave machine with his own group’s advances in error correction and coherence time.
“There are some indications they’re not going to get a quantum speed up, and there are some indications they are. It’s still kind of an open question, but it’s definitely an interesting question,” Martinis said. “Looking at that, we decided it would be really interesting to start another hardware approach, looking at the quantum annealer but basing it on our fabrication technology, where we have qubits with very long memory times.”

This leads to the next question: will this Google chip indeed be restricted to implementing the Ising model, like D-Wave’s, or will it strive for more universal adiabatic quantum computation? The latter has been theoretically shown to be computationally equivalent to gate-based QC. It seems odd to aim for just a marginal improvement of the existing architecture, as this article implies.

At any rate, D-Wave may retain the lead in qubit numbers for the foreseeable future if it sticks to no, or less costly, error correction schemes (leaving it to the coders to create their own). It will be interesting to eventually compare which approach will offer more practical benefits.

About that Google Quantum Chip

In light of the recent news that John Martinis is joining Google, it is worthwhile to check out this Google talk from last year:

It is an hour-long talk, but very informative. John Martinis does an excellent job of explaining, in very simple terms, how hardware-based surface code error correction works.

Throughout the talk he uses the gate model formalism. Hence it is quite natural to assume that this is what the Google chip will aim for. This is certainly reinforced by the fact that other publications, such as one from the IEEE, have also drawn a stark contrast between the Martinis approach and D-Wave’s quantum annealing architecture. This is certainly how I interpreted the news as well.

But on second thought, and after careful parsing of the press releases, the case is not as clear-cut. For instance, Technology Review quotes Martinis as follows:

“We would like to rethink the design and make the qubits in a different way,” says Martinis of his effort to improve on D-Wave’s hardware. “We think there’s an opportunity in the way we build our qubits to improve the machine.”

This sounds more like Martinis wants to build a quantum annealing chip based on his logical, error-corrected qubits. From an engineering standpoint this would make sense, as it should be easier to achieve than a fully universal gate-based architecture, and it would address the key complaint I have heard from developers programming the D-Wave chip, i.e. that they really would like to see error correction implemented on the chip.

On the other hand, in light of Martinis’ presentation, I presume he will regard such an architecture simply as another stepping stone towards universal quantum computation.

News Roundup


As school starts, I should find my way back to a regular blogging schedule. I usually drive my kids to German Saturday school and then pass the time at a nearby Starbucks updating this blog.

Job and family demanded too much of my time this summer. The former has gotten very interesting, as I am documenting a bank stress-testing system, but the learning curve is steep. And while I just had a pleasant one-week vacation at a pristine northern lake, it very much lacked Wifi connectivity and was not conducive to blogging. Yet I had plenty of time to read up on material that will make for future posts.

Back home, my kids happened to watch the Nova episode that features D-Wave and Geordie Rose, which prompted my mother-in-law to exclaim that she wants stock in this company. Her chance to act on this may come in the not too distant future: recently, D-Wave’s CEO hinted for the first time that there may be an IPO in the offing (h/t Rolf D).

Readers who follow the QC blogs have undoubtedly already learned about an interesting paper that supports D-Wave’s approach, since Geordie highlighted it on the company’s blog. The fact that Robert R. Tucci is looking for an experienced business partner to start a QC algorithm venture may also already qualify as old news. Bob is mostly focused on the gate model, but he is agnostic about the adiabatic approach, and he certainly displays impressive grit and a solid track record in consistently turning out patents and papers.

When it comes to love and business, timing is everything. The US allows for software patent protection of up to 20 years. This is a sufficiently long time frame to bet on gate QC becoming a reality. But there is still a bit of a chicken-and-egg problem associated with this technology. After all, it is much more difficult (Geordie Rose would argue unrealistically so) than what D-Wave is doing. Shor’s algorithm alone cannot justify the necessary R&D expense to develop and scale up the required hardware, but other, commercially more interesting algorithms very well may. Yet you only invest in developing those if there is a chance that you will eventually (within 20 years) have hardware to run them on. Currently, it still falls to academia to bridge the gap, e.g. with these Troyer et al. papers that give hope that quantum chemistry could see tangible speed-ups from even modestly sized gate-based quantum computers.

While quantum computing will remain a main theme of this blog, I intend to also get back to some more biographical posts that reflect on how the history of physics has evolved. Like any human history, it is full of the oddest twists and turns, which are more often than not edited out of the mainstream narrative. And just to be clear, this is not to suggest some grand conspiracy, but just another expression of the over-simplification that afflicts most popular science writing. Writing for the lowest common denominator often makes for rather poor results, but as Sabine observes:

The “interested public” is perfectly able to deal with some technical vocabulary as long as it comes with an explanation.

In the same vein, the intricacies of how scientific discovery progresses deserve some limelight, as they illuminate the roads less traveled. They also make for interesting thought experiments: imagining how physics might have developed if certain experiments or mathematics had been discovered earlier, or if a scientist’s life hadn’t been cut short.

My next post will deal in some such idle speculation.

Update: This just in: Google sets out on its own (h/t bettinman), planning to put $8B into its proprietary QC hardware effort, which makes me wonder if the investment will match IBM’s $3B commitment to reach the post-silicon era. It is not clear yet what this will mean for their relationship with D-Wave.

The Business Case for D-Wave

A tried and tested success formula for lazy journalism is the build-up and tear-down pattern.

The hype comes before the fall. In the context of information technology, Gartner trademarked the aptly named term “hype cycle”. Every technology starts out in obscurity, but some take off spectacularly, promising the moon and, with the notable exception of the Apollo program, falling a bit short of it. Subsequently, disillusionment sets in, sometimes going as far as vilification of the technology or product. Eventually sentiment hits rock bottom, and a more balanced and realistic view is adopted as the technology goes mainstream.

Even Web technology followed this pattern to some extent, as was clearly mirrored by the dot-com stock bubble. At the height of the exuberance, the web was credited with ushering in a New Economy that would unleash unprecedented productivity growth. By now it has, of course, vastly improved our lives and economies; it just didn’t happen quite as rapidly and profitably as the market originally anticipated.

D‑Wave’s technology will inevitably be subjected to the same roller coaster ride. When you make it to the cover of Time magazine, the spring for the tear-down reaction has been set, waiting for a trigger. The latter came in the form of the benchmarking performed by Matthias Troyer et al. While all of this is to be expected in the field of journalism, I was a bit surprised to see that one of the sites I link to in my blogroll followed this pattern as well. When D‑Wave was widely shunned by academia, Robert R. Tucci wrote quite positively about them, but he now seems to have given up all hope in their architecture in reaction to this one single paper. He makes the typical case against investing in D-Wave that I have seen argued many times by academics vested in the gate model of quantum computing.

The field came to prominence due to the theoretically well-established potential of gate-based quantum computers to outperform classical machines, and it was Shor’s algorithm that captured the public’s imagination (and the NSA’s attention). Factoring is widely considered to be an NP-intermediate problem, and Shor’s algorithm could clearly crack our encryption schemes if we had gate-based QC with thousands of qubits. Unfortunately, this is still sci-fi, and so the best that has been accomplished on this architecture so far is the factorization of 21. The quantum speed-up would be there if we had the hardware, but at this point it is the embodiment of something purely academic, with no practical relevance whatsoever.

There is little doubt in my mind that a useful gate-based quantum computer will eventually be built, just like, for instance, a space elevator. In both cases it is not a matter of ‘if’ but just a matter of ‘when’.

I’d wager we won’t see either within the next ten years.

Incidentally, it has been reported that a space elevator was considered as a Google Labs project, but was subsequently thrown out because it requires too many fundamental technological breakthroughs to become a reality. A D-Wave machine, on the other hand, Google did snap up.

So is this just a case of acquiring trophy hardware, as some critics on Scott’s blog contended? I.e. nothing more than a marketing gimmick? Or have they been snookered? Maybe they, too, have a gambling addiction problem, like D-Wave investors as imagined on the qbnets blog?

Of course none of this is the case. It just makes business sense. And this is readily apparent as soon as you let go of the obsession over quantum speed-up.

Let’s just imagine for a second that there was a company with a computing technology that was neither transistor- nor semiconductor-based. Let’s further assume that within ten years they managed to mature this technology so rapidly that it caught up with current CPUs in terms of raw performance, and that this was all done with chip structures that are orders of magnitude larger than what current conventional hardware has to deploy. Also, this new technology does not suffer from loss currents introduced via accidental quantum tunneling, but is actually designed around this effect and utilizes it. Imagine that they did all this with a fraction of the R&D sums spent on conventional chip architectures, and that, since the power consumption scales radically differently from that of current computers, putting another chip into the box will hardly double the energy consumed by the system.

A technology like this would almost be like the kind that IBM just announced to focus their research on, trying to find a way to the post-silicon future.

So our ‘hypothetical company’ sounds pretty impressive, doesn’t it? You’d think that a company like Google, which has enormous computational needs, would be very interested in test-driving an early prototype of such a technology. And since all of the above applies to D‑Wave, this is indeed exactly what Google is doing.

Quantum speed-up is simply an added bonus. To thrive, D‑Wave only needs to provide a practical performance advantage per kWh. The $10M up-front cost, on the other hand, is a non-issue. The machines are currently assembled like cars before the advent of the Ford Model T: most of the effort goes into the cooling apparatus and the interface with the chip, and there will clearly be plenty of opportunity to bring down manufacturing costs once production is revved up.

The chip itself can be mass-produced using adapted and refined lithographic processes borrowed from the semiconductor industry; hence the cost basis for a wafer of D‑Wave chips will not be that different from that of the chip in your laptop.

Just recently, D‑Wave’s CEO mentioned an IPO for the first time in a public talk (h/t Rolf D). Chances are, the early D-Wave investors will be laughing at the naysayers all the way to the bank long before a gate based quantum computer factors 42.

A book I regularly have to read to our three-year-old Luna. So far she has refrained from also requesting a gate-based quantum computer.

Just in Case You Want to Simulate an Ising Spin Glass Model

This is just a short update to include a link to the solver code developed by Matthias Troyer et al. that I discussed in my last blog post.

The advantage of conventional open-source code is that everybody can play with it. To the extent that the code faithfully emulates quantum annealing of an Ising spin model, this will be a net benefit to D-Wave, as it allows programmers to perform early testing of algorithmic concepts for their machine (similar to the emulator that is part of D-Wave’s evolving coding environment).

Matthias Troyer attached the source code to this earlier paper, and for easier download I put it into this zip file. As I am hard-pressed for time these days, I haven’t gotten to work with it yet, but I confirmed that it compiles on my OS X machine (“make an_ss_ge_fi” produced a proper executable).

Meanwhile, D-Wave continues to receive good press (h/t web 238) as the evidence for substantial quantum effects (entanglement and tunneling) keeps mounting, indicating that they can successfully keep decoherence at bay. (This older post on their corporate blog gives an excellent introduction to decoherence, illustrating how thermal noise gets in the way of quantum annealing.)

On the other hand, despite enormous theoretical efforts, it has not yet been decisively shown which quantum resources are required for the elusive quantum speed-up, but this recent paper in Nature (arxiv link) claims to have made substantial headway in isolating some of the necessary ingredients (h/t Theo).

While this is all quite fascinating in the academic sense, at this time I nevertheless regard quantum speed-up as overrated from a practical point of view. Once I manage to set aside some time, I will try to explain in my next blog post why I don’t think this has any bearing on D-Wave’s value proposition. There’s no place in the multiverse for the snarky, dark movie script that Robert Tucci offered up the other day 🙂

An Example of a Two-Qubit Contextuality Graph

A Bet Lost, Despite Entanglement

And Why There Still May be Some Free Cheese in my Future.

Occasionally I like to bet, and in Matthias Troyer I found somebody who took me up on it. I wrote about this bet a while ago, but back then I agreed not to identify him as my betting pal until his paper was published. The paper has now been out for a while, and it is high time to settle the first part of the bet.

The conditions were straightforward: can the D-Wave machine beat a single classical CPU? But of course we specified things a bit more precisely.

The benchmark is the time to find the ground state with 99% probability, considering not only the median over the problem instances but also the 90%, 95% and 99% quantiles. We agreed to base the bet on the 90% quantile, i.e. the test has to run long enough to ensure that for 90% or more of the instances the ground state is found with 99% probability.
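For the record, this is roughly how such a benchmark is computed. The sketch below (Python, with made-up per-instance success rates) follows the usual time-to-solution recipe: repeat a fixed-length annealing run until the ground state has been seen with 99% probability, then take quantiles over the problem instances:

    import math

    def time_to_solution(t_run, p_success, target=0.99):
        """Number of repetitions of a run with per-run success rate
        p_success needed to hit the target probability, times run length."""
        reps = math.log(1 - target) / math.log(1 - p_success)
        return t_run * max(reps, 1.0)

    def quantile(values, q):
        s = sorted(values)
        return s[min(int(q * len(s)), len(s) - 1)]

    t_run = 0.02  # seconds per annealing run (illustrative)
    p = [0.9, 0.6, 0.75, 0.3, 0.05, 0.8, 0.5, 0.95, 0.2, 0.65]  # per instance
    tts = [time_to_solution(t_run, pi) for pi in p]
    print("median:", quantile(tts, 0.5), "90% quantile:", quantile(tts, 0.9))

The hard instances, those with a low per-run success probability, dominate the high quantiles, which is exactly why the bet was pegged to the 90% quantile rather than the median.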

Assuming that Matthias gets to conduct his testing on the current and the next chip generation of D-Wave, we agreed to make this a two-part bet, i.e. the same betting conditions apply to each.

Unfortunately, I have to concede the first round. The D-Wave One more or less tied the classical machine, although there were some problem instances where it did better. So the following jars of maple syrup will soon be shipped to Zürich:

Not the most exquisite or expensive maple syrup, but 100% pure, and Canadian. The same kind I use at home, and I figure the plastic jars will tolerate shipping much better than glass would.

What I was obviously hoping for was a decisively clear performance advantage, but at this point that isn’t the case, unless you compare the machine to off-the-shelf optimizer software, as was done in the test conducted by McGeoch et al.

This, despite the evidence for quantum entanglement in D-Wave’s machines getting ever more compelling. A paper demonstrating eight-qubit entanglement has just been published in Physical Review X. Geordie blogged about it here, and it has already generated some great press (h/t Pierre O.), probably the most objective mainstream article on D-Wave I have seen yet. It is a welcome change from the drivel the BBC has put out on QC in the past.

So will I ever get some Raclette cheese in return for my maple syrup? The chances of winning the next part of my bet with Matthias hinge on the scaling behavior, as well as on the question of whether a class of hard problems can be identified for which quantum annealing finds the ground state significantly faster. For the generic, randomly generated problem set, scaling alone does not seem to cut it (although more data will be needed to be sure). So I am counting on D‑Wave’s ingenuity, as well as on the bright minds who now get to work hands-on with the machine.

Nevertheless, Matthias is confident he will win the bet even at 2000 qubits. He thinks D-Wave will have to improve much more than just the calibration to outperform a single classical CPU. On the other hand, when I had the pleasure of meeting him last year in Waterloo, he readily acknowledged that what the company has accomplished so far is impressive. After all, this is an architecture that was created within just ten years on a shoestring budget, competing against a multi-billion-dollar, decades-mature semiconductor industry.

Unfortunately, when his university, the venerated ETH Zürich (possibly the best engineering school on this planet), came out with this press release, it nevertheless (accidentally?) played into the old canard that D-Wave falsely claimed to have produced a universal quantum computer.

It puts into context the Chinese-whispers process depicted in the cartoon below, which I first put up in an earlier post. Unlike in the cartoon, where the press gets most of the blame, ever since I started paying attention to university press releases I have been compelled to notice that they are more often than not the true starting point of the distortions.

“The Science Newscycle” by Jorge Cham www.phdcomics.com