Category Archives: D-Wave

What Defines a Quantum Computer?

Could run Minecraft, but you’d have to be comfortable with getting your blocks as binary strings.

Recently a friend of mine observed in an email discussion “I must admit I find it a little difficult to keep up with the various definitions of quantum computing.”

A healthy sign of enlightened confusion, because it already sets him apart from most people, who have yet to learn about this field and at best assume that all quantum computers are more or less equivalent.

As computers became an integral part of people’s everyday lives, users essentially learned the truth of Turing completeness, even if they have never heard the term.  Now even a child exposed to various computing devices will quickly develop a sense that whatever one computer can do, another should be able to perform as well, with some allowance for the performance specs of the machine.  Older, more limited machines may not be able to run current software for compatibility or memory scaling reasons, but there is no difference in principle that would prevent any computer from executing whatever has already been proven to work on another machine.

In the quantum computing domain, things are less clear cut. In my earlier post where I tried my hand at a quantum computing taxonomy, I focused on the maturity of the technology, less so on the underlying theoretical model. However, it is the dichotomy in the latter that has been driving the heated controversy over D-Wave’s quantumness.

When David Deutsch wrote his seminal paper, he followed in Turing’s footsteps, thinking through the consequences of putting a Turing machine into quantum superposition. This line of inquiry eventually gave rise to the popular gate model of quantum computing.

D-Wave, on the other hand, gambled on adiabatic quantum computing, and more specifically, an implementation of quantum annealing.  In preparation for this post I sought to look up these terms in my copy of Nielsen and Chuang’s ‘Quantum Computation and Quantum Information’ textbook.  To my surprise, neither term can be found in the index, and this is the 2010 anniversary edition.  Now, this is not meant to knock the book, and if you want to learn about the gate model I think you won’t find a better one. It just goes to show that neither the adiabatic nor annealing approach was on the academic radar when the book was originally written – the first paper on adiabatic quantum computation (Farhi et al.) was published the same year as the first edition of this standard QIS textbook.

At the time it was not clear how the computational power of the adiabatic approach compared to the quantum gate model. Within a year, Vazirani et al. published a paper showing that Grover search can be implemented on this architecture with quantum speed-up.  And although the notoriety of Shor’s algorithm overshadows Grover’s, the latter arguably has much more widespread technological potential. The Vazirani et al. paper also demonstrated that there will be problem instances that this QC model cannot solve efficiently, even though they can be tackled classically.

In 2004 a paper was submitted with a title that neatly sums it up: “Adiabatic Quantum Computation is Equivalent to Standard Quantum Computation” (Lloyd et al.)

If D-Wave had aimed for universal adiabatic quantum computation, maybe it would not have experienced quite as much academic push-back, but the company pragmatically went after some lower-hanging fruit, i.e. quantum annealing. (Not that this stops MIT’s Seth Lloyd from claiming, when pitching his own QC venture, that the company uses his ideas.)

An adiabatic quantum computing algorithm encodes a problem into a cost function (in this case an energy function) that is then explored for its absolute minimum. For instance, if you try to solve the traveling salesman problem, your cost function would simply be the distance traveled for each itinerary. A simple classical gradient descent algorithm over this energy ‘landscape’ will quickly get stuck in a local minimum (for an analogy, think of balls rolling down a hilly landscape and collecting at some bottom close to where they started, and you get the idea).  A truly quantum algorithm, on the other hand, can exploit ‘spooky’ quantum properties such as entanglement and the tunnel effect. In essence, it is as if our rolling balls could somehow sense that there is a deeper valley adjacent to their resting place and “tunnel through” the barrier (hence the name).  This gives these algorithms some spread-out look-ahead capability.  But depending on your energy function, this may still not be enough.
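To make the rolling-ball picture concrete, here is a minimal Python sketch. The energy landscape is completely made up, just for illustration: plain gradient descent started on the left slope settles into the shallow local minimum and never finds the deeper valley, while a start on the right slope does.

```python
def energy(x):
    # Made-up 1D landscape: a shallow local minimum near x = -1 and the
    # deeper global minimum near x = 2 (the -0.4*x term tilts the wells).
    return 0.5 * (x + 1) ** 2 * (x - 2) ** 2 - 0.4 * x

def gradient_descent(x, lr=0.01, steps=2000):
    # Classical descent: always rolls downhill, no look-ahead.
    for _ in range(steps):
        grad = (energy(x + 1e-6) - energy(x - 1e-6)) / 2e-6  # numerical slope
        x -= lr * grad
    return x

x_local = gradient_descent(-2.0)   # starts on the left slope
x_global = gradient_descent(3.0)   # starts on the right slope
# x_local gets stuck near x = -0.96 even though energy(x_global) is much lower.
```

A quantum annealer, by contrast, has a finite amplitude to tunnel through the barrier separating the two valleys, which is exactly the look-ahead the balls lack.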

The graph below illustrates this with a completely made-up cost function that, while entirely oversimplified, hopefully still somewhat captures the nature of the problem. To the extent that the look-ahead capabilities of an adiabatic algorithm are still locally limited, long flat stretches around a relative minimum (a ‘plain’ in the energy landscape) can still defeat it. I threw in some arbitrary Bell curves as a stand-in for this local quantum ‘fuzziness’ (incidentally the correct translation for what Heisenberg called his famous relation).

To the left, this fuzzy width doesn’t stretch outside the bounds of the flat stretch (or rather, it is negligibly small outside any meaningful neighborhood of this local minimum).

On the other hand, further to the right, the fuzziness centered on the local minimum closest to the absolute one (overlaid with the bell curve in green) overlaps nicely with the latter.  This is where the algorithm will perform well.

D-Wave essentially performs such an algorithm, with the caveat that it does not allow completely arbitrary energy functions, only those that can be shoe-horned into the Ising model.

This was a smart, pragmatic decision on their part, because this model was originally created to describe solid-state magnets imagined as little coupled elementary magnetic dipoles, and these map perfectly to the superconducting magnetic fluxes implemented on the chip.

In terms of complexity, even a simple classical 2-d toy model allows for a staggering number of possible configurations, as the video below nicely demonstrates. The corresponding energy function (the Hamiltonian, in QM terms) is surprisingly versatile and can encode a large variety of problems.
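For readers who prefer code to formulas, here is a minimal Python sketch of the classical Ising energy function with a made-up three-spin instance (the couplings and fields are arbitrary placeholders). Brute-forcing the ground state works at this size, but the 2^n search space is exactly why it gets staggering so quickly.

```python
import itertools

def ising_energy(spins, J, h):
    # E(s) = -sum_{(i,j)} J[i,j]*s_i*s_j - sum_i h[i]*s_i, with s_i = ±1
    e = -sum(c * spins[i] * spins[j] for (i, j), c in J.items())
    e -= sum(h[i] * spins[i] for i in range(len(spins)))
    return e

# Toy instance: three spins in a row, ferromagnetic couplings,
# and a small bias field on the middle spin.
J = {(0, 1): 1.0, (1, 2): 1.0}
h = [0.0, 0.1, 0.0]

# Exhaustive search over all 2^3 = 8 configurations.
ground = min(itertools.product([-1, 1], repeat=3),
             key=lambda s: ising_energy(s, J, h))
# ground is (1, 1, 1): all spins aligned with the bias, energy -2.1
```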


The Google-Martinis Chip Will Perform Quantum Annealing

Ever since the news that John M. Martinis will join Google to develop a chip based on the work performed at UCSB, speculation has abounded as to what kind of quantum architecture this chip will implement.  According to this report, it is now clear that it will be adiabatic quantum computing:

But examining the D-Wave results led to the Google partnership. D-Wave uses a process called quantum annealing. Annealing translates the problem into a set of peaks and valleys, and uses a property called quantum tunneling to drill through the hills to find the lowest valley. The approach limits the device to solving certain kinds of optimization problems rather than being a generalized computer, but it could also speed up progress toward a commercial machine. Martinis was intrigued by what might be possible if the group combined some of the annealing in the D-Wave machine with his own group’s advances in error correction and coherence time.
“There are some indications they’re not going to get a quantum speed up, and there are some indications they are. It’s still kind of an open question, but it’s definitely an interesting question,” Martinis said. “Looking at that, we decided it would be really interesting to start another hardware approach, looking at the quantum annealer but basing it on our fabrication technology, where we have qubits with very long memory times.”

This leads to the next question: Will this Google chip indeed be similarly restricted to implementing the Ising model, like D-Wave’s, or strive for more universal adiabatic quantum computation? The latter has been shown theoretically to be computationally equivalent to gate-based QC. It seems odd to aim for just a marginal improvement of the existing architecture, as this article implies.

At any rate, D-Wave may retain the lead in qubit numbers for the foreseeable future if it forgoes error correction, or sticks to less costly error correction schemes (leaving it to the coders to create their own). It will be interesting to eventually compare which approach offers more practical benefits.

About that Google Quantum Chip

In light of the recent news that John Martinis is joining Google, it is worthwhile to check out this Google talk from last year:

It is an hour long talk but very informative. John Martinis does an excellent job at explaining, in very simple terms, how hardware-based surface code error correction works.

Throughout the talk he uses the gate model formalism.  Hence it is quite natural to assume that this is what the Google chip will aim for. This is reinforced by the fact that other publications, such as the IEEE’s, have also drawn a stark contrast between the Martinis approach and D-Wave’s quantum annealing architecture. It is certainly how I interpreted the news as well.

But on second thought, and careful parsing of the press releases, the case is not as clear cut. For instance, Technology Review quotes Martinis in this fashion:

“We would like to rethink the design and make the qubits in a different way,” says Martinis of his effort to improve on D-Wave’s hardware. “We think there’s an opportunity in the way we build our qubits to improve the machine.”

This sounds more like Martinis wants to build a quantum annealing chip based on his logical, error-corrected qubits.  From an engineering standpoint this would make sense, as it should be easier to achieve than a fully universal gate-based architecture, and it would address the key complaint I heard from developers programming the D-Wave chip, i.e. that they would really like to see error correction implemented on the chip.

On the other hand, in light of Martinis presentation, I presume that he will regard such an architecture simply as another stepping stone towards universal quantum computation.

News Roundup


As school starts, I should find my way back to a regular blogging schedule. I usually drive my kids to German Saturday school and then pass the time at a nearby Starbucks updating this blog.

Job and family demanded too much of my time this summer. The former has gotten very interesting, as I am documenting a bank stress-testing system, but the learning curve is steep. And while I just had a pleasant one-week vacation at a pristine northern lake, it sorely lacked Wi-Fi connectivity and was not conducive to blogging. Yet I had plenty of time to read up on material that will make for future posts.

Back home, my kids incidentally watched the Nova episode that features D-Wave and Geordie Rose, which prompted my mother-in-law to exclaim that she wants stock in this company. Her chance to act on this may come in the not too distant future. Recently, D-Wave’s CEO hinted for the first time that there may be an IPO in the offing (h/t Rolf D).

Readers who follow the QC blogs have undoubtedly already learned about an interesting paper that supports D-Wave’s approach, since Geordie highlighted it on the company’s blog. The fact that Robert R. Tucci is looking for an experienced business partner to start a QC algorithm venture with may also already qualify as old news – Bob is mostly focused on the Gate model, but is agnostic about the adiabatic approach, and certainly displays an impressive grit and track record in consistently turning out patents and papers.

When it comes to love and business, timing is everything. The US allows for software patent protection of up to 20 years. That is a sufficiently long time frame to bet on gate-based QC becoming a reality. But there is still a bit of a chicken-and-egg problem associated with this technology. After all, it is much more difficult (Geordie Rose would argue unrealistically so) than what D-Wave is doing. Shor’s algorithm alone cannot justify the R&D expense needed to develop and scale up the required hardware, but other commercially more interesting algorithms very well may. Yet you only invest in developing those if there is a chance that you’ll eventually (within 20 years) have hardware to run them on. Currently, it still falls to academia to bridge the gap, e.g. with these Troyer et al. papers that give hope that quantum chemistry could see tangible speed-ups from even modestly sized gate-based quantum computers.

While quantum computing will remain a main theme of this blog, I intend to also get back to some more biographical posts that reflect on how the history of physics has evolved. Like any human history, it is full of the oddest turns and twists, which are more often than not edited out of the mainstream narrative. And just to be clear, this is not to suggest some grand conspiracy, but just another expression of the over-simplification that afflicts most popular science writing. Writing for the lowest common denominator often makes for rather poor results, but as Sabine observes:

The “interested public” is perfectly able to deal with some technical vocabulary as long as it comes with an explanation.

In the same vein, the intricacy of how scientific discovery progresses deserves some limelight as it illuminates the roads less traveled. It also makes for interesting thought experiments, imagining how physics may have developed if certain experiments or math had been discovered earlier, or one scientist’s life hadn’t been cut too short.

My next post will deal in some such idle speculation.

Update: This just in: Google sets out on its own (h/t bettinman), planning to put $8B into its proprietary QC hardware effort, which makes me wonder if the investment will match IBM’s $3B push to reach the post-silicon era.  It is not yet clear what this will mean for Google’s relationship with D-Wave.

The Business Case for D-Wave

A tried and tested success formula for lazy journalism is the build-up and tear-down pattern.

The hype comes before the fall. In the context of information technology, Gartner copyrighted the aptly named term “hype cycle”. Every technology starts out in obscurity, but some take off spectacularly, promising the moon but, with the notable exception of the Apollo program, falling a bit short of it. Subsequently, disillusionment sets in, sometimes going as far as vilification of the technology or product. Eventually sentiment hits rock bottom, and a more balanced and realistic view is adopted as the technology goes mainstream.

Even Web technology followed this pattern to some extent, and this was clearly mirrored by the dot com stock bubble. At the height of the exuberance, the web was credited with ushering in a New Economy that would unleash unprecedented productivity growth. By now it has, of course, vastly improved our lives and economies, it just didn’t happen quite as rapidly and profitably as the market originally anticipated.

D‑Wave’s technology will inevitably be subjected to the same roller coaster ride. When you make it to the cover of Time magazine, the spring for the tear-down reaction has been set, waiting for a trigger. The latter came in the form of the testing performed by Matthias Troyer et al. While all of this is to be expected in journalism, I was a bit surprised to see that one of the sites I link to in my blogroll followed this pattern as well. When D‑Wave was widely shunned by academia, Robert R. Tucci wrote quite positively about them, but he now seems to have given up all hope in their architecture in reaction to this one single paper. He makes the typical case against investing in D-Wave that I’ve seen argued many times by academics vested in the gate model of quantum computing.

The field came to prominence due to the theoretically well-established potential of gate-based quantum computers to outperform classical machines. And it was Shor’s algorithm that captured the public’s imagination (and the NSA’s attention). Factoring is widely considered to be an NP-intermediate problem, and Shor’s algorithm could clearly crack our encryption schemes if we had gate-based QCs with thousands of qubits. Unfortunately, this is still sci-fi, and so the best that has been accomplished so far on this architecture is the factorization of 21. The quantum speed-up would be there if we had the hardware, but at this point it is the embodiment of something purely academic with no practical relevance whatsoever.

There is little doubt in my mind that a useful gate based quantum computer will be eventually built, just like, for instance, a space elevator. In both cases it is not a matter of ‘if’ but just a matter of ‘when’.

I’d wager we won’t see either within the next ten years.

Incidentally, it has been reported that a space elevator was considered as a Google Labs project but subsequently thrown out, as it would require too many fundamental technological breakthroughs to become a reality. On the other hand, Google snapped up a D-Wave machine.

So is this just a case of acquiring trophy hardware, as some critics on Scott’s blog contended? I.e. nothing more than a marketing gimmick? Or have they been snookered? Maybe they, too, have a gambling addiction problem, like D-Wave investors as imagined on the qbnets blog?

Of course none of this is the case. It just makes business sense. And this is readily apparent as soon as you let go of the obsession over quantum speed-up.

Let’s just imagine for a second that there was a company with a computing technology that was neither transistor nor semiconductor based. Let’s further assume that within ten years it managed to mature this technology so rapidly that it caught up to current CPUs in terms of raw performance, and that this was all done with chip structures that are orders of magnitude larger than what current conventional hardware needs to deploy. Also, this new technology does not suffer from leakage currents introduced via accidental quantum tunneling, but is actually designed around this effect and utilizes it. Imagine that they did all this with a fraction of the R&D sums spent on conventional chip architectures, and that, since the power consumption scales radically differently from that of current computers, putting another chip into the box will hardly double the energy consumed by the system.

A technology like this would be almost exactly the kind that IBM just announced it will focus its research on, trying to find a way to the post-silicon future.

So our ‘hypothetical company’ sounds pretty impressive, doesn’t it? You’d think that a company like Google that has enormous computational needs would be very interested in test driving an early prototype of such a technology. And since all of the above applies to D‑Wave this is indeed exactly what Google is doing.

Quantum speed-up is simply an added bonus. To thrive, D‑Wave only needs to provide a practical performance advantage per kWh. The $10M up-front cost, on the other hand, is a non-issue. The machines are currently assembled like cars before the advent of the Ford Model T. Most of the effort goes into the cooling apparatus and the interface with the chip, and there will clearly be plenty of opportunity to bring down manufacturing costs once production is revved up.

The chip itself can be mass-produced using adapted and refined lithographic processes borrowed from the semiconductor industry; hence the cost basis for a wafer of D‑Wave chips will not be that different from that of the chip in your laptop.

Just recently, D‑Wave’s CEO mentioned an IPO for the first time in a public talk (h/t Rolf D). Chances are, the early D-Wave investors will be laughing at the naysayers all the way to the bank long before a gate based quantum computer factors 42.

A book I have to regularly read to our three-year-old Luna. So far she has refrained from also requesting a gate-based quantum computer.

Just in Case You Want to Simulate an Ising Spin Glass Model

This is just a short update to include a link to the solver code developed by Matthias Troyer et al. that I discussed in my last blog post.

The advantage of conventional open source code is that everybody can play with it. To the extent that the code faithfully emulates quantum annealing of an Ising spin model, this will be a net benefit to D-Wave as it allows programmers to perform early testing of algorithmic concepts for their machine (similar to the emulator that is part of D-Wave’s evolving coding environment).
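For illustration, the flavor of such an emulation can be sketched in a few lines of Python. To be clear, this is plain classical single-spin-flip simulated annealing on a made-up ferromagnetic chain, a far cry from the optimized simulated quantum annealing code of the actual paper:

```python
import math
import random

def anneal_ising(J, h, n, sweeps=5000, beta_max=5.0, seed=1):
    # Classical single-spin-flip simulated annealing on the Ising energy
    # E(s) = -sum J[i,j]*s_i*s_j - sum h[i]*s_i, with a linear beta schedule.
    rng = random.Random(seed)
    s = [rng.choice([-1, 1]) for _ in range(n)]
    neigh = {i: [] for i in range(n)}
    for (i, j), c in J.items():
        neigh[i].append((j, c))
        neigh[j].append((i, c))
    for sweep in range(sweeps):
        beta = beta_max * (sweep + 1) / sweeps  # slowly lower the temperature
        for i in range(n):
            # Energy change if spin i were flipped.
            dE = 2 * s[i] * (sum(c * s[j] for j, c in neigh[i]) + h[i])
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
    return s

# Made-up instance: a ferromagnetic 8-spin chain with a bias field on the
# first spin, so the unique ground state is all spins +1.
n = 8
J = {(i, i + 1): 1.0 for i in range(n - 1)}
h = [0.5] + [0.0] * (n - 1)
result = anneal_ising(J, h, n)  # should settle into the all +1 ground state
```

The slowly increasing beta (i.e. decreasing temperature) plays the role here that the slowly decreasing transverse field plays in quantum annealing.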

Matthias Troyer attached the source code to this earlier paper, and for easier download I put it into this zip file. As I am hard pressed for time these days, I didn’t get to work with it yet, but I confirmed that it compiles on my OS X machine (“make an_ss_ge_fi” produced a proper executable).

Meanwhile D-Wave continues to receive good press (h/t web 238) as the evidence for substantial quantum effects (entanglement and tunneling) keep mounting, indicating that they can successfully keep decoherence at bay.  (This older post on their corporate blog gives an excellent introduction to decoherence, illustrating how thermal noise gets in the way of quantum annealing).

On the other hand, despite enormous theoretical efforts, it has not yet been decisively shown what quantum resources are required for the elusive quantum speed-up, but this recent paper in Nature (arxiv link) claims to have made substantial headway in isolating some of the necessary ingredients (h/t Theo).

While this is all quite fascinating in the academic sense, at this time I nevertheless regard quantum speed-up as overrated from a practical point of view. Once I manage to set some time aside I will try to explain in my next blog post why I don’t think that this has any bearing on D-Wave’s value proposition. There’s no place in the multiverse for the snarky, dark movie script that Robert Tucci offered up the other day 🙂

An Example of a Two-Qubit Contextuality Graph

A Bet Lost, Despite Entanglement

And Why There Still May be Some Free Cheese in my Future.

Occasionally I like to bet, and in Matthias Troyer I found somebody who took me up on it.  I wrote about this bet a while ago, but back then I agreed not to identify him as my betting pal until his paper was published. The paper has now been out for a while, and it is high time to settle the first part of the bet.

The conditions were straightforward: can the D-Wave machine beat a single classical CPU?  But of course we specified things a bit more precisely.

The benchmark used is the time needed to find the ground state with 99% probability; not only the median over problem instances is considered, but also the 90%, 95% and 99% quantiles. We agreed to base the bet on the 90% quantile, i.e. the test needs to run long enough to ensure that for 90% or more of the instances the ground state is found with 99% probability.
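The bookkeeping behind that criterion is easy to make explicit. The sketch below applies the standard time-to-solution formula (repeat a run often enough that at least one repetition finds the ground state with 99% confidence) to some made-up per-instance success probabilities; the 20 µs anneal time and the probabilities are pure placeholders.

```python
import math
import statistics

def time_to_solution(t_run, p_success, target=0.99):
    # Repetitions R with (1 - p_success)^R <= 1 - target, times the per-run
    # time: TTS = t_run * ln(1 - target) / ln(1 - p_success)
    if p_success >= target:
        return t_run
    return t_run * math.log(1.0 - target) / math.log(1.0 - p_success)

# Made-up success probabilities for ten problem instances, one run each.
p = [0.9, 0.5, 0.2, 0.05, 0.6, 0.3, 0.8, 0.1, 0.4, 0.7]
tts = sorted(time_to_solution(20e-6, pi) for pi in p)  # assume 20 µs per run

# The bet looks not at the median but at the 90% quantile: the time needed
# so that 90% of the instances reach the ground state with 99% certainty.
q90 = statistics.quantiles(tts, n=10)[-1]
```

The harder instances (small success probability per run) dominate the upper quantiles, which is exactly why the 90% quantile is a much stricter yardstick than the median.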

Assuming that Matthias gets to conduct his testing on the current and next chip generation of D-Wave, we agreed to make this a two part bet, i.e. same betting conditions for each.

Unfortunately, I have to concede the first round.  The D-Wave One more or less tied the classical machine, although there were some problem instances where it did better. So the following jars of maple syrup will soon be shipped to Zürich:

Not the most exquisite or expensive maple syrup, but 100% pure, and Canadian. It is the same kind I use at home, and I figure the plastic jars will tolerate shipping much better than glass.

What I was obviously hoping for was a decisively clear performance advantage, but at this point that isn’t the case, unless you compare the machine to off-the-shelf optimizer software, as was done in the test conducted by McGeoch et al.

This, despite the evidence for quantum entanglement in D-Wave’s machines getting ever more compelling. A paper demonstrating eight-qubit entanglement has just been published in Physical Review X. Geordie blogged about it here, and it has already generated some great press (h/t Pierre O.), probably the most objective mainstream article on D-Wave I’ve seen yet, and a welcome change from the drivel the BBC has put out on QC in the past.

So will I ever get some Raclette cheese in return for my maple syrup? The chances of winning the next part of my bet with Matthias hinge on the scaling behavior, as well as on the question of whether a class of hard problems can be identified on which quantum annealing finds the ground state significantly faster. For the generic randomly generated problem set, scaling alone does not seem to cut it (although more data will be needed to be sure).  So I am counting on D‑Wave’s ingenuity, as well as the bright minds who now get to work hands-on with the machine.

Nevertheless, Matthias is confident he’ll win the bet even at 2000 qubits. He thinks D-Wave will have to improve much more than just the calibration to outperform a single classical CPU. On the other hand, when I had the pleasure of meeting him last year in Waterloo, he readily acknowledged that what the company has accomplished so far is impressive. After all, this is an architecture that was created within just ten years on a shoestring budget, compared to the multibillion-dollar, decades-mature semiconductor industry.

Unfortunately, when his university, the venerated ETH Zürich (possibly the best engineering school on this planet), came out with this press release, it nevertheless (accidentally?) played into the old canard that D-Wave falsely claimed to have produced a universal quantum computer.

It puts into context the Chinese-whispers process depicted in the cartoon that I put up in an earlier post. Unlike there, where the press gets most of the blame, ever since I started paying attention to university press releases I have been compelled to notice that they are more often than not the true starting point of the distortions.

“The Science Newscycle” by Jorge Cham www.phdcomics.com


D-Wave Withdrawal Relief

For anybody needing an immediate dose of D-Wave news, Wired has this long, well-researched article (Robert R. Tucci summarized it in visual form on his blog). It strikes a pretty objective tone, yet I find the uncritical acceptance of Scott Aaronson’s definition of quantum productivity a bit odd.  As a theorist, Scott is only interested in quantum speed-up. That kind of tunnel vision is appropriate for his line of work, an occupational hazard that goes with the job, but it doesn’t make for a complete picture.

Other than that, the article only has some typical minor problems with QM.

At this point, you don’t really expect a journalist to get across how gate model quantum algorithms work, and the article actually does this better than most. But the following bit is rather revealing. The writer, Clive Thompson, describes visually inspecting the D-Wave chip:

Peering in closely, I can just make out the chips, each about 3 millimeters square. The niobium wire for each qubit is only 2 microns wide, but it’s 700 microns long. If you squint very closely you can spot one: a piece of the quantum world, visible to the naked eye.

SQUIDs for magnetometers don’t have to be very small (photo approximately to scale, as indicated by the handwriting on this lab sample). This is because for this application you want to measure the magnetic flux enclosed by the loop.

An innocuous enough quote, and most physicists wouldn’t find anything wrong with it either, but therein lies the rub: SQUIDs can be fairly large (see photo to the right).

Any superconducting coil can harbour a coherent quantum state, and they can be huge.

The idea that quantum mechanics somehow only governs the microcosm has been with us since its inception, because that is what was experimentally accessible at the time, i.e. atomic spectra.  But it is a completely outdated notion.

This is something I only fully came to grasp after reading Carver Mead’s brilliant little book Collective Electrodynamics. In it, he makes a very compelling case that we are due for another paradigm change.  To me, that means dusting off some of Schrödinger’s original wave mechanics ideas. If we were to describe a simple quantum algorithm using that picture, there would be a much better chance of giving non-physicists an idea of how these computation schemes work.

 

The Church of D-Wave

Are You a BDeliever?

Science and religion have a difficult relationship, and sometimes they combine in the most obscure manner, such as when Scientology was conceived.  The latter seems to have lost much of its appeal and followers, but it seems another new religion is poised to grab the mantle.

That is, if one is willing to follow Scott Aaronson’s rationale that believing in the achievability of significant speed-up with D-Wave’s architecture is a matter of faith. Ironically, Scott, who teaches computer science at MIT, made this comment about the same time that the MIT Technology Review named D-Wave to its Top 50 Smartest Companies list, an illustrious selection that any company would be delighted to be included in. The only quibble I have with this list is that it ranks Elon Musk’s SpaceX ahead of D-Wave, my point being that quantum mechanics is harder than rocket science. After all, with the latter everybody can decide whether your spacecraft made it into orbit or not (classical mechanics is so straightforward).  With D-Wave, on the other hand, we still have the ongoing high-profile battle over the question of how quantum the machine actually is (ever since Schrödinger, uncertainty about what is in a box seems to be a constant of quantum mechanics).

Another paper buttresses the company’s claims that there is substantial entanglement present on their chip.  This prompted Prof. Vazirani, whom I experienced as a most delightful, soft-spoken academic when checking out his quantum computing MOOC, to come out swinging.  The New York Times quotes him as saying:

“What I think is going on here is that they didn’t model the ‘noise’ correctly. (….) One should have a little more respect with the truth.”

In academic parlance these are fighting words.  And so the show goes on.

But I want to take a break from this for a moment and focus on another question: How did a startup like D-Wave get to this point?  Time magazine front-page material, coverage in the New York Times, being named in the same breath as SpaceX.  From a business perspective this is nothing but an amazing success story. And to me, the question of what makes for successful entrepreneurship is of no less interest than the science and technology.

Geordie got closer to having a shot at Olympic gold than most of us, having been an alternate on the Canadian wrestling team at the 1996 Olympic Games, so getting this one may have been bittersweet.

Flying into Vancouver, I imagined Geordie Rose to be a Steve Jobs-like character, about whom it was famously quipped that he was surrounded by his own reality distortion field, an invisible force that made others see the world as he did and buy into his vision. And although I never had the pleasure of meeting Steve Jobs, I think it is safe to say that Geordie is nothing like him. If I had to describe him in one word, I’d say he is quintessentially “Canadian”, in terms of the positive attributes that we typically like to associate with our national character. (Full disclosure: Technically I am not Canadian yet, just a permanent resident.)

Given the amazing success that D-Wave has had, and the awards and accolades that he himself has received, I was impressed with his unassuming demeanor. Hard to imagine Geordie would ever park his car in a handicap spot, as Jobs was fond of doing, to shave a couple minutes off his commute.

D-Wave just moved to new, enlarged premises. In their old building Geordie occupied an interior office without windows. I naturally assumed that he would have upgraded that. So I was surprised to learn that his new workspace still doesn’t have any windows. His explanation was simple: it allows him to be close to his team.

My takeaway is that visionaries cannot be pigeonholed: when talking to Geordie it quickly became obvious that his focus and dedication to making his vision a reality are ironclad, and his excitement is infectious.  So there is one key similarity to Steve Jobs after all, and then there is of course this, which goes without saying:

Great entrepreneurs never do it for the money.

Prof. Vazirani must have picked up on D-Wave’s commitment to making quantum computing work, as the New York Times also quotes him as saying about D-Wave that “after talking with them I feel a lot better about them. They are working hard to prove quantum computing.”

That Geordie picked an approach so abhorred by theorists I attribute to yet another aspect that, in my mind, marks great entrepreneurship: an almost ruthless pragmatism. By focusing on the less proven quantum annealing on a chip, he managed in just seven years to turn out an entirely new computing platform.  Meanwhile, the advances in superconducting foundry know-how that his company ushered in will also benefit other approaches, such as the gate-based implementation that UCSB’s John Martinis plans to scale up to 1000 qubits within five years.

To me, there is no doubt that the hurry to get something to market is a net benefit to the entire quantum computing field, as I expect it will attract more private capital. Quantum computing is now no longer perceived as something nebulous, something that may just happen 25 years down the road.

Game changers polarize.  So if we pay heed to Scott Aaronson’s rhetoric, Geordie clearly has a leg up on Steve Jobs.  Where the latter had a cult following, Geordie is on his way to having his own religion.  Maybe that explains the following recent exchange on D-Wave’s blog:

[Image: D-Wave_blog]


(h/t Rolf D. and commenter Copenhagen for pointing me to material for this post.)

He Said She Said – How Blogs are Changing the Scientific Discourse

The debate about D-Wave‘s “quantumness” shows no signs of abating, hitting a new high note with the company being prominently featured on Time magazine’s recent cover, which prompted a dissection of the article on Scott Aaronson’s blog. This was quickly followed by yet another scoop: a rebuttal by Umesh Vazirani to Geordie Rose, who had recently blogged about the Vazirani et al. paper that casts doubt on D-Wave’s claim to implement quantum annealing. In his take on the Time magazine article, Scott bemoans the ‘he said she said’ template of journalism, which gives all sides equal weight, while acknowledging that Time’s author Lev Grossman quoted him correctly and obviously tries to paint an objective picture.

If I had to pick the biggest shortcoming of the Time article, my choice would have been different. I find that Grossman entirely misses Scott’s role in this story by describing him as “one of the closest observers of the controversy“. Scott isn’t just an observer in this; for better or worse, he is central to this controversy. As far as I can tell, his reporting on D-Wave’s original demo is what started it to begin with. Unforgettable: his inspired comparison of the D-Wave chip to a roast beef sandwich, which he then famously retracted when he resigned as D-Wave’s chief critic. The latter is something he’s done with some regularity: first when D-Wave started to publish results, then after visiting the company, and most recently after the Troyer et al. pre-print appeared on the arXiv (although the second time doesn’t seem to count, since it was just a reiteration of the first resignation).

And they say sandwiches and chips go together ...

Scott’s resignations never seem to last long. D-Wave has a knack for pushing his buttons. And the way he engages D-Wave and associated research is indicative of a broader trend in how blogs are changing the scientific discourse. For instance, when Catherine McGeoch gave a talk about her benchmarking of the DW2, Scott did not immediately challenge her directly but took to his blog (a decision he later regretted and apologized for). Anybody who has spent more than five minutes on a Web forum knows how immediate, yet text-only, communication removes inhibitions and leads to more forceful exchanges. In the scientific context, this has the interesting effect of colliding head-on with the loftier perception of the scientist. It used to be that arguments were conducted only via scientific publications, in person such as at scientific seminars, or in the occasional exchange of letters. It is interesting to contemplate how corrosive the arguments between Bohr and Einstein might have turned out had they been conducted via blogs rather than in person. But it’s not all bad. In the olden days, science could easily be mistaken for a bloodless intellectual game, but nobody could read through the hundreds of comments on Scott’s blog that day and come away with that impression. To the contrary, the inevitable conclusion is that scientific arguments are fought with no less passion than the most heated bar brawl.

During this epic blog ‘fight’, Scott summarized his preference for the medium thusly:

“… I think this episode perfectly illustrates both the disadvantages and the advantages of blogs compared to face-to-face conversation. Yes, on blogs, people misinterpret signals, act rude, and level accusations at each other that they never would face-to-face. But in the process, at least absolutely everything gets out into the open. Notice how I managed to learn orders of magnitude more from Prof. McGeoch from a few blog comments, than I did from having her in the same room …”

It is by far not the only controversy he has courted, nor is this something unique to his blog. Peter Woit continues the heretical work he started with his ‘Not Even Wrong’ book, Robert R. Tucci fiercely defends his quantum algorithm work when he feels he is not credited, and Sabine Hossenfelder had to ban a highly qualified string theory troll due to his nastiness (she is also a mum of twins, so you know she has practice in being patient, and it’s not as if she lacks a good sense of humor). But my second favorite science blog fight also occurred on Scott’s blog, when Joy Christian challenged him to a bet to promote his theory that supposedly invalidates the essential non-locality of quantum mechanics due to Bell’s theorem.

It’s instructive to look at the Joy Christian affair and ask how a mainstream reporter could possibly have covered it. Not knowing Clifford algebra, what could a reporter do but triangulate the expert opinions? Some outspoken, smart critics point to mistakes in Joy Christian’s reasoning, yet he claims that these criticisms rest on a flawed understanding and have been rebutted. The reporter will also note that doubting Bell’s theorem is very much a minority position, yet a journalist unable to check the math himself can only fall back on the ‘he said she said’ template. After all, this is not a simple, straightforward fact like reporting whether UN inspectors found Saddam Hussein’s weapons of mass destruction (something that, surprisingly, most mainstream media outside the US accomplished just fine). One cannot expect a journalist to settle an open scientific question.
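For readers who do want to check at least part of the math themselves: what is at stake in Bell’s theorem can be made concrete with the standard CHSH inequality. Any local hidden-variable model bounds the CHSH combination of correlations by |S| ≤ 2, while quantum mechanics predicts up to 2√2 for a maximally entangled pair. The following is a minimal numpy sketch (the angles and the singlet state are the textbook choices, not anything specific to Joy Christian’s papers):

```python
import numpy as np

# Pauli matrices and the two-qubit singlet state |psi> = (|01> - |10>)/sqrt(2)
sz = np.array([[1, 0], [0, -1]], dtype=float)
sx = np.array([[0, 1], [1, 0]], dtype=float)
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def correlation(alpha, beta):
    """Expectation value of spin measurements along angles alpha and beta
    (in the x-z plane), one on each half of the singlet pair."""
    A = np.cos(alpha) * sz + np.sin(alpha) * sx
    B = np.cos(beta) * sz + np.sin(beta) * sx
    return singlet @ np.kron(A, B) @ singlet

# Measurement angles that maximize the quantum CHSH violation
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = (correlation(a, b) - correlation(a, b2)
     + correlation(a2, b) + correlation(a2, b2))

print(abs(S))  # ~2.828, i.e. 2*sqrt(2), above the local-realist bound of 2
```

Experiments side with the quantum prediction, which is why overturning Bell’s theorem would be such an extraordinary claim; yet verifying a ten-line calculation like this is still a far cry from adjudicating a dispute about Clifford algebra.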

The nature of the D-Wave story is no different: how is Lev Grossman supposed to do anything but report the various stances on each side of the controversy? A commenter on Scott’s blog dismissively pointed out that Grossman doesn’t even have a science degree. As if that made any difference; it’s not as if anybody else on either side of the story doesn’t boast such degrees (non-PhDs are in the minority at D-Wave).

Mainstream media report as they always have, and unsettled scientific questions are the exception to the rule, one of the few cases where ‘he said she said’ journalism is actually the best format. For everything else, we fortunately now have the blogs.