Category Archives: Quantum Computing

Riding the D-Wave


Back in the day, before he re-resigned as D-Wave's chief critic, Scott Aaronson made a well-reasoned argument as to why he thought this academic, and at times vitriolic, scrutiny was warranted. He argued that a failure of D-Wave to deliver a quantum speed-up would set the field back, similar to the AI winter triggered by Marvin Minsky's Perceptrons book.

Fortunately, quantum annealers are not perceptrons. For the latter, it can be rigorously proven that a single-layer perceptron is of very limited use. Ironically, at the time the book was published, multilayered perceptrons, a concept that is now fundamental to all deep learning algorithms, were already known, but in the ensuing backlash research funding for those also dried up completely. The term "perceptron" became toxic and is now all but extinct.

Could D-Wave be derailed by a proof that shows that quantum annealing could, under no circumstances, deliver a quantum speed-up? To me this seems very unlikely, not only because I expect that no such proof exists, but also because, even if one did, there would still be a practical speed-up to be had. If D-Wave manages to keep doubling their integration density at the same rapid clip as in the past, then their machines will eventually outperform any classical computing technology in terms of annealing performance. This article (h/t Sol) expands on this point.

So far there is no sign that D-Wave will slow its manic pace. The company recently released its latest chip generation, featuring quantum annealing with an impressive 1000+ qubits (in practice, the usable number will be smaller, as qubits will be consumed for problem encoding and software ECC). This was followed by a detailed test under the leadership of Catherine McGeoch, and it will be interesting to see what Daniel Lidar, and other researchers with access to D‑Wave machines, will find.

My expectation has been from the get-go that D-Wave will accelerate the development of this emerging industry, and attract more money to the field. It seems to me that this is now playing out.

Intel recently (and finally, as Robert Tucci points out) entered the fray with a $50M investment. While this is peanuts for a company of Intel's size, it's an acknowledgement that they can't leave the hardware game to Google, IBM, or start-ups such as Rigetti.

On the software side, there's a cottage industry of software start-ups hitching their wagons to the D-Wave engine. Many of these are still in stealth mode, or early stage such as QC Ware, while others are already starting to receive some well-deserved attention.

Then there are also smaller vendors of established software and services that already have a sophisticated understanding of the need to be quantum ready, something I expect to see much more of in the coming years as the QC hardware race heats up.

The latest big-name entry into the quantum computing arena was Alibaba, but at this time it is not clear what this Chinese initiative will focus on. Microsoft, on the other hand, is a known quantity: it will not get aboard the D‑Wave train, but will focus exclusively on quantum gate computing.

Other start-ups, like ours, straddle the various QC hardware approaches. In our case, this comes "out-of-the-box", because our core technology, Quantum Bayesian Networks, as developed by Robert Tucci, is an ideal tool to abstract from the underlying architecture. Another start-up that is similarly architecture-agnostic is Cambridge QC. The recent news about this company brings to mind how quickly reality can imitate satire. While short of the $1B seed round of this April Fool's spoof, the influx of $50M from the Chile-based Grupo Arcano is an enormous amount for a QC software firm that, as far as I know, holds no patents.

Some astoundingly big bets are now being placed in this field.






Classic Quantum Confusion

By now I am pretty used to egregiously misleading summarization of physics research in popular science outlets, sometimes fanned by the researchers themselves. Self-aggrandizing, ignorant papers sneaked into supposedly peer-reviewed journals by non-physicists are also just par for the course.

But this one is in a class of its own. Given the headline and the introductory statement that "a fully classical system behaves like a true quantum computer", it essentially creates the impression that QC research must be pointless. Only much later does it sneak in the obvious: an analog emulation, just like one on a regular computer, can't possibly scale past about 40 qubits, due to the exponential growth in required computational resources.
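The exponential wall is easy to make concrete. A classical emulation has to track one complex amplitude per basis state, so at the usual 16 bytes per complex number a back-of-the-envelope estimate gives

$$
2^{n}\ \text{amplitudes} \times 16\ \text{bytes} \quad\Rightarrow\quad 2^{40} \times 16\ \text{bytes} \approx 17.6\ \text{TB at } n = 40,
$$

and every additional qubit doubles that figure.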

But that's not the most irritating aspect of this article.

Don't get me wrong, I am a big fan of classical quantum analog systems. I think they can be very educational, if you know what you are looking at (Spreeuw 1998). The latter paper is actually quoted by the authors, and it is very precise in distinguishing between quantum entanglement and its classical analog. But that's not what their otherwise fine paper posits (La Cour et al. 2015). The authors write:

"What we can say is that, aside from the limits on scale, a classical emulation of a quantum computer is capable of exploiting the same quantum phenomena as that of a true quantum system for solving computational problems."

If it weren't for the reporting, I would put this down as sloppy wording that slipped past peer review, but if the authors are quoted correctly, then they indeed labour under the assumption that they faithfully recreated quantum entanglement in their classical analog computer - mistaking the model for the real thing.

It makes for a funny juxtaposition, though, when filtering the news by 'quantum physics'.

[Screenshot: news feed filtered by 'quantum physics', 2015-05-28]

The second article refers to a new realization of Wheeler's delayed choice experiment (where the non-local entanglement across space is essentially swapped for one across time).

If one takes Brian La Cour at his word, then, according to his other paper, he suggests that these kinds of phenomena should also have a classical analog.

So it's not just hand-waving when he makes this rather outlandish-sounding statement with regard to achieving an analog to the violation of Bell's inequality:

"We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of [Bell's inequality violating] entanglement as well, as described in another recent publication."

Of course talk is cheap, but if this research group could actually demonstrate such a classical analog of a Bell inequality violation, it certainly could change the conversation.

Will Super Cool SQUIDs Make for an Emerging Industry Standard?

This older logarithmic (!) D-Wave graphic gives an idea of how extreme the cooling requirements are for SQUID-based QC (it used to be part of a really cool SVG animation, but unfortunately D-Wave no longer hosts it).

D‑Wave had to break new ground in many engineering disciplines.  One of them was the cooling and shielding technology required to operate their chip.

To this end they are now using ANSYS software, which of course makes for very good marketing for this company (h/t Sol Warda). So good, in fact, that I would hope D‑Wave negotiated a large discount for serving as an ANSYS reference customer.

Any SQUID based quantum computing chip will have similar cooling and shielding requirements, i.e. Google and IBM will have to go through a similar kind of rigorous engineering exercise to productize their approach to quantum computing, even though this approach may look quite different.

Until recently, it would have been easy to forget that IBM is another contender in the ring for SQUID-based quantum computing, yet the company's researchers have been working diligently outside the limelight - they last created headlines three years ago. And unlike other quantum computing news, which often touts only marginal improvements, their recent results deserve to be called a breakthrough, as they improved upon the kind of hardware error correction that Google is betting on.

IBM has been conducting fundamental quantum technology research for a long time, this image shows the company's name spelled out using 35 xenon atoms, arranged via a scanning tunneling microscope (a nano visualization and manipulation device invented at IBM).

Obviously, the better your error correction, the more likely you will be able to achieve quantum speed-up when you pursue an annealing architecture like D‑Wave's, but IBM is not after yet another annealer. Most articles on the IBM program report that IBM is building a "real quantum computer", and the term clearly originates from within the company (e.g. this article attributes it to scientists at IBM Research in Yorktown Heights, NY). This leaves little doubt about their commitment to universal gate-based QC.

The difference in strategy is dramatic. D‑Wave decided to forgo surface code error correction on the chip in order to get a device to the market.  Google, on the other hand, decided to snap up the best academic surface code implementation money could buy, and also emphasized speed-to-market by first going for another quantum adiabatic design.

All the while, IBM researchers first diligently worked through the stability of SQUID-based qubits. Even now, having achieved the best available error correction, they clearly signal that they don't consider it good enough for scale-up. It may take yet another three years for them to find the optimal number and configuration of logical qubits that achieves the kind of fidelity they need to then tackle an actual chip.

It is a very methodical engineering approach. Once the smallest building block is perfected, they will have the confidence to go for the moonshot. It's also an approach that only a company with very deep pockets can afford, one with a culture that allows for the pursuit of a decades-long research program.

Despite the differences, in the end all SQUID-based chips will have to be operated very close to absolute zero. IBM's error correction may eventually give it a leg up over the competition, but I doubt that standard liquid helium fridge technology will suffice for a chip that implements dozens or hundreds of qubits.

By the time IBM enters the market there will be more early adopters of the D‑Wave and Google chips, and the co-opetition between these two companies may have given birth to an emerging industry standard for the fridge technology. In a sense, this may lower the barriers to entry for new quantum chips, if a new entrant can leverage this existing infrastructure. It would probably be a first for IBM to cater to a chip-interfacing standard that the company did not help to design.

So while there's been plenty of news in the quantum computing hardware space to report, it is curious, and a sign of the times, that a recent Washington Post article on the matter opted to headline with a quantum computing software company, QxBranch. (Robert R. Tucci channeled the journalists at the WP when he wrote last week that the IBM news bodes well for software start-ups in this space.)

While tech and business journalists may not (and may possibly never) understand what makes a quantum computer tick, they understand perfectly well that any computing device is just dead weight without software, and that the latter will make the value proposition necessary to create a market for these new machines.




How many social networks do you need?

The proliferation of social networks seems unstoppable now. Even the big ones can no longer be counted on one hand: Facebook, LinkedIn, GooglePlus, Twitter, Instagram, Tumblr, Pinterest, Snapchat - I am so uncool I didn't even know about the last one until very recently. It seems there has to be a natural saturation point, with diminishing marginal returns for signing up to yet another one, but apparently we are still far from it.

Recently via LinkedIn I learned about a targeted social network that I happily signed up for, which is quite against my character (i.e. I still don't have a Facebook account).

Free to join and no strings attached. (This targeted social network is not motivated by a desire to monetize your social graph).

The aptly named International Quantum Exchange for Innovation is a social network set up by DK Matai with the express purpose of bringing together people of all walks of life anywhere on this globe who are interested in the next wave of the coming Quantum Technology revolution. If you are as much interested in this as I am, then joining this UN of Quantum Technology, as DK puts it, is a no-brainer.

The term 'revolution' is often carelessly thrown around, but in this case, when it comes to the new wave of quantum technologies, I think it is more than justified. After all, the first wave of QM-driven technologies powered the second leg of the Industrial Revolution. It started with a bang, in the worst possible manner, when the first nuclear bomb ignited, but the new insights gained led to a plethora of new high-tech products.

Quantum physics was instrumental in everything from solar cells, to lasers, to medical imaging (e.g. MRI) and, of course, first and foremost, the transistor. As computers became more powerful, Quantum Chemistry coalesced into an actual field, feeding on the ever-increasing computational power. Yet Moore's law proved hardly adequate for its insatiable appetite for the compute cycles demanded by the underlying quantum numerics.

Richard Feynman's (too short) life span covered the military as well as the civilian applications of quantum mechanics, and his famous "There's Plenty of Room at the Bottom" talk can be read as a programmatic outline of the first Quantum Technology revolution. This QT 1.0 wave has almost run its course. We made our way to the bottom, but there we encountered entirely new possibilities by exploiting the essential, counter-intuitive non-localities of quantum mechanics. This takes it to the next step, and again Information Technology is at the forefront. It is a testament to Feynman's brilliance that he anticipated QT 2.0 as well, when he suggested a quantum simulator for the first time, much along the lines of what D-Wave built.

It is apt and promising that the new wave of quantum technology does not start with a destructive big bang, but an intriguing and controversial black box.



Quantum Computing Road Map

No, we are not there yet, but we are working on it.

Qubit spin states in diamond defects don't last forever, but they can last outstandingly long, even at room temperature (measured in microseconds, which is a long time when it comes to computing).

So this is yet another interesting system added to the list of candidates for potential QC hardware.

Nevertheless, when it comes to the realization of scalable quantum computers, qubit decoherence times may very well be eclipsed in importance by another time span: 20 years, the length of time for which patents are valid (in the US this can include software algorithms).

With D-Wave and Google leading the way, we may be getting there faster than most industry experts predicted. Certainly the odds are very high that it won't take another two decades for usable universal QC machines to be built.

But how do we get to the point of bootstrapping a new quantum technology industry? DK Matai addressed this in a recent blog post, and identified five key questions, which I attempt to address below (I took the liberty of slightly abbreviating the questions, please check at the link for the unabridged version).

The challenges DK laid out will require much more than a blog post (or a LinkedIn comment that I recycled here), especially since his view is wider than only Quantum Information science. That is why the following thoughts are by no means comprehensive answers, and very much incomplete, but they may provide a starting point.

1. How do we prioritise the commercialisation of critical Quantum Technology 2.0 components, networks and entire systems both nationally and internationally?

The prioritization should be based on disruptive potential: take quantum cryptography versus quantum computing, for example. Quantum encryption could stamp out fraud that exploits certain technical weaknesses, but it won't address the more dominant social-engineering deceptions. On the upside, it will also facilitate iron-clad cryptocurrencies. Yet, if Feynman's vision of the universal quantum simulator comes to fruition, we will be able to tackle collective quantum dynamics that are computationally intractable for conventional computers. This encompasses everything from simulating high-temperature superconductivity to complex (bio-)chemical dynamics. ETH's Matthias Troyer gave an excellent overview of these killer apps for quantum computing in his recent Google talk; I especially like his example of nitrogen fixation. Nature manages to accomplish this with minimal energy expenditure in some bacteria, but industrially we only have the century-old Haber-Bosch process, which in modern plants still results in half a ton of CO2 for each ton of NH3. If we could simulate and understand the chemical pathway that these bacteria follow, we could eliminate one of the major industrial sources of carbon dioxide.

2. Which financial, technological, scientific, industrial and infrastructure partners are the ideal co-creators to invent, to innovate and to deploy new Quantum technologies on a medium to large scale around the world? 

This will vary drastically by technology. To pick a basic example, a quantum clock per se is just a better clock, but put it into a Galileo/GPS satellite and the dramatic improvement in timekeeping will immediately translate into higher location-triangulation accuracy, as well as allow for a better mapping of the earth's gravitational field/mass distribution.

3. What is the process to prioritise investment, marketing and sales in Quantum Technologies to create the billion dollar “killer apps”?

As sketched out above, the real prize to me is universal quantum computation/simulation. Considerable efforts have to go into building such machines, but that doesn't mean that you cannot already start to develop software for them. Any coding for new quantum platforms, even those already here (as in the case of the D-Wave 2), will involve emulators on classical hardware, because you want to debug and prove out your code before submitting it to the more expensive quantum resource. To my mind, building such an environment in a collaborative fashion to showcase and develop quantum algorithms should be the first step. This appears feasible on an accelerated timescale (months rather than years). I think such an effort is critical to offset the closed-source and tightly license-controlled approach that, for instance, Microsoft is following with its development of the LIQUi|> platform.
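To give a flavor of how little it takes to get started, here is a minimal state-vector emulator sketch in Python/numpy (my own toy illustration, not tied to any of the platforms mentioned here): it prepares a Bell state with a Hadamard and a CNOT, exactly the kind of small circuit you would want to debug classically before burning time on real quantum hardware.

```python
import numpy as np

# Minimal state-vector emulator: a 2-qubit register is just 4 complex amplitudes.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)            # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                          # control = first qubit
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                          # start in |00>

state = np.kron(H, I) @ state                           # Hadamard on qubit 0
state = CNOT @ state                                    # entangle -> Bell state

print(np.abs(state)**2)                                 # -> [0.5 0. 0. 0.5]
```

The measurement statistics (perfectly correlated |00> and |11> outcomes) drop out for free, which is exactly why such emulators are invaluable for debugging, and also why they cannot scale: the state vector doubles with every added qubit.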

4. How do the government agencies, funding Quantum Tech 2.0 Research and Development in the hundreds of millions each year, see further light so that funding can be directed to specific commercial goals with specific commercial end users in mind?

This, to me, seems to be the biggest challenge. The amount of research papers produced in this field is enormous. Much of it is computational theory. While the theory has its merits, I think governmental funding should emphasize programs that have a clearly defined agenda towards ambitious yet attainable goals, i.e. research that will result in actual hardware and/or commercially applicable software implementations (e.g. the UCSB Martinis agenda). Yet, governments shouldn't be in the position to pick a winning design, as was inadvertently done in fusion research, where ITER's funding requirements are now crowding out all other approaches. The latter is a template for how not to go about it.

5. How to create an International Quantum Tech 2.0 Super Exchange that brings together all the global centres of excellence, as well as all the relevant financiers, customers and commercial partners to create Quantum “Killer Apps”?

On a grassroots level, I think open source initiatives (e.g. a LIQUi|> alternative) could become catalysts that bring academic centers of excellence and commercial players into alignment. This at least is my impression, based on conversations with several people involved in the commercial and academic realms. On the other hand, as with any open source product, commercialization won't be easy, yet this may be less of a concern in this emerging industry, as the IP will be in the quantum algorithms, and they will most likely be executed with quantum resources tied to a SaaS offering.


Quantum Computing Coming of Age

Are We There Yet? That's the name of the talk that Daniel Lidar recently gave at Google (h/t Sol Warda who posted this in a previous comment).

Spoiler alert, I will summarize some of the most interesting aspects of this talk as I finally found the time to watch it in its entirety.

You may skip the first 15 minutes if you follow this blog, as he just gives a quick introduction to QC. Actually, if you follow the discussions on this blog closely, you will not find much news in most of the presentation until the very end, but I very much appreciated the graph 8 minutes in, which is based on this Intel data:

Performance and clock speeds have been essentially flat for the last ten years. Only the ability to squeeze more transistors and cores into one chip keeps Moore's law alive (data source: Intel Corp.).

Daniel, deservedly, spends quite some time on this, to drive home the point that classical chips have hit a wall. Moving from silicon to germanium will only go so far in delaying the inevitable.

If you don't want to sit through the entire talk, I recommend skipping ahead to the 48-minute mark, when error correction on the D-Wave is discussed. The results are very encouraging, and in the Q&A Daniel points out that this scheme could be inherently incorporated into the D-Wave design. I wouldn't be surprised to see this happen fairly soon. The details of the ECC scheme are available on the arXiv, and Daniel spends some time on the graph shown below. He points out that, to the extent that you can infer a slope, it looks very promising, as it gets flatter as the problems get harder, and the gap between the non-ECC and the error-corrected annealing widens (solid vs. dashed lines). With ECC I would therefore expect D-Wave machines to systematically outperform simulated annealing.

Number of repetitions to find a solution at least once.
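As an aside on how to read the y-axis: assuming the plotted quantity follows the usual benchmarking convention (the arXiv paper spells out the exact definition), a single annealing run that finds the ground state with probability p must be repeated

$$
R \;=\; \left\lceil \frac{\ln(1 - 0.99)}{\ln(1 - p)} \right\rceil
$$

times to see the solution at least once with 99% confidence, which is why modest gains in per-run success probability translate into large drops in the repetition count.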

Daniel sums up the talk like this:

  1. Is the D-Wave device a quantum annealer?
    • It disagrees with all classical models proposed so far. It also exhibits entanglement. (I.e., yes, as far as we can tell.)
  2. Does it implement a programmable Ising model in a transverse field and solve optimization problems as promised?
    • Yes
  3. Is there a quantum speedup?
    • Too early to tell
  4. Can we error-correct it and improve its performance?
    • Yes

With regard to hardware-implemented qubit ECC, we also got some great news from Martinis' UCSB lab, which Google drafted for their quantum chip. The latest results have just been published in Nature (pre-print available on the arXiv).

Martinis explained the concept in a talk I previously reported on, and clearly the work is progressing nicely. Unlike the ECC scheme for the D-Wave architecture, Martinis' approach targets a fidelity that will not only work for quantum annealing, but should also allow for non-trivial gate-computing circuit sizes.

Quantum Computing may not have fully arrived yet, but after decades of research we clearly are finally entering the stage where this technology won't be just the domain of theorists and research labs, and at this time, D-Wave is poised to take the lead.



The Year That Was <insert expletive of your choice>

Usually, I like to start a new year on an upbeat note, but this time I just cannot find the right fit. I was considering whether to revisit technology that can clean water - lauding the effort of the Bill Gates foundation came to mind, but while I think this is a great step in the right direction, this water reclaiming technology is still a bit too complex and expensive to become truly transformational and liberating.

At other times, groundbreaking progress in increasing the efficiency of solar energy would have qualified, the key being that this can be done comparatively cheaply. Alas, the unprecedented drop in the price of oil is not only killing off the fracking industry, but also the economics of alternative energy. For a planet that has had its fill of CO2, fossil fuel this cheap is nothing but an unmitigated disaster.

So while it was a banner year for quantum computing, in many respects 2014 was utterly dismal, seeing the return of religiously motivated genocide, open warfare in Europe, a resurgence of diseases that could have been eradicated by now, and a pandemic that caused knee-jerk hysterical reactions and taught us how unprepared we are for these kinds of health emergencies. This year was so depressing it makes me want to wail along to my favorite science blogger's song about it (but then again I'd completely ruin it).

And there is another reason to not yet let go of the past: corrections.

With these corrections out of the way, I will finally let go of 2014, but not without the additional observation that, in the world of quantum computing, the new year started very much in the same vein as the old: generating positive business news for D-Wave, which managed to raise another 29 million dollars, while still not getting respect from some academic QC researchers.

I.H. Deutsch (please note, not the Deutsch but Ivan) states at the end of this interview:

  1. The D-Wave prototype is not a universal quantum computer.
  2. It is not digital, nor error-correcting, nor fault tolerant.
  3. It is a purely analog machine designed to solve a particular optimization problem.
  4. It is unclear if it qualifies as a quantum device.

No issues with points 1-3. But how many times do classical algos have to be ruled out before D-Wave is finally universally accepted as a quantum annealing machine? This is getting into climate-change-denial territory. It shouldn't really be that hard to define what makes for quantum computation. So I guess we have found a new candidate for D-Wave chief critic, now that Scott Aaronson seems to have stepped down for good.

Then again, with a last name like Deutsch, you may have to step up your game to get some name recognition of your own in this field.  And there's no doubt that controversy works.

So 2015 is shaping up to become yet another riveting year for QC news. And just in case you made the resolution that, this year, you will finally try to catch that rainbow, there's some new tech for you.



Update: I almost forgot about this epic fail of popular science reporting at the tail end of 2014. For now I leave it as an exercise to the reader to spot everything that's wrong with it. Of course, most of the blame belongs to PLoS ONE, which supposedly practices peer review.

The Race Towards Universal Quantum Computing – Lost in Confusion

If headlines and news articles were all you had to go by when trying to form an opinion about quantum computing, you'd end up with one enormous migraine. For many years now, they have created a constant barrage of conflicting story lines.


For reasons known only to them, science news authors seem to have collectively decided to ignore that there are many competing approaches to quantum computing. This apparent inability to differentiate between architectures and computational models is a constant source of confusion, which is then compounded by the challenge of explaining the conceptual oddities of quantum computing, such as entanglement.

For instance, most authors, even if they may already know this is wrong, run with the simplest trope about quantum computing, which has been repeated ad nauseam: the pretense that these machines can execute every possible calculation within their input scope in parallel. It is hard to imagine a misconception better designed to put up a goalpost that no man-made machine could ever reach. Scott Aaronson is so incensed by this nonsense that it even inspired the title of his new book. It is truly a sorry state of affairs when even Nature apparently cannot find an author who doesn't fall for it. Elizabeth Gibney's recent online piece on quantum computing was yet another case in point. It starts off promising, as the subtitle is spot on:

After a 30-year struggle to harness quantum weirdness for computing, physicists finally have their goal in reach.

But then the reader's mind is again poisoned with this nonsense:

Where a classical computer has to try each combination in turn, a quantum computer could process all those combinations simultaneously — in effect, carrying out calculations on every possible set of input data in parallel.

Part of the problem is that there is no other easy concept that a news author can quickly turn to when trying to offer up an explanation that a casual reader can understand while having his mind blown. ('Wow, every possible combination at the same time!' It's like double rainbow all over again.)

Here's my attempt to remedy this situation: a simple example to illustrate the extended capabilities of quantum computing versus classical machines. The latter are very fast, but when solving a complex puzzle, e.g. finding the lowest number in an unordered list, they have to take one stab at it at a time. It is like attacking an abstract problem-space the way ancient mariners had to fathom the depth of the sea. (Gauging the depth with a rope in this manner is the original meaning of the word 'fathom'.)

You may argue that having several guys fathoming at the same time will give you a 'parallelizing' speed-up, but you would have to be a Luddite to the core to convince yourself that this could ever measure up to echolocation. Just like the latter can perceive data from a larger patch of seafloor, quantum computing can leverage more than just local point data. But this comes at a price: The signal that comes back is not easy to interpret. It depends on the original set-up of the probing signal, and requires subsequent processing.

Like an echolocation system, a quantum computer doesn't magically probe the entire configuration space. It 'sees' more, but it doesn't provide this information in an immediately useful format.

The real challenge is to construct the process in a way that allows you to actually get the answer to the computational problem you are trying to solve. This is devilishly difficult, which is why there are so few quantum algorithms in existence. There are no simple rules to follow; creating one requires first and foremost inspiration, and is as much art as science. That is why, when I learned how Shor's algorithm worked, I was profoundly astounded and awed by the inordinate creativity it must have taken to think it up.
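To make the echolocation analogy a bit more tangible, here is a toy numpy sketch of Grover-style amplitude amplification, the canonical algorithm for the kind of unstructured search mentioned above (my own illustration, not production code from any QC platform). The uniform superposition is the 'probing signal', and the repeated oracle-plus-diffusion rounds are the processing that makes the echo readable:

```python
import numpy as np

# Toy Grover search over N = 2**n items for a single marked index.
n, target = 4, 11
N = 2**n

# Probing signal: uniform superposition over all indices.
state = np.full(N, 1 / np.sqrt(N))

# The optimal number of rounds scales as (pi/4) * sqrt(N).
for _ in range(int(np.pi / 4 * np.sqrt(N))):
    state[target] *= -1                 # oracle: flip the marked amplitude
    state = 2 * state.mean() - state    # diffusion: inversion about the mean

# Reading the echo: measurement now almost certainly yields the target.
probs = state**2
print(np.argmax(probs), round(probs[target], 2))  # -> 11 0.96
```

Note that the speed-up is only quadratic, about sqrt(N) rounds instead of N/2 classical probes on average, which is exactly why the 'tries everything in parallel' trope is so misleading.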

Regardless, if this were the only problem with Elizabeth Gibney's article, that would just be par for the course. Yet, while reporting on Google's efforts to build their own quantum computing chip, she manages to not even mention the other quantum computer Google is involved with, despite D-Wave publishing in Nature in 2011 and, just last year, in Nature Communications.

Maybe if she hadn't completely ignored D-Wave, she might have thought to ask Martinis the most pressing question of all: what kind of chip will he build for Google? Everything indicates that it is yet another quantum annealer, but the quotes in the article make it sound as if he were talking about gate computing:

“It is still possible that nature just won't allow it to work, but I think we have a decent chance.”

Obviously he cannot possibly be referring to quantum annealing in this context, since that clearly works just fine with fairly large numbers of qubits (as shown in the above-mentioned Nature publication).

The current state of news reporting on quantum computing is beyond frustrating. There is a very real and fascinating race underway for the realization of the first commercially useful universal quantum computer. Will it be adiabatic or the gate model? Are quantum cellular automata still in the running?

But of course in order to report on this, you must first know about these differences. Apparently, when it comes to science news reporting, this is just too much to expect.

The Nature article also contains this little piece of information:

... the best quantum computers in the world are barely able to do school-level problems such as finding the prime factors of the number 21. (Answer: 3 and 7.)

I guess the fact that the answer is provided gives us a hint as to what level of sophistication the author expects from her audience, which in turn must be terribly confused to see a headline such as "New largest number factored on a quantum device is 56,153".

This was of course not done with Shor's algorithm but via adiabatic computing (and also involves some sleight of hand, as the algorithm only works for a certain class of numbers and not all integers).
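For the curious, the basic adiabatic factoring trick, as I understand it, is to recast factoring as a minimization problem an annealer can chew on, along the lines of

$$
f(p, q) \;=\; \bigl(N - p\,q\bigr)^{2},
$$

which is zero exactly when p and q are factors of N. The record-setting demonstrations additionally exploit structure in the binary representations of the factors to shrink the qubit count, which is precisely why only numbers of a special form qualify.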

Nevertheless, adiabatic computing seems to have the upper hand when it comes to scaling the problem scope with a limited number of qubits. But the gate model also made some major news last month. Simon's algorithm, the field's guinea pig (one of the first algorithms you will learn when being introduced to the field), has been demonstrated to provide the theoretically predicted quantum speed-up. This is huge news that was immediately translated into the rather misleading headline "Simon's algorithm run on quantum computer for the first time—faster than on standard computer".

Faster in this case means fewer processing iterations rather than less elapsed time, but irrespective of that, having this theoretical prediction confirmed using the fairly recent one-way technique clearly bolsters the case that gate computing can deliver the goods.
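To spell out what 'fewer iterations' means: Simon's problem asks for the hidden period s of a function satisfying f(x) = f(x ⊕ s), and the gap is one of query complexity,

$$
\text{classical: } \Omega\bigl(2^{n/2}\bigr)\ \text{queries} \qquad \text{vs.} \qquad \text{quantum: } O(n)\ \text{queries},
$$

since each quantum query yields a random y with y · s ≡ 0 (mod 2), and roughly n such equations pin down s via linear algebra over GF(2).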

No doubt, the race between the  architectures to deliver the first commercial-grade universal quantum computer is on.  It is still wide open, and makes for a compelling story. Now, if we could only get somebody to properly report on it.



Progressing from the God Particle to the Gay Particle

… and other physics and QC news

The ‘god particle’, aka the Higgs boson, received a lot of attention, not that this wasn't warranted, but I can't help but suspect that the justification of the CERN budget is partly to blame for the media frenzy. The gay particle, on the other hand, is no less spectacular, especially since its theoretical prediction pre-dates the Higgs boson by far. Of course, what has been discovered is, yet again, not a real particle but ‘only’ a pseudo-particle, similar to the magnetic monopole that has been touted recently. And as usual, most pop-science write-ups fail entirely to remark on this rather fundamental aspect (apparently the journalists don't want to bother their audience with such boring details). In case you want to get a more complete picture, this colloquium paper gives you an in-depth overview.

On the other hand, a pseudo-particle quantum excitation in a 2d superconductor is exactly what the doctor ordered for topological quantum computing, a field that has seen tremendous theoretical progress, generously sponsored by Microsoft. This research entirely hinges on employing these anyon pseudo-particles as a hardware resource, because they have the fantastic property of allowing for inherently decoherence-resistant qubits. It is as if theoretical computer science had started writing the first operating system in the roaring twenties of the last century, long before there was a computer or even a transistor, theorizing that a band gap in doped semiconductors should make it possible to build one. If this analogy were to hold, we'd now be at the stage where a band gap has been demonstrated for the first time. So here's to hoping this means we may see the first anyon-based qubit within the decade.

In the here and now of quantum computing, D-Wave merrily stays the course despite the recent Google bombshell news. It has been reported that they now have 12 machines operational, used in a hosted manner by their strategic partners (such as 1Qbit). They also continue to add staff from other superconducting outfits; recently, for example, Bill Blake left Cray to join the company as VP of R&D.

Last but not least, if you are interested in physics you would have to live under a rock not to have heard the sensational news that numerical calculations presumably proved that black holes cannot form and hence do not exist. Sabine Hossenfelder nicely deconstructs this. The long and short of it is that this argument has been going on for a long time, that the equations employed in this research have some counter-intuitive properties, and that the mass integral employed is not all that well-motivated.

Einstein would be happy if this pans out; after all, this research claims to succeed where he failed. But the critical reception of this numerical model has just begun, and it may very well be torn apart like an unlucky astronaut in a strongly inhomogeneous gravitational field.

This concludes another quick round-up post. I am traveling this week and couldn't make the time for a longer article, but I should find my way back to a more regular posting schedule next week.

What Defines a Quantum Computer?

Could run Minecraft, but you'd have to be comfortable with getting your blocks as binary strings.

Recently a friend of mine observed in an email discussion "I must admit I find it a little difficult to keep up with the various definitions of quantum computing."

A healthy sign of enlightened confusion, because this already sets him apart from most people, who have yet to learn about this field and at best think that all quantum computers are more or less equivalent.

As computers became an integral part of people's everyday lives, they essentially learned the truth of Turing completeness, even if they have never heard the term. Now, even a child exposed to various computing devices will quickly develop a sense that whatever one computer can do, another should be able to perform as well, with some allowance for the performance specs of the machine. Older, more limited machines may not be able to run current software for compatibility or memory scaling reasons, but there is no difference in principle that would prevent any computer from executing whatever has already been proven to work on another machine.

In the quantum computing domain, things are less clear cut. In my earlier post, where I tried my hand at a quantum computing taxonomy, I focused on the maturity of the technology, less so on the underlying theoretical model. However, it is the dichotomy in the latter that has been driving the heated controversy over D-Wave's quantumness.

When David Deutsch wrote his seminal paper, he followed in Turing's footsteps, thinking through the consequences of putting a Turing machine into quantum superposition. This line of inquiry eventually gave rise to the popular gate model of quantum computing.

D-Wave, on the other hand, gambled on adiabatic quantum computing, and more specifically, an implementation of quantum annealing. In preparation for this post I sought to look up these terms in my copy of Nielsen and Chuang's 'Quantum Computation and Quantum Information' textbook. To my surprise, neither term can be found in the index, and this is the 2010 anniversary edition. Now, this is not meant to knock the book; if you want to learn about the gate model, I don't think you will find a better one. It just goes to show that neither the adiabatic nor the annealing approach was on the academic radar when the book was originally written - the first paper on adiabatic quantum computation (Farhi et al.) was published the same year as the first edition of this standard QIS textbook.

At the time it was not clear how the computational power of the adiabatic approach compared to the quantum gate model. Within a year, Vazirani et al. published a paper that showed that Grover search can be implemented on this architecture with quantum speed-up. And although the notoriety of Shor's algorithm overshadows Grover's, the latter arguably has much more widespread technological potential. The Vazirani et al. paper also demonstrated that there will be problem instances that this QC model will not be able to solve efficiently, even though they can be tackled classically.

In 2004 a paper was submitted with a title that neatly sums it up: "Adiabatic Quantum Computation is Equivalent to Standard Quantum Computation" (Lloyd et al.)

If D-Wave had aimed for universal adiabatic quantum computation, maybe it would not have experienced quite as much academic push-back, but they pragmatically went after the lower-hanging fruit, i.e. quantum annealing. (Notwithstanding, this doesn't stop MIT's Seth Lloyd from claiming that the company uses his ideas when pitching his own QC venture.)

An adiabatic quantum computing algorithm encodes a problem into a cost function, in this case an energy function, which is then explored for its absolute minimum. For instance, if you try to solve the traveling salesman problem, your cost function would simply be the distance traveled for each itinerary. A simple classical gradient descent algorithm over this energy 'landscape' will quickly get stuck in a local minimum (for an analogy, think of balls rolling down a hilly landscape and collecting at some bottom close to where they started, and you get the idea). A truly quantum algorithm, on the other hand, can exploit 'spooky' quantum properties, such as entanglement and the tunnel effect. In essence, it is as if our rolling balls could somehow sense that there is a deeper valley adjacent to their resting place and "tunnel through" the barrier (hence the name). This gives these algorithms some spread-out look-ahead capabilities. But depending on your energy function, this may still not be enough.
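To illustrate the rolling-ball picture, here is a tiny Python sketch with a completely made-up toy landscape (in the same spirit as the graph below): a greedy descent that only sees its immediate neighbours settles into the nearest dip and never finds the deeper valley.

```python
import numpy as np

# Made-up 1-d energy landscape: a shallow local minimum near x ~ 3.1
# hides the deeper global minima near x ~ +/-1.04.
def energy(x):
    return 0.1 * x**2 + 2 * np.cos(3 * x)

xs = np.linspace(-4, 4, 801)          # grid with 0.01 spacing

i = 650                               # start the "ball" at x = 2.5
while True:
    left, right = energy(xs[i - 1]), energy(xs[i + 1])
    if min(left, right) >= energy(xs[i]):
        break                         # stuck: no downhill neighbour left
    i = i - 1 if left < right else i + 1

print("greedy descent stuck at x =", round(xs[i], 2))                  # ~ 3.11
print("global minimum near    x =", round(xs[np.argmin(energy(xs))], 2))
```

Quantum tunneling amounts to the ball occasionally 'testing' configurations beyond the nearest barrier, though, as argued below, with only a locally limited reach.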

The graph below illustrates this with a completely made-up cost function that, while entirely oversimplified, hopefully still somewhat captures the nature of the problem. To the extent that the look-ahead capabilities of an adiabatic algorithm are still locally limited, long flat stretches with a relative minimum (a 'plain' in the energy landscape) can still defeat it. I threw in some arbitrary Bell curves as a stand-in for this local quantum 'fuzziness' (the latter, incidentally, is the correct translation of what Heisenberg called his famous relation).

To the left, this fuzzy width doesn't stretch outside the bounds of the flat stretch (or rather, it is negligibly small outside any meaningful neighborhood of this local minimum).

On the other hand, further to the right, the quantum fuzziness around the local minimum closest to the absolute one overlaps nicely with the latter (overlaid with the bell curve in green). This is where the algorithm will perform well.

D-Wave essentially performs such an algorithm, with the caveat that it does not allow completely arbitrary energy functions, but only those that can be shoe-horned into the Ising model.

This was a smart, pragmatic decision on their part, because this model was originally created to describe solid-state magnets, imagined as little coupled elementary magnetic dipoles, and the latter map perfectly onto the superconducting magnetic fluxes that are implemented on the chip.

In terms of complexity, even a simple classical 2-d toy model allows for a pretty staggering number of possible combinations, as the video below nicely demonstrates. The corresponding energy function (the Hamiltonian in QM) is surprisingly versatile and can encode a large variety of problems.
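For reference, that energy function has a deceptively simple form,

$$
H(s_1, \dots, s_n) \;=\; \sum_{i} h_i\, s_i \;+\; \sum_{i<j} J_{ij}\, s_i s_j, \qquad s_i \in \{-1, +1\},
$$

and programming the machine amounts to choosing the local fields h_i and the couplings J_ij so that the lowest-energy spin configuration spells out the answer to the problem you actually care about.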