Progressing from the God Particle to the Gay Particle

… and other physics and QC news

The ‘god particle’, aka the Higgs boson, received a lot of attention. Not that this wasn’t warranted, but I can’t help suspecting that the need to justify CERN’s budget is partly to blame for the media frenzy.  The gay particle, on the other hand, is no less spectacular, especially since its theoretical prediction by far pre-dates the Higgs discovery.  Of course, what has been detected is, yet again, not a real particle but ‘only’ a pseudo-particle, similar to the magnetic monopole that has been touted recently.  And as usual, most pop-science write-ups fail entirely to remark on this rather fundamental aspect (apparently the journalists don’t want to bother their audience with such boring details). In case you want a more complete picture, this colloquium paper gives you an in-depth overview.

On the other hand, a pseudo-particle quantum excitation in a 2d superconductor is exactly what the doctor ordered for topological quantum computing, a field that has seen tremendous theoretical progress, generously sponsored by Microsoft. This research entirely hinges on employing these anyon pseudo-particles as a hardware resource, because they have the fantastic property of allowing for inherently decoherence-resistant qubits.  It is as if theoretical computer scientists had started writing the first operating system in the roaring twenties of the last century, long before there was a computer or even a transistor, theorizing that a band gap in doped semiconductors should make it possible to build one. If the analogy were to hold, we’d now be at the stage where a band gap has been demonstrated for the first time.  So here’s to hoping this means we may see the first anyon-based qubit within the decade.
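For readers wondering what makes anyons special in the first place: in three dimensions, swapping two identical particles can only multiply the wavefunction by +1 (bosons) or -1 (fermions), whereas in two dimensions any exchange phase is allowed, hence the name. A toy sketch of this (my own illustration, not taken from any of the linked research):

```python
import cmath

def exchange_phase(theta):
    """Phase factor picked up by a two-particle wavefunction when the
    two identical particles are swapped: e^(i*theta)."""
    return cmath.exp(1j * theta)

# In 3d only two values are possible:
boson = exchange_phase(0)          # +1
fermion = exchange_phase(cmath.pi) # -1

# In 2d, any theta in between is allowed -- an (abelian) anyon:
anyon = exchange_phase(cmath.pi / 4)

# Swapping twice is not the identity for an anyon, unlike for
# bosons and fermions -- the braiding history is remembered,
# which is what topological quantum computing exploits.
double_swap = anyon ** 2
```

The non-abelian anyons relevant for computing go further: exchanging them applies a matrix, not just a phase, so braiding world-lines enacts quantum gates.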

In the here and now of quantum computing, D-Wave merrily stays the course despite the recent Google bombshell.  It has been reported that the company now has 12 machines operational, used in a hosted manner by its strategic partners (such as 1Qbit).  It also continues to add staff from other high-performance computing outfits; most recently Bill Blake left Cray to join the company as VP of R&D.

Last but not least, if you are interested in physics you would have to live under a rock not to have heard the sensational news that numerical calculations supposedly proved that black holes cannot form and hence do not exist.  Sabine Hossenfelder nicely deconstructs this claim.  The long and short of it: this argument has been going on for a long time, the equations employed in the research have some counter-intuitive properties, and the mass integral employed is not all that well-motivated.

Einstein would be happy if this pans out; after all, the research claims to succeed where he failed. But the critical reception of this numerical model has only just begun, and it may very well be torn apart like an unlucky astronaut in a strongly inhomogeneous gravitational field.

This concludes another quick round-up post. I am traveling this week and couldn’t make the time for a longer article, but I should find my way back to a more regular posting schedule next week.

8 thoughts on “Progressing from the God Particle to the Gay Particle”

    1. Things are really getting real much quicker than expected, aren’t they? Topological computing was always way out there: lots of good theory, but the essential building block was missing. It’s still a long way off, but now it’s something that feels much more tangible.

        1. In isolation these are great achievements, but I just don’t see how they are going to scale this up (i.e. getting several qubits across a chip to interact in a controlled fashion). That’s why I am still not very bullish on quantum dots for QC.

    1. Good question. The way I think about it, this is just another way of sampling, which behaves slightly differently than in classical systems. In my view the differences arise from the fact that classical probability can be understood as a subset of the broader quantum one. That’s why I always regarded weak measurements as not particularly interesting.

      At first glance the weak value argument in the first paper seems sound, and I think they really pinpoint a mistake there, but I don’t think that mistake applies to most of the sampling that goes under the name ‘weak measurements’ (I may be wrong, though, as I really haven’t paid much attention to this field).
