All posts by Henning Dekant

Transmuting Waste and Worries Away

The philosopher's stone yearned for by alchemists was actually not a stone.

Alchemists of old yearned for the philosopher's stone, a substance of magical quality that would allow them to transmute other elements into gold.  Nowadays, such a mythical device would be even more valuable, since its transmuting power could be harnessed to transform long-lasting nuclear waste, if not into gold, then at least into less dangerous isotopes.

Scientists at the Belgian nuclear research center SCK CEN in Mol are working to accomplish exactly that. Lacking a philosopher's stone, they are deploying the Swiss army knife of contemporary physics: a particle accelerator that creates fast neutrons for the treatment of problematic nuclear waste.

A modern version of the philosopher's stone: the MYRRHA (Multi-purpose hYbrid Research Reactor for High-tech Applications) concept allows for industrial-scale treatment of nuclear waste. Fast neutrons are produced via nuclear spallation.

By capturing these neutrons, the waste can be transmuted into fission products with much shorter half-lives. The beauty of this concept is that the nuclear waste simultaneously serves as fuel in a sub-critical reactor.  A recent paper on this concept can be found here, and there is also an excellent presentation online (which contains the MYRRHA diagram and radiotoxicity graph).

Not only does this technology allow us to get rid of extremely worrisome long-lasting radioactive material, but as a nice side effect it will also alleviate our energy security worries, as this design is well suited to using thorium as fuel.

The following graph illustrates how dramatically this could alter our nuclear waste problem (note: the x-axis is logarithmic).

The required period of safe storage for nuclear waste could be drastically shortened from about 200,000 years to a mere 200 if it is processed in a suitable spallation reactor.

This flies in the face of conventional wisdom, as well as a world increasingly turned off by conventional nuclear energy in the wake of its latest catastrophe in Fukushima. After all, this new waste treatment still requires a nuclear reactor, and given the track record of this industry, is it a risk worth taking?

To answer this, one must take into consideration that the idea of using a particle accelerator as a neutron source for a nuclear reactor is actually quite old and significantly predates the recent research in Mol.  I first encountered the concept when preparing a presentation for nuclear physics 101 almost twenty years ago.  My subject was differing approaches to inherently safe reactor designs, "safe" in this context being defined as the inability of the reactor to engage in a runaway nuclear chain reaction. (The treatment of nuclear waste was not even on the horizon at that point, because the reprocessing necessary to separate the waste material from the depleted fuel rods did not exist.)

The ratio of surface to volume is key in determining whether a neutron triggers enough follow-up reactions to sustain a critical cascading chain reaction.

The idea is simple: design the reactor geometry in such a way that neutrons produced by the reaction escape the reactor vessel before they have the opportunity to spawn more neutrons in follow-up reactions. Then make up the balance by providing enough neutrons from an accelerator-driven reaction to sustain the nuclear fission process.  Once you pull the plug on the accelerator, the fission reaction cannot sustain itself.  Compare this with the situation in Fukushima or Chernobyl.  The latter was a classic runaway fission chain reaction, and the biggest problem at Fukushima was that melted-down fuel can become critical again (there is some indication that this may actually have happened to some degree).
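
To make the "pull the plug" argument a bit more concrete, here is a back-of-the-envelope relation (my own illustrative simplification, not taken from the MYRRHA design documents): treat each fission generation as multiplying the neutron population by an effective factor $ k_{eff} < 1 $, while the accelerator injects $ S $ neutrons per generation,

$ N_{n+1} = k_{eff} N_n + S \quad \Rightarrow \quad N_\infty = \frac{S}{1-k_{eff}} $

Switch the accelerator off ($ S = 0 $) and the population simply dies away as $ N_n = k_{eff}^n N_0 \rightarrow 0 $ – there is nothing left to run away.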

Will an inherently safe fission reactor design, one that can melt away the stockpiles of most long-lasting nuclear waste into something that only needs to be kept in safe storage for a few hundred years, sway the environmentally motivated opposition to nuclear technology? Doubtful.  Many will argue that this is too good to be true and point to the fly in the ointment: reprocessing is essential to make this happen, and doing it on an increased industrial scale will make accidental releases more likely.  Then there will be the talking point that the nuclear industry will simply use this as an opportunity to establish a plutonium fuel cycle (one of the purposes for which reprocessing technology was originally developed).  Not to mention the strongly entrenched ideological notion, especially in some European countries, with Germany topping the list, that humanity is simply too immature to handle this kind of technology. In a way, this is the temporal analog of the Not In My Back-Yard (NIMBY) attitude; maybe it should be called the NIMLT principle, as in Not In My LifeTime, let future generations deal with it.

Do you think a core catcher would have helped in Fukushima?

Of course, the nuclear industry did little to earn any trust, considering what transpired in the wake of the Chernobyl disaster.  In a pathetic display of a "lesson learned", it added "core catchers" to existing blueprints rather than invest R&D dollars into an inherently safe design. Instead of fixing the underlying problem, it simply tried to make the worst imaginable accident more manageable.  It was as if a car manufacturer whose vehicles rarely, but occasionally, explode tried to improve the situation in the next model line by adding a bigger fire extinguisher.

In 2006 France changed its laws and regulations in anticipation of this new technology, and now requires that nuclear waste storage sites remain accessible for at least a hundred years so that the waste can be reclaimed. To me this seems eminently reasonable and pragmatic. The notion that we could safely compartmentalize dangerous nuclear waste for many thousands of years and guarantee that it would stay out of the biosphere always struck me as hubris. This technology offers a glimpse at a possible future where humanity may finally gain the capability to clean up its nuclear messes.

Strong words on Weak Measurements

Even the greatest ships can get it wrong.

Recently the news made the rounds that the Heisenberg uncertainty principle had supposedly been violated (apparently BBC online news is trying to build on its bad track record for news related to quantum mechanics).

Of course this is utter and complete nonsense, and the underlying papers are quite mundane. But all caveats get stripped out in the reporting until only the wrong sensational twist remains.

Yes, Heisenberg did at some point speculate that the uncertainty relationship might be due to the measurement disturbing the system being probed, but this idea has long been relegated to the dust bin of science history, and Robert R. Tucci deservedly demolishes it.

I am now breathlessly awaiting the results of this crack team revisiting the Inquisition’s test for witchcraft by attempted drowning.  I bet they’ll find this doesn’t hold up either. Cutting edge science.


Order from Quantum Discord

Have they no shame?
Even our children are already indoctrinated by popular Disney cartoons to think that Discord is bad. Borivoje Dakić et al. beg to differ.

Conventional hardware thrives on our ability to create precision structures in the micro domain.  Computers are highly ordered and usually (for good reason) regarded as perfectly deterministic in the way that they process information. After all, the error rate of the modern computer is astronomically low.

Even before the advent of quantum computing, it was discovered that this defining feature of our hardware can sometimes be a disadvantage: randomized algorithms can sometimes outperform deterministic ones. Computers actually gain functionality by being able to use randomness as an information processing resource.
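
For readers who want to see this in action, here is a minimal sketch (my own toy example, not taken from the post's references) of Freivalds' algorithm, which checks a matrix product probabilistically at O(n²) cost per trial, far cheaper than the obvious deterministic check of recomputing the product:

```python
import random

def freivalds(A, B, C, trials=20):
    """Probabilistically check whether A @ B == C.

    Each trial multiplies by a random 0/1 vector, costing O(n^2)
    instead of the O(n^3) needed to recompute A @ B outright.
    A wrong C slips through a single trial with probability <= 1/2,
    so 'trials' rounds push the error probability below 2**-trials.
    """
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        Br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        ABr = [sum(A[i][j] * Br[j] for j in range(n)) for i in range(n)]
        Cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if ABr != Cr:
            return False   # definitely not equal
    return True            # equal with high probability

A = [[2, 3], [3, 4]]
B = [[1, 0], [1, 2]]
C = [[5, 6], [7, 8]]       # the correct product of A and B
print(freivalds(A, B, C))  # True
```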

Due to the fundamentally probabilistic nature of quantum mechanics, randomness is always inherent in quantum computing designs. On the other hand, most quantum computing algorithms exploit one of the most fragile, ordered physical states: entanglement, the peculiar quantum mechanical phenomenon whereby two systems can be entwined in a common quantum state. It is characterized by perfect correlation of spatially or temporally separated measurements.  The simplest protocol to exploit this feature is the quantum information channel, and it results in some quite surprising and, as is so often the case with quantum mechanics, counter-intuitive results. For instance, if two parties are connected via two very noisy directional channels with zero quantum information capacity, the participants will still be able to establish a qubit flow via entanglement distillation.
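
As a small illustration of that perfect correlation (a toy simulation I put together, not part of the original post), sampling both qubits of a Bell state in the same basis always yields matching outcomes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Bell state |Phi+> = (|00> + |11>) / sqrt(2) as a 4-component state vector
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

def measure_both(state, rng):
    """Sample a joint measurement of both qubits in the computational basis."""
    probs = np.abs(state) ** 2
    outcome = rng.choice(4, p=probs)      # index into |00>, |01>, |10>, |11>
    return outcome // 2, outcome % 2      # (qubit A result, qubit B result)

samples = [measure_both(phi_plus, rng) for _ in range(1000)]
print(all(a == b for a, b in samples))    # True: the outcomes are perfectly correlated
```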

It has often been argued that entanglement is at the heart of quantum computing.  This credo has caused quite a bit of grief for the company D-Wave, which lays claim to shipping the first commercially available quantum computer. Although their erstwhile fiercest critic Scott Aaronson has made peace with them, he has expressed that he would still like to see a measure for the degree of entanglement achieved on their chip.

It is therefore quite surprising to see papers like this one, recently published in Nature Physics, that describe quantum discord as an optimal resource for quantum information processing. At first glance, some of this seems to be due to semantics.  For instance, John Preskill refers in his overview paper to all non-classical correlations as entanglement, but strictly speaking the term entanglement would never be applied to separable states. However, the paper demonstrates, theoretically as well as experimentally, that separable two-qubit states with non-vanishing quantum discord can be found that offer better performance for their test case of quantum teleportation than a fully entangled state:

Experimentally achieved Remote State Preparation payoff for 58 distinct states of a Bloch sphere. Shown are the respective values for the two resource states $ \tilde{p}_w $(red) and $ \tilde{p}_B $ (blue). The dashed lines represent the theoretical expectations. There is a clear separation between the two resource states, which indicates that the separable state $ \tilde{p}_w $ is a better resource for RSP than the entangled state $ \tilde{p}_B $.
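
For readers who want the formal definition behind the term (my summary of the standard Ollivier-Zurek formulation, not spelled out in the excerpt above): quantum discord is the gap between two classically equivalent expressions for the mutual information,

$ \mathcal{D}_B(\rho_{AB}) = \mathcal{I}(\rho_{AB}) - \max_{\{\Pi_k^B\}} \mathcal{J}(\rho_{AB}|\{\Pi_k^B\}) $

where $ \mathcal{I} $ is the quantum mutual information and $ \mathcal{J} $ is the information about A that the best possible measurement on B can extract. Discord vanishes for purely classical correlations, but it can be non-zero even for separable states such as $ \tilde{p}_w $.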

This raises the exciting prospect of a new approach to quantum computing that may not require the notoriously difficult preservation of entangled states,  giving hope that there may yet be a viable approach to quantum computing for the rest of us without requiring helium cooling infrastructure. Subsequently, quantum discord has become a research hot topic that spawned a dedicated site that helps keep track of the publications in this area.

At this point it is not obvious (at least to me) what impact these new insights on quantum discord will have in the long run, i.e. how do you develop algorithms to take advantage of this resource, and how will it, for instance, affect the channel capacity for quantum communication? (For a take on the latter see R. Tucci's latest papers.)

What seems clear, though, is that D-Wave has one more good argument to stress the inherent quantumness of their device.

There is a really poorly produced video lecture available on this subject by a co-author of the quantum discord paper. (If only the presenter would stop moving, the camera would not have to be constantly and noisily adjusted.) Possibly the point of this dismal production value is to illustrate headache-inducing discord. In that case the University of Oxford certainly succeeded spectacularly.

From the Annals of the Impossible (Experimental Physics)

Updated below.

Photons only slowly emerge from the Sun’s core. Neutrinos just pass through once produced.

Radioactive decay is supposed to be the ultimate random process, immutably governed by an element's half-life and nothing else.  There is no way to determine when a single radioactive atom will spontaneously decay, nor any way to speed up or slow down the process.  This ironclad certainty has always been the best argument of opponents of conventional nuclear fission power generation, as it means that the inevitable nuclear waste will have to be kept isolated from the biosphere for thousands of years (notwithstanding recent research attempts at stimulated transmutation of some of the longer-lived waste products).

When plotting the activity of a radioactive sample, you expect a graph like the following: a smooth decrease with slight, random variations.

Detected activity of the 137Cs source. The first two points correspond to the beginning of data taking. Dotted lines represent a 0.1% deviation from the exponential trend. Residuals (lower panel) of the measured activity to the exponential fit. Error bars include statistical uncertainties and fluctuations.

(This graph stems from a measurement of the beta decay of 137Cs and was taken deep underground.)

What you don’t expect are variations that follow a discernible pattern in the decay rate of a radioactive element, nor any correlation with outside events. But this is exactly what Jere H. Jenkins et al. found:

Plot of measured 36Cl decays taken at the Ohio State University Research Reactor (OSURR). The crosses are the individual data points, and the blue line is an 11-point rolling average. The red curve is the inverse of the square of the Earth–Sun distance. (Error bars are only shown for a limited number of points).

And now this surprising result of the sun’s influence has been corroborated.

The latest research was a collaboration of Stanford and Purdue University with the Geological Survey of Israel, rather reputable research powerhouses that make these results difficult to dismiss. Their paper contains the following contour graph of the measured gamma decay during the day, plotted over several years. When comparing this with the same kind of graph of the sun's inclination during the observed date range, the correlation is quite obvious:

Gamma measurements as a function of date and time of day; the color bar gives the power S of the observed signal (top). Solar elevation as a function of date and time of day; the color bar gives the power S of the observed signal (bottom).

There is a video talk on this phenomenon available online.  It takes some patience to sit through, but it gives a more complete picture, explaining how these observed patterns can be correlated to the Sun's core activity with surprising accuracy.

The evidence for the reality of this effect is surprisingly good, and that is rather shocking. It does not fit into any established theory at this time.

Update and Forums Round-Up

This was the second blog post from this site to be picked up on slashdot (this was the first one). Last time around, WordPress could not handle the load (the so-called slashdot effect). Subsequently I installed the W3 Total Cache plug-in. So before getting back to the physics, I want to use this space to give them a big shout-out.  If you operate a WordPress blog, I can highly recommend this plug-in.

This article received almost 30,000 views over two days. The resulting discussions fleshed out some great additional information, but also highlighted what can easily be misread or misconstrued. Top of the list was the notion that this might undermine carbon dating.  For all practical purposes, this can be categorically ruled out. For the effect to be noticeable, the phenomenon would have to be much more pronounced.  The proposed pattern is just slightly outside the error bars and only imposes a slight variation on top of the regular decay pattern.  Archaeologists should not lose sleep over this. An unintended side effect was that this attracted creationists. If you adhere to this belief, please don't waste your time commenting here.  This is a site dedicated to physics, and off-topic comments will be treated like spam and deleted.

Another source of confusion was the difference between induced radioactive reactions and spontaneous decay. The latter is what we are supposed to see when measuring the decay of a radioactive isotope in the lab, and this is what these papers address. Induced transmutation is what can be observed when matter is, for instance, irradiated with neutrons.  This process is pretty well understood and happens as a side effect within nuclear reactors (or even a nuclear warhead, before the fission chain reaction overwhelms all other neutron absorption).  The treatment of nuclear waste with a neutron flux is what I hinted at in the last sentence of the first paragraph.  This emerging technology is very exciting and merits its own article, but it is an entirely different story. The news buried in the papers discussed here is that there may be a yet unknown neutrino absorption reaction influencing decay rates that were believed to be governed solely by the half-life.  At this point an inverse beta decay is known to exist, but its reaction rate is much smaller than what would be required to explain the phenomenon that these papers claim.

The spontaneous decay of a radioactive isotope is regarded as the gold standard for randomness in computer science, and there are some products that rely on this (h/t to Dennis Farr for picking up on this).  That is, if the decay rate of a lump of radioactive material is no longer governed by the simple function $ N(t) = N_0 2^{-t/t_{1/2}} $, then the probability distribution that these random number generators rely on is no longer valid (the decay constant used in the distribution function at the link relates to the half-life via $  t_{1/2} = \frac{\ln 2}{\lambda} $).
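
To illustrate what such generators assume (a toy sketch of mine, not a description of any actual product): the waiting times between decays should follow an exponential distribution fixed entirely by the half-life.

```python
import math
import random

def decay_waiting_times(n, half_life, rng=random.Random(42)):
    """Sample n spontaneous-decay waiting times (same units as half_life).

    Assumes the textbook law N(t) = N0 * 2**(-t / t_half), i.e. exponentially
    distributed waiting times with decay constant lam = ln(2) / t_half.
    Any solar modulation of the decay rate would show up as a deviation from
    this distribution -- the very assumption such hardware RNGs rely on.
    """
    lam = math.log(2) / half_life
    return [rng.expovariate(lam) for _ in range(n)]

times = decay_waiting_times(100_000, half_life=30.08)  # 137Cs half-life in years
print(sum(times) / len(times))  # ~ 30.08 / ln 2 = 43.4, the mean lifetime
```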

There were various thoughtful critical comments on the methodology and experimental set-up. The most prominent point was the contention that this was essentially the outcome of data-mining for patterns and then hand-picking the results that showed some discernible pattern.  Ironically, this approach is exactly the kind of data processing that spawned a billion dollar industry catering to the commercial Business Intelligence market.  To me, this actually looks like a pretty smart approach to get some more mileage out of old data series (assuming the authors didn't discard results diametrically opposed to their hypothesis). The downside is the lack of information on the care that went into collecting this data in the first place.  For instance, it was repeatedly pointed out that the experimenters should have run a control to capture the background radiation, and needed to understand and control for the environmental impact on their measuring devices. Relying on third party data also means relying on the reputation of the researchers who conducted the original experiments.

When the original claims were made, they triggered follow-up research. Some of it was inconclusive, some of it contradicted the findings, and a measurement performed on the Cassini probe's 238Pu radioisotope thermoelectric fuel clearly ruled out any sun-distance related influence on that alpha emitter.

Inevitably, with controversial results like this, the old adage that "extraordinary claims require extraordinary proof" is repeatedly dragged out.

I always thought this statement was cut a bit short and should really read: "Extraordinary claims require extraordinary proof and merit extraordinary attention."

Because without the latter, sufficient proof may never be acquired, even if it is out there. The experiments required to test this further are not expensive. An easy way to rule out seasonality is to perform these measurements closer to the equator, or to have them performed at the same time in a North American and a South American lab, as one slashdot poster suggested.

Ultimately, a beta emitter measurement on another space probe could lay this to rest and help to conclusively determine if this is a real effect.  It would be very exciting if this could be confirmed, but it is certainly not settled at this point.

The Unbearable Lightness of Quantum Mechanics

Updated below.

Gravity and Quantum Mechanics don't play nice together. Since Einstein's time, we have had two towering theories that have defied all attempts by some very smart people to reconcile them. The Standard Model, built on the foundations of quantum mechanics, has been spectacularly successful. It allows the treatment of masses acquired from binding energies and, if the Higgs boson confirmation pans out, accounts for the elementary rest masses – but it does not capture gravity. (The current mass generation models that involve gravity are all rather speculative at this point.)

Einstein’s General Relativity has been equally successful in explaining gravity as innate geometric attributes of space and time itself. It has survived every conceivable test and made spectacular predictions (such as gravity lenses).

On the surface, this dysfunctional non-relationship between the two major generally accepted theoretical frameworks seems very puzzling. But it turns out that the nature of this conundrum can be described without recourse to higher math (or Star-Trek-like animations with a mythical soundtrack).

Much has been written about the origin of this schism: the historic struggle over the interpretation of quantum mechanics, with Einstein and Bohr as the figureheads of the divided physics community at the time. Mendel Sachs (who, sadly, passed away recently) drew the following distinction between the philosophies of the two factions:

[The Copenhagen Interpretation views] Schroedinger’s matter waves as [complex] waves of probability. The probability was then tied to quantum mechanics as a theory of measurement – made by macro observers on micro-matter. This view was then in line with the positivistic philosophy, whereby the elements of matter are defined subjectively in terms of the measurements of their properties, expressed with a probability calculus. […] Einstein’s idea [was] that the formal expression of the probability calculus that is called quantum mechanics is an incomplete theory of matter that originates in a complete [unified] continuous field theory of matter wherein all of the variables of matter are ‘predetermined’.

(From Quantum Mechanics and Gravity)

These days, the Copenhagen Interpretation no longer reigns supreme, but has some serious competition: e.g. one crazy new kid on the block is the Many Worlds Interpretation.  (For an insightful take on MWI, I highly recommend this recent blog post from Scott Aaronson.)

But the issue goes deeper than that. No matter what interpretation you favor, one fact remains immutable: probabilities will always be additive; mathematically, they behave in a linear fashion. This, despite its interpretative oddities, makes quantum mechanics fairly easy to work with.  On the other hand, general relativity is an intrinsically non-linear theory.  It describes a closed system in which the field, generated by gravitating masses, propagates with finite speed and, in the general non-equilibrium picture, dynamically affects these masses, in turn rearranging the overall field expression.  (Little wonder Einstein's field equations only yield analytical solutions for drastically simplified scenarios.)

There is no obvious way to fit Quantum Mechanics, this linear peg, into this decidedly non-linear hole.
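
Schematically (my own shorthand, not a quote from Sachs): if $ \psi_1 $ and $ \psi_2 $ solve the Schroedinger equation, then so does any superposition,

$ i\hbar \, \partial_t (\alpha\psi_1 + \beta\psi_2) = \hat{H} (\alpha\psi_1 + \beta\psi_2) $

whereas the Einstein tensor is non-linear in the metric, so that in general

$ G_{\mu\nu}[g^{(1)} + g^{(2)}] \neq G_{\mu\nu}[g^{(1)}] + G_{\mu\nu}[g^{(2)}] $

– superimposing two gravitational field solutions does not give you another solution.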

Einstein considered Quantum Mechanics a theory that would prove to be an approximation of a fully unified field theory.  He spent his last years chasing this goal, but never achieved it. Mendel Sachs claims to have succeeded where he failed, and indeed presents some impressive accomplishments, including a way to derive the quantum mechanical structure from extended General Relativity field equations.  What always struck me as odd is how little resonance this generated, although this clearly seems to be an experience shared by other theoretical physicists who work off the beaten path. For instance, Kingsley Jones approaches this same conundrum from a completely different angle in his original paper on Newtonian Quantum Gravity. Yet the citation statistics show that there was little up-take.

One could probably dedicate an entire blog to speculating on why this kind of research does not break into the mainstream, but I would rather end this with the optimistic notion that, in the end, new experimental data will hopefully rectify this situation. Although the experiment on a neutral-particle Bose-Einstein condensate proposed in Kingsley Jones' paper has little chance of being performed unless it garners some more attention, other experiments to probe the domain where gravity and quantum mechanics intersect get a more lofty treatment: for instance, this paper was featured in Nature although its premise is probably incorrect. (Sabine Hossenfelder took Nature and the authors to task on her blog – things get a bit acrimonious in the comment section.)

Nevertheless, it is encouraging to see such high profile interest in these kinds of experiments; chances are we will get it right eventually.

Update

Kingsley Jones (who’s 1995 paper paper I referenced above) has a new blog entry that reflects on the historic trajectory and current state of quantum mechanics.  I think it’s fair to say that he does not subscribe to the Many World Interpretation.


Diamonds are a Qubit’s Best Friend

A Nitrogen Vacancy in a diamond crystal isolates the nuclear spin yet makes it accessible via the hyperfine coupling.

A while ago, I looked into the chances that there would ever be a quantum computer for the rest of us. The biggest obstacle is the ultra-low temperature regime that all current quantum computing realizations require.  Although a long shot, I speculated that high-temperature superconductors might facilitate a D-Wave-like approach at temperature regimes that could be achieved with relatively affordable nitrogen cooling. Hoping for quantum computing at room temperature seemed out of the question. But this is exactly the tantalizing prospect that the recent qubit realization within a diamond's crystal structure is hinting at – no expensive cooling required at all. So, ironically, the future quantum computer for the rest of us may end up being made of diamond.

A qubit requires a near perfectly isolated system, i.e. essentially any interaction with the environment destroys the quantum information by randomly transitioning the pure qubit quantum state to a mixed ensemble state (a random statistical mixture of wavefunctions).  The higher the temperature, the more likely these unwanted interactions become, via increased Brownian motion and thermal background radiation, a process known as decoherence.  Solid state qubit realizations are therefore always conducted at temperatures close to absolute zero, and require expensive helium cooling.  Even under these conditions, qubits realized on superconducting chips don't survive for very long; their typical coherence time is measured in microseconds. On the other hand, ion-based systems can go for several minutes. While this is an obvious advantage, the challenges in using this design for quantum computing lie in the ability to initialize these systems into a known state, and in the read-out detection sensitivity.  But great strides have been made in this regard, and in the same Science issue in which the diamond results were presented, an article was published that demonstrates a suitable system exhibiting quantum information storage for over 180 seconds.
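
A toy picture of what decoherence does to a qubit (my own sketch, with coherence times chosen only to convey orders of magnitude): the off-diagonal terms of the density matrix, which carry the quantum information, decay away while the classical populations survive.

```python
import numpy as np

def dephase(rho, t, T2):
    """Apply pure dephasing for a time t to a single-qubit density matrix rho.

    The off-diagonal (coherence) terms decay as exp(-t / T2) while the
    populations on the diagonal stay put -- a toy model of how environmental
    noise turns a pure superposition into a classical mixture.
    """
    damp = np.exp(-t / T2)
    out = rho.astype(complex).copy()
    out[0, 1] *= damp
    out[1, 0] *= damp
    return out

plus = np.array([[0.5, 0.5], [0.5, 0.5]])   # |+><+|, a pure superposition
print(dephase(plus, t=1e-6, T2=20e-6))      # T2 of tens of microseconds: superconducting-chip ballpark
print(dephase(plus, t=1e-6, T2=1.0))        # T2 of about a second: the diamond result discussed here
```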

All these coherence times are very sensitive to even the slightest temperature increase i.e. every millikelvin matters.  (This graph illustrates this for the first commercially available quantum computing design).

Key is to not have too many point defects.

It is in this context that the result of successful qubit storage in a diamond lattice is almost breathtaking: a coherence time of over one second at room temperature. Harvard is not known for cutting-edge experimental quantum computing research, so this result is surprising in more than one respect.

The diamond in question is artificially made and needs to contain some designer irregularities (but not too many of them): these point defects replace a carbon atom in the diamond's crystal lattice with a nitrogen atom. If there are no other nitrogen vacancies nearby, the nuclear spin of this atom is very well isolated. Rivaling this isolation would otherwise require temperatures close to absolute zero. On the other hand, this atom's extra electron can readily interact with EM fields, and this is exactly what the researchers exploited. But there is more to it.

The really intriguing aspect is that this nuclear spin qubit can in turn be made to interact with the spin states of the excess electron, and the coherence times of both can be individually enhanced by suitably tuned laser exposure. I came across this popular science write-up that does an excellent job of explaining the different coupling mechanisms (long time readers know that I am rather critical of what usually passes as popular science, so I am delighted when I find something that I can really recommend).

The original paper concludes that additional coherence enhancing techniques could yield  jaw-dropping qubit storage of up to 36 hours at room temperature.

Of course, when everything else fails, physicists can always fall back on this novel approach, a song designed to scare a qubit to never come out of its coherent state:

Lies, Damned Lies, and Quantum Statistics?

Statistics has had a bad reputation for a long time, as demonstrated by Mark Twain's famous quote[1] that I paraphrased for the title of this blog post. Of course, physics is supposed to be above the fudging of statistical numbers to make a point.  Well, on second thought, theoretical physics should be above fudging (in the experimental branch, things are not so clear cut).

Statistical physics is strictly about employing all mathematically sound methods to deal with uncertainty. This program turned out to be incredibly powerful and gave a solid foundation to the thermodynamic laws.  The latter had been derived empirically before, but only really started to make sense once statistical mechanics came into its own and temperature was understood to arise from the random motion of molecules. Incidentally, this was also the field that first attracted a young Einstein's attention. Among all his other accomplishments, his paper on Brownian motion, which finally settled the debate over whether atoms were real or just a useful model, is often overlooked. (It is mindboggling that within a short span of just 40 years ('05-'45) science went from finally accepting the reality of atoms to splitting them and unleashing nuclear destruction.)

Having cut his teeth on statistical mechanics early in his career, it shouldn't come as a surprise that Einstein's last great contribution to physics went back to this field. And it all started with fudging the numbers, in a far remote place, one that Einstein had probably never even heard of.

In the city that is now the capital of Bangladesh, a brilliant but entirely unknown scholar named Satyendra Nath Bose made a mistake when trying to demonstrate to his students that the contemporary theory of radiation was inadequate and contradicted experimental evidence.  It was a trivial mistake, simply a matter of not counting correctly. To add insult to injury, it led to a result that was in accordance with the correct electromagnetic radiation spectrum. A lesser person might have just erased the blackboard and dismissed the class, but Bose realized that there was some deeper truth lurking beneath the seemingly trivial oversight.

What Bose stumbled upon was a new way of counting quantum particles.  Conventionally, if you have two particles that can each take on only two states, you can model them as you would the probabilities of a coin toss. Let's say you toss two coins at the same time; the following table shows the possible outcomes:

                 Coin 1
                 Head   Tail
 Coin 2   Head    HH     HT
          Tail    TH     TT

It is immediately obvious that if you throw two coins, the combination head-head will have a likelihood of one in four.  But if you have the kind of "quantum coins" that Bose stumbled upon, then nature behaves rather differently.  Nature does not distinguish between the states tails-heads and heads-tails, i.e. the two mixed outcomes HT and TH in the table.  Rather, it treats these two states as one and the same.

In the quantum domain nature plays the ultimate shell game. If these shells were bosons the universe would not allow you to notice if they switch places.

This means that rather than four possible outcomes, in the quantum world we only have three, and the probability is spread evenly among them, i.e. our heads-heads quantum coin toss now has a one-third chance.
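
A few lines of code make the two ways of counting explicit (a toy enumeration of mine, not part of the original post):

```python
from itertools import product

outcomes = list(product("HT", repeat=2))            # classical: HH, HT, TH, TT

# Distinguishable coins: all four ordered outcomes are equally likely.
p_hh_classical = outcomes.count(("H", "H")) / len(outcomes)   # 1/4

# Bose's "quantum coins": HT and TH are one and the same state, so only the
# three unordered occupations remain, each equally likely.
bose_states = {tuple(sorted(o)) for o in outcomes}            # {HH, HT, TT}
p_hh_bose = 1 / len(bose_states)                              # 1/3

print(p_hh_classical, p_hh_bose)                              # 0.25 0.333...
```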

Bose found out the hard way that if you try to publish something that completely goes against the  conventional wisdom, and you have to go through a peer review process, your chances of having your paper accepted are almost nil (some things never change).

That’s where Einstein came into the picture.  Bose penned a very respectful letter to Einstein, who at the time was already the most famous scientist of all time, and well on his way to becoming a pop icon (think Lady Gaga of Science).  Yet, against all odds, Einstein read his paper and immediately recognized its merits.  The rest is history.

In his subsequent paper on the quantum theory of the monatomic ideal gas, Einstein clearly delineated these new statistics and highlighted the contrast to the classical counting that produces unphysical results in the form of the ultraviolet catastrophe. He then applied the new statistics to the ideal gas model, uncovering a new quantum state of matter that would only become apparent at extremely low temperatures.
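
For reference, the statistics Einstein worked out assign the following mean occupation number to a single-particle state of energy $ \epsilon $ (the standard textbook form, not quoted in the post itself):

$ \langle n(\epsilon) \rangle = \frac{1}{e^{(\epsilon - \mu)/k_B T} - 1} $

As the temperature drops and the chemical potential $ \mu $ approaches the ground state energy, this occupation number diverges – the mathematical signature of the macroscopic ground-state occupation we now call the Bose-Einstein condensate.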

His audacious work set the stage for the discovery of yet another fundamental quantum statistic, the one that governs fermions, and set experimental physics on the track of achieving ever lower temperature records in order to find the elusive Bose-Einstein condensate.

This in turn gave additional motivation to the development of better particle traps and laser cooling, key technologies that are still at the heart of the NIST quantum simulator.

All because of one lousy counting mistake …

[1] Actually, the source of the quote is somewhat murky – yet it was clearly inducted into popular culture thanks to Twain (h/t to my fact-checking commenters).

UPDATE: Because this site got slashdotted, new comments are currently not showing up in my heavily customized WordPress installation – I get to see them in the admin view and can approve them, but they are still missing here.

My apologies to everybody who took the time to write a comment! Like most bloggers I love comments so I’ll try to get this fixed ASAP.


For the time being, if you want to leave a comment please just use the associated slashdot story.

The comment functionality has been restored.

Explaining Quantum Computing – Blogroll Memory Hole Rescue

So what is quantum computing?

This is the most dreaded question for anybody involved in this field when posed by a friend or relative without a physics background.  When I am too tired or busy to make an honest effort, I usually answer by quoting Feynman's quip on quantum mechanics (click here to listen to the man himself – the quote appears about 6:45 min into the lecture):

“A lot of people understand the theory of relativity in some way or other.  (..) On the other hand, I think I can safely say that nobody understands quantum mechanics.”

The crux of the matter is that Quantum Computing derives its power from precisely the same attributes that make this realm of physics so alien to us.

It’s small comfort that greater minds than mine have been mulling this conundrum. For instance, a while back Scott Aaronson described his struggle to write a column for the New York Times that describes Quantum Computing.

Then there is Michael Nielsen’s take on it, a brilliant write-up illustrating why there is really no reason to expect a simple explanation.

But if this hasn’t utterly discouraged you then I have this little treat from D-Wave’s blog. You need to be willing to tolerate a little math, understanding that an expression like this

$ \displaystyle \sum_{i} x_{i} $

means you are summing over a bunch of variables x indexed by i

$ \displaystyle \sum_{i} x_{i} = x_{0}+x_{1}+x_{2}+ \ldots $

Other than that, it just requires you to contemplate Schroedinger's light switches.  Just as his cat can be thought of as dead and alive at the same time, his light switches are in a superposition of on and off.  Strictly speaking, D-Wave's description is specific to their particular adiabatic quantum chip design, but nevertheless, if you get your head around this, you will have a pretty good idea why a quantum computer's abilities go beyond the means of a classical Turing machine.
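
To connect the two halves of that explanation, here is a tiny sketch (hypothetical weights of my own choosing, and a brute-force classical stand-in for what the quantum hardware does): the machine is handed a weighted sum over on/off switch variables and asked for the configuration that minimizes it.

```python
from itertools import product

# Hypothetical weights h_i for three "light switches" x_i in {0, 1};
# the objective is exactly the kind of weighted sum  sum_i h_i * x_i  shown above.
h = [1.0, -2.0, 0.5]

def energy(x, weights):
    """Evaluate sum_i h_i * x_i for one on/off configuration x."""
    return sum(hi * xi for hi, xi in zip(weights, x))

# A classical machine has to walk through the 2**n configurations one by one;
# the adiabatic chip instead starts from a superposition of all of them and
# lets the physics relax into the lowest-energy configuration.
configs = list(product([0, 1], repeat=len(h)))
best = min(configs, key=lambda x: energy(x, h))
print(best, energy(best, h))   # (0, 1, 0) with energy -2.0
```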

More Lost Papers

Lost Papers Dropping from the Page of Time

As promised, the translation of the paper that contains Einstein’s last important contribution to modern physics has been completed.   Paul Terlunen graciously provided the initial translation, and my wife Sara helped immensely with the final editing.

The starting point for this effort was the news that Einstein's hand-written manuscript had been re-discovered.  This was reported in various physics LinkedIn groups, and subsequently there was some interest in looking at the paper in an English translation, but none was readily available.

This is the second part of Einstein’s publications on this topic and I will now start on a translation for the first paper as well.

Nevertheless, if you already have some familiarity with quantum mechanics, you can read this last paper without having worked through the previous one.  It is intriguing to follow the author's line of thought, and to share in his fascination with the puzzling nature of the quantum statistics that were uncovered for the first time. To the modern reader, it is also interesting to see in hindsight where Einstein erred when speculating on the nature of the electron gas; at the time, he did not know that his statistics only applied to bosons, and that electrons would turn out to be fermions.


Vested in IT Security and Losing Sleep Over Quantum Computing?

If so, then this special edition of the IT security magazine hakin9 might be just what the doctor ordered.

Your humble blogger was asked to provide an article and so I contributed the “Who’s Afraid of the Big Bad Quantum Computer?” piece.

It is very smart of the editorial staff of hakin9 to approach bloggers, as we already write material for free and an ample portfolio of sample articles can be readily perused. As with this blog, I do not benefit financially from this in any way. For me, this magazine is just another good venue for getting the word out about this exciting technology that is now becoming an IT reality.