Monthly Archives: September 2012

Transmuting Waste and Worries Away

The philosopher's stone yearned for by alchemists was actually not a stone.

Alchemists of old yearned for the philosopher's stone, a substance of magical quality that would allow them to transmute other elements into gold.  Nowadays, this mythical device would be even more valuable, since its transmuting power could be harnessed to transform long-lived nuclear waste, if not into gold, then at least into less dangerous isotopes.

Scientists at the Belgian nuclear research center SCK CEN in Mol are working to accomplish exactly that. Lacking a philosopher's stone, they are deploying the Swiss army knife of contemporary physics: a particle accelerator that creates fast neutrons for the treatment of problematic nuclear waste.

A modern version of the philosopher's stone: The MYRRHA (Multi-purpose hYbrid Research Reactor for High-tech Applications) concept allows for industrial-scale treatment of nuclear waste. Fast neutrons are produced via nuclear spallation.

By capturing these neutrons, the waste can be transmuted into fission products with much shorter half-lives. The beauty of this concept is that the nuclear waste simultaneously serves as fuel in a sub-critical reactor.  A recent paper on this concept can be found here, and there is also an excellent presentation online (which contains the MYRRHA diagram and radiotoxicity graph).

Not only does this technology allow us to get rid of extremely worrisome long-lived radioactive material, but as a nice side effect it will also alleviate our energy security worries, since this design is well suited to use thorium as fuel.

The following graph illustrates how dramatically this could alter our nuclear waste problem (note: the x-axis is logarithmic).

The requirement for safe storage of nuclear waste could be drastically shortened, from about 200,000 years to a mere 200, if the waste is processed in a suitable spallation reactor.
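
To get a feel for these orders of magnitude, here is a minimal back-of-the-envelope sketch of my own (not from the paper): it compares how long a long-lived actinide such as Pu-239 and a short-lived fission product such as Cs-137 take to decay to a thousandth of their initial activity. The 0.1% threshold is purely illustrative.

```python
import math

def time_to_decay(half_life_years, fraction_remaining):
    """Years until only `fraction_remaining` of the original inventory is left."""
    # N(t) = N0 * 2^(-t / t_half)  =>  t = t_half * log2(1 / fraction)
    return half_life_years * math.log2(1.0 / fraction_remaining)

# Illustrative isotopes (half-lives are textbook values):
# Pu-239, a long-lived actinide typical of untreated waste, versus
# Cs-137, a short-lived fission product typical of transmuted output.
target = 1e-3  # wait until 0.1% of the original activity remains (illustrative threshold)
print(f"Pu-239: ~{time_to_decay(24_100, target):,.0f} years")   # ~240,000 years
print(f"Cs-137: ~{time_to_decay(30.2, target):,.0f} years")     # ~300 years
```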

This flies in the face of conventional wisdom, as well as a world increasingly turned off by conventional nuclear energy in the wake of its latest catastrophe in Fukushima. After all, this new waste treatment still requires a nuclear reactor, and given the track record of this industry, is it a risk worth taking?

To answer this, one must take into consideration that the idea of using a particle accelerator as a neutron source for a nuclear reactor is actually quite old, and significantly predates the recent research in Mol.  I first encountered the concept when preparing a presentation for nuclear physics 101 almost twenty years ago.  My subject was different approaches to inherently safe reactor designs, "safe" in this context defined as the inability of the reactor to engage in a runaway nuclear chain reaction. (The treatment of nuclear waste was not even on the horizon at that point, because the reprocessing necessary to separate the waste material from the depleted fuel rods did not yet exist.)

The ratio of surface to volume is key in determining whether a neutron triggers enough follow-up reactions to sustain a critical, cascading chain reaction.

The idea is simple: design the reactor geometry in such a way that neutrons produced by the reaction escape the reactor vessel before they can spawn more neutrons in follow-up reactions. Then make up the balance by providing enough neutrons from an accelerator-driven reaction to sustain the nuclear fission process.  Once you pull the plug on the accelerator, the fission reaction cannot sustain itself.  Compare this with the situation in Fukushima or Chernobyl.  The latter was a classic runaway fission chain reaction, and the biggest problem at Fukushima was that melted-down fuel can become critical again (there is some indication that this may actually have happened to some degree).
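
A toy calculation illustrates why such a sub-critical, accelerator-driven core is inherently safe: with an effective multiplication factor $k_{eff} < 1$, each source neutron triggers only a finite, geometric cascade, and the fission rate collapses as soon as the external source is switched off. The numbers below are hypothetical and only meant to show the scaling.

```python
def source_multiplication(k_eff: float, source_rate: float) -> float:
    """Steady-state neutron production in a sub-critical core fed by an external source.

    Each source neutron spawns on average k_eff fission neutrons, those spawn
    k_eff**2, and so on; the geometric series sums to 1 / (1 - k_eff) for k_eff < 1.
    """
    if k_eff >= 1.0:
        raise ValueError("core is critical or supercritical; no steady sub-critical state")
    return source_rate / (1.0 - k_eff)

# Hypothetical numbers, chosen only to illustrate the scaling:
spallation_source = 1.0e17      # neutrons per second from the accelerator target
for k in (0.90, 0.95, 0.97):    # sub-critical values of this order are typical for accelerator-driven designs
    print(f"k_eff = {k}: {source_multiplication(k, spallation_source):.2e} n/s")

# Pull the plug on the accelerator (source_rate -> 0) and the chain reaction
# cannot sustain itself; the neutron population simply dies away.
print(source_multiplication(0.95, 0.0))   # 0.0
```

The closer $k_{eff}$ gets to one, the more the external source is amplified, but on its own the core can never tip into a runaway chain reaction.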

Will an inherently safe fission reactor design, one that can melt away the stockpiles of most long-lived nuclear waste into something that needs to be kept in safe storage for only a few hundred years, sway the environmentally motivated opposition to nuclear technology? Doubtful.  Many will argue that this is too good to be true and point to the fly in the ointment: the fact that reprocessing is essential to make this happen, and that doing it on an increased industrial scale will make accidental releases more likely.  Then there will be the talking point that the nuclear industry will simply use this as an opportunity to establish a plutonium fuel cycle (one of the purposes for which reprocessing technology was originally developed).  Not to mention the strongly entrenched ideological notion, especially in some European countries, with Germany topping the list, that humanity is simply too immature to handle this kind of technology. In a way, this is the temporal analog of the Not In My Back-Yard (NIMBY) attitude; maybe it should be called the NIMLT principle, as in Not In My LifeTime: let future generations deal with it.

Do you think a core catcher would have helped in Fukushima?

Of course, the nuclear industry did little to earn any trust, considering what transpired in the wake of the Chernobyl disaster.  In a pathetic display of a "lesson learned", they added "core catchers" to existing blueprints rather than invest R&D dollars in an inherently safe design. Instead of fixing the underlying problem, they simply tried to make the worst imaginable accident more manageable.  It was as if a car manufacturer whose vehicles rarely, but occasionally, explode were to improve the situation in the next model line by adding a bigger fire extinguisher.
* * *

In 2006 France changed its laws and regulations in anticipation of this new technology, and now requires that nuclear waste storage sites remain accessible for at least a hundred years so that the waste can be reclaimed. To me this seems eminently reasonable and pragmatic. The notion that we could safely compartmentalize dangerous nuclear waste for many thousands of years and guarantee that it would stay out of the biosphere always struck me as hubris. This technology offers a glimpse at a possible future where humanity may finally gain the capability to clean up its nuclear messes.

Strong words on Weak Measurements

Even the greatest ships can get it wrong.

Recently, news made the rounds that the Heisenberg uncertainty principle had supposedly been violated (apparently BBC online news is trying to build on its bad track record for news related to quantum mechanics).

Of course this is utter and complete nonsense, and the underlying papers are quite mundane. But all caveats get stripped out in the reporting until only the wrong sensational twist remains.

Yes, Heisenberg did at some point speculate that the uncertainty relationship may be due to the measurement disturbing the system being probed, but this idea has long been relegated to the dustbin of science history, and Robert R. Tucci deservedly demolishes it.

I am now breathlessly awaiting the results of this crack team revisiting the Inquisition's test for witchcraft by attempted drowning.  I bet they'll find this doesn't hold up either. Cutting edge science.



Order from Quantum Discord

Have they no shame?
Even our children are already indoctrinated by popular Disney cartoons to think that Discord is bad. Borivoje Dakić et al. beg to differ.

Conventional hardware thrives on our ability to create precision structures in the micro domain.  Computers are highly ordered and usually (for good reason) regarded as perfectly deterministic in the way that they process information. After all, the error rate of the modern computer is astronomically low.

Even before the advent of quantum computing, it was discovered that this defining feature of our hardware can sometimes be a disadvantage: randomized algorithms can outperform deterministic ones. Computers actually gain functionality by being able to use randomness as an information processing resource.
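
A classic illustration (my own example, not one discussed here) is Freivalds' algorithm: a coin-flipping check that verifies a matrix product far more cheaply per trial than recomputing the product outright, at the price of a tiny, controllable error probability.

```python
import numpy as np

def freivalds_check(A, B, C, trials=20):
    """Probabilistically verify that A @ B == C (Freivalds' algorithm).

    Each trial costs three matrix-vector products, O(n^2), versus O(n^3) for
    naively recomputing A @ B. A wrong C survives a single trial with
    probability at most 1/2, so the error after `trials` rounds is <= 2**-trials.
    """
    n = C.shape[1]
    for _ in range(trials):
        r = np.random.randint(0, 2, size=(n, 1))   # random 0/1 column vector
        if not np.array_equal(A @ (B @ r), C @ r):
            return False                            # mismatch found: definitely wrong
    return True                                     # correct with high probability

# Usage sketch (integer matrices keep the comparison exact)
n = 500
A = np.random.randint(0, 10, size=(n, n))
B = np.random.randint(0, 10, size=(n, n))
C = A @ B
print(freivalds_check(A, B, C))    # True
C[0, 0] += 1                       # corrupt a single entry
print(freivalds_check(A, B, C))    # almost certainly False
```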

Due to the fundamentally probabilistic nature of quantum mechanics, randomness is inherent in every quantum computing design. On the other hand, most quantum computing algorithms exploit one of the most fragile, ordered physical states: entanglement, the peculiar quantum mechanical phenomenon whereby two systems can be entwined in a common quantum state. It is characterized by perfect correlation of spatially or temporally separated measurements.  The simplest protocol to exploit this feature is the quantum information channel, and it results in some quite surprising and, as is so often the case with quantum mechanics, counter-intuitive results. For instance, if two parties are connected via two very noisy directional channels with zero quantum information capacity, the participants will still be able to establish a qubit flow via entanglement distillation.
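
For concreteness, here is a minimal numpy sketch (my own, not from any of the papers) of the "perfect correlation" property for the maximally entangled Bell state $|\Phi^+\rangle = (|00\rangle + |11\rangle)/\sqrt{2}$, assuming both qubits are measured in the computational basis.

```python
import numpy as np

# Maximally entangled Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)

# Joint probabilities of computational-basis outcomes (00, 01, 10, 11)
probs = np.abs(phi_plus) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(outcome, p)

# Only "00" and "11" ever occur: the two measurements always agree,
# no matter how far apart the two qubits are when they are measured.
```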

It has often been argued that entanglement is at the heart of quantum computing.  This credo has caused quite a bit of grief for the company D-Wave, which lays claim to shipping the first commercially available quantum computer. Although their erstwhile fiercest critic Scott Aaronson has made peace with them, he has said he would still like to see a measure of the degree of entanglement achieved on their chip.

It is therefore quite surprising to see papers like this one, recently published in Nature Physics, describing quantum discord as an optimal resource for quantum information processing. At first glance, some of this seems to be due to semantics.  For instance, John Preskill refers in his overview paper to all non-classical correlations as entanglement, but strictly speaking the term entanglement would never be applied to separable states. However, the paper demonstrates, theoretically as well as experimentally, that separable two-qubit states with non-vanishing quantum discord can be found that offer better performance for their test case of remote state preparation than a fully entangled state:

Experimentally achieved Remote State Preparation payoff for 58 distinct states on the Bloch sphere. Shown are the respective values for the two resource states $\tilde{p}_w$ (red) and $\tilde{p}_B$ (blue). The dashed lines represent the theoretical expectations. There is a clear separation between the two resource states, which indicates that the separable state $\tilde{p}_w$ is a better resource for RSP than the entangled state $\tilde{p}_B$.

This raises the exciting prospect of a new approach to quantum computing that may not require the notoriously difficult preservation of entangled states, giving hope that there may yet be viable quantum computing for the rest of us, without the need for helium cooling infrastructure. Subsequently, quantum discord has become a research hot topic and has spawned a dedicated site that helps keep track of the publications in this area.

At this point it is not obvious (at least to me) what impact these new insights into quantum discord will have in the long run, i.e. how one develops algorithms to take advantage of this resource, and how it will, for instance, affect the channel capacity for quantum communication (for a take on the latter see R. Tucci's latest papers).

What seems clear, though, is that D-Wave has one more good argument to stress the inherent quantumness of their device.

There is a really poorly produced video lecture on this subject by a co-author of the quantum discord paper. (If only the presenter stopped moving around, the camera would not have to be constantly and noisily adjusted.) Possibly the point of this dismal production value is to illustrate headache-inducing discord; in that case the University of Oxford certainly succeeded spectacularly.

From the Annals of the Impossible (Experimental Physics)

Updated below.

Photons only slowly emerge from the Sun's core. Neutrinos just pass through once produced.

Radioactive decay is supposed to be the ultimate random process, immutably governed by an element's half-life and nothing else.  There is no way to determine when a single radioactive atom will spontaneously decay, nor any way to speed up or slow down the process.  This ironclad certainty has always been the best argument of opponents of conventional nuclear fission power generation, as it means that the inevitable nuclear waste will have to be kept isolated from the biosphere for thousands of years (notwithstanding recent research into stimulated transmutation of some of the longer-lived waste products).

When plotting the activity of a radioactive sample, you expect a graph like the following: a smooth decrease with slight, random variations.

Detected activity of the 137Cs source. The first two points correspond to the beginning of data taking. Dotted lines represent a 0.1% deviation from the exponential trend. Residuals (lower panel) of the measured activity to the exponential fit. Error bars include statistical uncertainties and fluctuations.

(This graph stems from a measurement of the beta decay of 137Cs and was taken deep underground.)
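
For comparison, here is a small simulation (with made-up detector numbers) of what such a counting experiment should produce: a smooth exponential trend whose residuals show nothing but Poisson counting scatter.

```python
import numpy as np

# Illustrative simulation of a long-running counting experiment:
# a smooth exponential trend plus Poisson (counting-statistics) noise.
rng = np.random.default_rng(0)

half_life_days = 30.17 * 365.25              # Cs-137 half-life, in days
lam = np.log(2) / half_life_days             # decay constant per day
t = np.arange(0, 3 * 365)                    # three years of daily measurements
expected_counts = 1.0e6 * np.exp(-lam * t)   # hypothetical detector count rate

measured = rng.poisson(expected_counts)      # add counting noise
residuals = (measured - expected_counts) / expected_counts

print(f"typical scatter: {residuals.std():.2e}")   # ~1/sqrt(1e6), i.e. ~0.1%
# Any systematic annual wiggle in `residuals` would be the anomaly the
# papers claim; pure Poisson noise shows none.
```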

What you don't expect are variations that follow a discernible pattern in the decay rate of a radioactive element, nor any correlation with outside events. But this is exactly what Jere H. Jenkins et al. found:

Plot of measured 36Cl decays taken at the Ohio State University Research Reactor (OSURR). The crosses are the individual data points, and the blue line is an 11-point rolling average. The red curve is the inverse of the square of the Earth–Sun distance. (Error bars are only shown for a limited number of points).

And now this surprising result of the sun's influence has been corroborated.

The latest research was a collaboration of Stanford and Purdue University with the Geological Survey of Israel, rather reputable research powerhouses, which makes these results difficult to dismiss. Their paper contains the following contour graph of the measured gamma decay during the day, plotted over several years. When comparing this with the same kind of graph of the sun's inclination over the observed date range, the correlation is quite obvious:

Gamma measurements as a function of date and time of day; the color bar gives the power, S, of the observed signal (top). Solar elevation as a function of date and time of day; the color bar gives the power, S, of the observed signal (bottom).

There is a video talk on this phenomenon available online.  It takes some patience to sit through, but it gives a more complete picture, explaining how these observed patterns can be correlated with the Sun's core activity with surprising accuracy.

The evidence for the reality of this effect is surprisingly good, and that is rather shocking. It does not fit into any established theory at this time.

Update and Forums Round-Up

This was the second blog post from this site to be picked up on Slashdot (this was the first one). Last time around, WordPress could not handle the load (the so-called Slashdot effect). I have since installed the W3 Total Cache plug-in, so before getting back to the physics, I want to use this space to give its developers a big shout-out.  If you operate a WordPress blog, I can highly recommend this plug-in.

This article received almost 30,000 views over two days. The resulting discussions fleshed out some great additional information, but also highlighted what can easily be misread or misconstrued. Top of the list was the notion that this might undermine carbon dating.  For all practical purposes, this can be categorically ruled out: for it to have a noticeable effect, the phenomenon would have to be much more pronounced.  The proposed pattern is just slightly outside the error bars and only imposes a slight variation on top of the regular decay pattern.  Archaeologists should not lose sleep over this. An unintended side effect was that this attracted creationists. If you adhere to this belief, please don't waste your time commenting here.  This is a site dedicated to physics, and off-topic comments will be treated like spam and deleted.
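
To put the carbon-dating worry in numbers, here is a rough sanity check of my own, deliberately pessimistic: even if the 14C decay constant were permanently biased by 0.1% (rather than merely wobbling by that much over the year), the inferred ages would shift by only about 0.1%.

```python
import math

T_HALF_C14 = 5730.0          # carbon-14 half-life in years

def radiocarbon_age(fraction_remaining, half_life=T_HALF_C14):
    """Age inferred from the remaining 14C fraction: t = t_half * log2(1/f)."""
    return half_life * math.log2(1.0 / fraction_remaining)

true_age = 5000.0
fraction = 2.0 ** (-true_age / T_HALF_C14)

# Worst case: pretend the decay constant were permanently off by 0.1%
# (the claimed effect is a small periodic wobble, which largely averages out).
biased_age = radiocarbon_age(fraction, half_life=T_HALF_C14 * 1.001)
print(f"shift: {biased_age - true_age:.1f} years on a {true_age:.0f}-year-old sample")
# about 5 years, far below the usual measurement uncertainty of radiocarbon dates
```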

Another source of confusion was the difference between induced radioactive reactions and spontaneous decay. The latter is what we are supposed to see when measuring the decay of a radioactive isotope in the lab, and it is what these papers address. Induced transmutation is what can be observed when matter is, for instance, irradiated with neutrons.  This process is pretty well understood and happens as a side effect within nuclear reactors (or even within a nuclear warhead before the fission chain reaction overwhelms all other neutron absorption).  The treatment of nuclear waste with a neutron flux is what I hinted at in the last sentence of the first paragraph.  This emerging technology is very exciting and merits its own article, but it is an entirely different story. The news buried in the papers discussed here is that there may be a yet unknown neutrino absorption reaction influencing decay rates that were believed to be governed only by the half-life.  At this point an inverse beta decay is known to exist, but its reaction rate is far too small to explain the phenomenon these papers claim.

The spontaneous decay of a radioactive isotope is regarded as the gold standard for randomness in computer science, and there are some products that rely on this (h/t to Dennis Farr for picking up on this).  That is, if the decay rate of a lump of radioactive material is no longer governed by the simple function $N(t) = N_0 2^{-t/t_{1/2}}$, then the probability distribution that these random number generators rely on is no longer valid (the decay constant used in the distribution function at the link relates to the half-life via $t_{1/2} = \frac{\ln 2}{\lambda}$).
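
As a sketch of what is at stake for such hardware random number generators: the waiting times between decays are exponentially distributed with a rate set by $\lambda$, and one simple (purely illustrative) extraction scheme compares consecutive waiting times to produce unbiased bits. The numbers and the extraction scheme below are my own assumptions, not a description of any particular product.

```python
import numpy as np

rng = np.random.default_rng(42)

t_half = 30.17              # half-life in years (Cs-137, for illustration)
lam = np.log(2) / t_half    # decay constant lambda = ln(2) / t_half

# Waiting times between successive decays are exponentially distributed;
# hardware RNGs of this kind extract bits from these waiting times.
waits = rng.exponential(scale=1.0 / lam, size=100_000)

# One simple extraction trick: compare consecutive waiting times to get one
# unbiased bit per pair, assuming the rate is constant, which is exactly the
# assumption the papers call into question.
bits = (waits[0::2] < waits[1::2]).astype(int)
print(f"fraction of ones: {bits.mean():.3f}")   # ~0.5 if the decay law holds
```

If the underlying rate really drifts with the Earth-Sun distance, the exponential model, and hence the assumed uniformity of the extracted bits, is only approximate.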

There were various thoughtful critical comments on the methodology and experimental set-up. The most prominent point was the contention that this was essentially the outcome of data-mining for patterns and then hand-picking results that showed some discernible pattern.  Ironically, this approach is exactly the kind of data processing that spawned a billion-dollar industry catering to the commercial Business Intelligence market.  To me, this actually looks like a pretty smart approach to get some more mileage out of old data series (assuming the authors didn't discard results that contradicted their hypothesis). The downside is the lack of information on the care that went into collecting the data in the first place.  For instance, it was repeatedly pointed out that the experimenters should have run a control to capture the background radiation and needed to understand and control for the environmental impact on their measuring devices. Relying on third-party data also means relying on the reputation of the researchers who conducted the original experiments.
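
For the curious, the kind of pattern search at the heart of this debate essentially boils down to fitting an annual sinusoid to the fractional residuals of the decay rate. Here is a minimal sketch with synthetic data (not the published measurements); the injected 0.1% amplitude is only there to show that the fit can recover such a small modulation.

```python
import numpy as np

def annual_modulation_fit(t_days, residuals):
    """Least-squares fit of an annual sinusoid to fractional decay-rate residuals.

    Model: r(t) = a*cos(w*t) + b*sin(w*t) + c, with w = 2*pi / 365.25 days.
    Returns the fitted amplitude sqrt(a^2 + b^2) and the phase in days.
    """
    w = 2 * np.pi / 365.25
    X = np.column_stack([np.cos(w * t_days), np.sin(w * t_days), np.ones_like(t_days)])
    (a, b, c), *_ = np.linalg.lstsq(X, residuals, rcond=None)
    return np.hypot(a, b), np.arctan2(b, a) / w

# Synthetic check: inject a 0.1% annual wobble plus noise and recover it
rng = np.random.default_rng(1)
t = np.arange(0, 4 * 365, dtype=float)
r = 1e-3 * np.cos(2 * np.pi * (t - 20) / 365.25) + rng.normal(0, 1e-3, t.size)
amp, phase = annual_modulation_fit(t, r)
print(f"recovered amplitude: {amp:.2e}, phase: {phase:.0f} days")
```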

When the original claims were made, they triggered follow-up research. Some of it was inconclusive, some of it contradicted the findings, and a measurement performed on the 238Pu that fuels the Cassini probe's radioisotope thermoelectric generators clearly ruled out any Sun-distance-related influence on that alpha emitter.

Inevitably with controversial results like this, the old adage that "extraordinary claims require extraordinary proof" is repeatedly dragged out.

I always thought this statement was cut a bit short and should really read: "Extraordinary claims require extraordinary proof and merit extraordinary attention."

Because without the latter, sufficient proof may never be acquired even if it is out there. The experiments required to test this further are not expensive. An easy way to rule out seasonality is to perform these measurements closer to the equator, or to have them performed at the same time in a North American and a South American lab, as one Slashdot poster suggested.

Ultimately, a beta emitter measurement on another space probe could lay this to rest and help conclusively determine whether this is a real effect.  It would be very exciting if this could be confirmed, but it is certainly not settled at this point.