Category Archives: Blogroll Rescue

Nuclear Confusion Versus Dead-End Certainty?

While I am working on my next  blog post, this excellent update on the state of fusion research from the Polywell Blog author John Smith shouldn’t go unnoticed.  He makes a strong case that the US is neglecting promising avenues towards self-sustained nuclear fusion as the ITER cost keeps on skyrocketing.  This echoes a similar sentiment that I heard when visiting General Fusion. Nevertheless, I think quitting ITER completely, as John recommends, is unwise.

The US already only has observer status at CERN, so bailing on ITER would sideline the American physics community even more. Despite the cost overruns and irrespective of its commercialisation prospects, ITER will make for one of the most advanced testbeds for plasma physics.  Should the US really shut itself out of having prime access to this machine once it is operational?

John’s post provides an excellent round-up of the various approaches to fusion, and mentions the damage that cold fusion inflicted on the field, a story that deserves a separate article. But there is another plasma phenomenon that some hope could be exploited for nuclear fusion, and it goes unmentioned in John’s otherwise exhaustive post. It shares some commonality with the dubious cold fusion experiments: abysmally bad replicability that severely damaged the reputation of one of the lead researchers in the field. This speculative approach to fusion was recently prominently featured in a surprisingly well researched Gawker article (h/t Ed B.). It mentions some private outfits that are hanging their hats on sonoluminescence, and since the latter phenomenon does, after all, involve micro-cavitation that creates an actual plasma, these companies don’t deserve to be lumped in with the shadier cold fusion hustlers.

However, it is quite apparent that none of these outfits can produce neutrons at a significant rate, unlike PNL’s High Yield Neutron Generator, an already commercially valuable technology. So there is clearly not much reason to get too excited about sonoluminescence unless one of the companies invested in this approach can replicate this feat.

Phoenix Nuclear Lab’s High Yield Neutron Generator, a piece of fusion technology you can buy today. It offers a much cleaner and less messy source of neutrons than any fission-based approach (and it avoids the proliferation headaches that come with the latter).

On balance, the influx of private money into nuclear fusion start-ups is the story here, one that gives hope that humanity may find a way to break its self-defeating fossil fuel habit within our lifetime.


Breaking Science News on the Blogosphere?

Update Below

Wrong!

It has been my long-held belief that science media needs an additional corrective in the form of blogs, similar to the development we’ve seen in the political sphere.  Now it seems the news that the BICEP results, which were heralded as Nobel Prize worthy, may be wrong originated with this blog post.

Certainly big enough news to interrupt my blog hiatus.

Maybe sometimes some results really are too good to be true, and this may turn out to be this year’s version of the faster-than-light neutrinos.

Update

As was to be expected there is now some push-back against these claims, and the authors stand by the paper.

It also illustrates that science is a bit like sausages: sometimes you don’t really want to know exactly what went into it.  At least that’s how I felt when I learned that the source of this controversy is the way data has been scraped from a PDF copy. One could hardly make a better case for why we need a good Open Science infrastructure.

Irrespective, my favorite physics blogger took on the results and put them into context.


Time for Another Blogroll Memory Hole Rescue

CRA
The Canada Revenue Agency is the equivalent of the IRS down South. They owe me money and always make me work to get it.

Unlike in the US, tax returns in Canada are due by the end of April, but because of the Heartbleed bug, Revenue Canada had to take down electronic filing for a while, so the deadline has been extended a bit.  It seems I may need the extra days, as life is keeping me extraordinarily busy. Saturday morning is usually my blogging time, but this weekend I had to look after my kids (my wife Sara was performing Beethoven’s 9th with the Peterborough Symphony), and today my oldest daughter turned seven, filling the day with zoo visits and birthday cake.

At least the bug bought me some more time.

So, in order not to completely abandon this blog, a couple of links to other outstanding science musings are in order. To that end I would like to highlight some posts by Sabine Hossenfelder, a blogging professor of theoretical physics currently teaching in Sweden. Her most recent post discusses some of the structural problems in academia, which in reality is nothing like the commonly held notion of a utopian ivory tower (rather, the tower stands and becomes ever more compartmentalized, but there is nothing utopian about it).

Her post on the Problem of Now makes a nice primer for a long-planned future post of mine on Julian Barbour’s End of Time, because arguably he took “Einstein’s Blunder” and ran with it as far as one can take it.  The man’s biography also ties back to the dilemma of academia, as it really doesn’t allow much space for such deep, out-of-the-mainstream research programs.

Last but not least, I really enjoyed this rant.

And I probably should mention that Sabine also knows how to sing. It obviously takes a physicist to really convey the emotional impact of the agonizing, ongoing demise of SUSY.


Blog Memory Hole Rescue – The Fun is Real

It seems that work and life are conspiring to leave me no time to finish my write-up on my General Fusion visit.  I started it weeks ago, but I am still not ready to hit the publish button on this piece.


In the meantime, I highly recommend the following blog that I came across.  It covers topics very similar to the ones here, and also shares a similar outlook.  For instance, this article beautifully sums up why I never warmed up to Everett’s Multiverse interpretation (although I have to admit that reading Julian Barbour’s End of Time softened my stance a bit – more on this later).

The ‘Fun Is Real’ blog is a cornucopia of good physics writing and should provide many hours of thought-provoking reading material to bridge over the dearth of my current posting schedule.

On a side note, given that this goes to the core of the topic I write about on this blog, the following news should not go unmentioned:  Australian researchers reportedly have created a cluster state of 10,000 entangled photonic qubits (h/t Raptis T.).

This is orders of magnitude more than has previously been reported. Now, if they were to manage to apply some quantum gates to these qubits, we’d be getting somewhere.

The Meaning of Wave Mechanics and the Mongol Physics Project

Mongols knew that a horse was either dead or alive, but never in a state of superposition between the twain.

Kingsley Jones, an Australian theoretical physicist turned entrepreneur, recently introduced what he dubs Mongol physics, a bold undertaking to “fix” QM and QED.

The name is aptly chosen, because if he succeeds in this, most of academic physics will be as taken by surprise as Europe was when the Golden Horde arrived. After all, physics doesn’t perceive these theories as defective, despite the enduring confusion as to what QM interpretation makes the most sense.

Kingsley dubs Erwin Schrödinger “Mongol #1”, and there is a good reason for this. Having just received my copy of his collected papers, the first thing I came across was this little gem that I include below. The fact that it reads just as relevant 60 years later speaks volumes.  The only thing that has changed since then is that clever ways were found to deal with the runaway infinities in QED, so that accurate numbers could be forced out of it. Schrödinger knew better than to hinge any of his arguments on these major technical challenges of the time.  Rather, the article details his discomfort with the Copenhagen interpretation based on very fundamental considerations.  It makes me wonder how he’d feel about the fact that his cat in a box, which he made up to mock the status quo, entered popular culture as a supposedly valid illustration of quantum weirdness.

(Austrian copyright protection expires after 70 years, and since scans of the article are freely accessible at this University of Vienna site, I assume this text has already been placed in the public domain and is hence free for online reproduction.  Please note this is not a translation: Schrödinger was fluent in several languages and originally penned this piece in English.)

THE MEANING OF WAVE MECHANICS
by Erwin Schrödinger
(For the July Colloquium, Dublin 1952)

Louis de Broglie’s great theoretical discovery of the wave phenomenon associated with the electron was followed within a few years, on the one hand by incontrovertible experimental evidence (based on interference patterns) of the reality of the de Broglie waves (Davisson and Germer, G. P. Thomson), and on the other hand by a vast generalization of his original ideas, which embraces the entire domain of physics and chemistry, and may be said to hold the field today along the whole line, albeit not precisely in the way de Broglie and his early followers had intended.

For it must have given to de Broglie the same shock and disappointment as it gave to me, when we learnt that a sort of transcendental, almost psychical interpretation of the wave phenomenon had been put forward, which was very soon hailed by the majority of leading theorists as the only one reconcilable with experiments, and which has now become the orthodox creed, accepted by almost everybody, with a few notable exceptions. Our disappointment consisted in the following. We had believed that the eigenfrequencies of the wave phenomenon, which were in exact numerical agreement with the, until then so called, energy levels, gave a rational understanding of the latter. We had confidence that the mysterious “fit and jerk theory” about the jump-like transition from one energy level to another was now ousted. Our wave equations could be expected to describe any changes of this kind as slow and actually describable processes. This hope was not informed by personal predilection for continuous description, but if anything, by the wish for any kind of description at all of these changes. It was a dire necessity. To produce a coherent train of light waves of 100 cm length and more, as is observed in fine spectral lines, takes a time comparable with the average interval between transitions. The transition must be coupled with the production of the wave train. Hence if one does not understand the transition, but only understands the “stationary states”, one understands nothing. For the emitting system is busy all the time in producing the trains of light waves, it has no time left to tarry in the cherished “stationary states”, except perhaps in the ground state.

Another disconcerting feature of the probability interpretation was and is that the wave function is deemed to change in two entirely distinct fashions; it is thought to be governed by the wave equation as long as no observer interferes with the system, but whenever an observer makes a measurement, it is deemed to change into an eigenfunction of that eigenvalue of the associated operator that he has measured. I know only of one timid attempt (J. von Neumann in his well known book) to put this “change by measurement” to the door of a perturbing operator introduced by the measurement, and thus to have it also controlled solely by the wave equation. But the idea was not pursued, partly because it seemed unnecessary to those who were prepared to swallow the orthodox tenet, partly because it could hardly be reconciled with it. For in many cases the alleged change involves an actio in distans, which would contradict a firmly established principle, if the change referred to a physical entity. The non-physical character of the wave function (which is sometimes said to embody merely our knowledge) is even more strongly emphasized by the fact that according to the orthodox view its change by measurement is dependent on the observer’s taking cognizance of the result. Moreover the change holds only for the observer who does. If you are present, but are not informed of the result, then for you even if you have the minutest knowledge both of the wave function before the measurement and of the appliances that were used, the changed wave function is irrelevant, not existing, as it were; for you there is, at best, a wave function referring to the measuring appliances plus the system under consideration, a wave function in which the one adopted by the knowing observer plays no distinguished role.

M. de Broglie, so I believe, disliked the probability interpretation of wave mechanics as much as I did. But very soon and for a long period one had to give up opposing it, and to accept it as an expedient interim solution. I shall point out some of the reasons why the originally contemplated alternative seemed deceptive and, after all, too naive. The points shall be numbered for later reference; the illustrating examples are representative of wide classes.

  • i) As long as a particle, an electron or proton etc., was still believed to be a permanent, individually identifiable entity, it could not adequately be pictured in our mind as a wave parcel. For as a rule, apart from artificially constructed and therefore irrelevant exceptions, no wave parcel can be indicated which does not eventually disperse into an ever increasing volume of space.
  • ii) The original wave-mechanical model of the hydrogen atom is not self-consistent. The electronic cloud effectively shields the nuclear charge towards outside, making up a neutral whole, but is inefficient inside; in computing its structure its own field that it will produce must not be taken into account, only the field of the nucleus.
  • iii) It seemed impossible to account for e.g. Planck’s radiation formula without assuming that a radiation oscillator (proper mode of the hohlraum) can only have energies nhν, with n an integer (or perhaps a half odd integer). Since this holds in all cases of thermodynamic equilibrium that do not follow the classical law of equipartition we are thrown back to the discrete energy states with abrupt transitions between them, and thus to the probability interpretation.
  • iv) Many non-equilibrium processes suggest even more strongly the “transfer of whole quanta”; the typical, often quoted example is the photoelectric effect, one of the pillars of Einstein’s hypothesis of light quanta in 1905.

All this was known 25 years ago, and abated the hopes of “naive” wave-mechanists. The now orthodox view about the wave function as “probability amplitude” was put forward and was worked out into a scheme of admirable logical consistency. Let us first review the situation after the state of knowledge we had then. The view suggested by (iii) and (iv), that radiation oscillators, electrons and similar constituents of observable systems always find themselves at one of their respective energy levels except when they change abruptly to another one handing the balance over to, or receiving it from, some other system, this view, so I maintain, is in glaring contradiction with the above mentioned scheme in spite of the admirable logical self-consistency of the latter. For one of the golden rules of this scheme is, that any observable is always found at one of its eigenvalues, when you measure it, but that you must not say that it has any value, if you do not measure it. To attribute sharp energy values to all those constituents, whose energies we could not even dream of measuring (except in a horrible nightmare), is not only gratuitous but strictly forbidden by this rule.

Now let us review the situation as it is today. Two new aspects have since arisen which I consider very relevant for reconsidering the interpretation. They are intimately connected. They have not turned up suddenly. Their roots lie far back, but their bearing was only very gradually recognized.

I mean first the recognition that the thing which has always been called a particle and, on the strength of habit, is still called by some such name is, whatever it may be, certainly not an individually identifiable entity. I have dwelt on this point at length elsewhere [“Endeavour”, Vol.IX, Number 35, July 1950; reprinted in the Smithsonian Institution Report for 1950, pp. 183, – 196; in German “Die Pyramide”, Jan. and Feb. 1951 (Austria)]. The second point is the paramount importance of what is sometimes called “second quantization”.

To begin with, if a particle is not a permanent entity, then of the four difficulties labelled above, (i) is removed. As regards (ii), the quantization of de Broglie’s waves around a nucleus welds into one comprehensive scheme all the 3n-dimensional representations that I had proposed for the n-body problems. It is not an easy scheme, but it is logically clear and it can be so framed that only the mutual Coulomb energies enter.

As regards (iii) – keeping to the example of black body radiation – the situation is this. If the radiation is quantized each radiation oscillator (proper mode) obtains the frequencies or levels nhν. This is sufficient to produce Planck’s formula for the radiation in a cavity surrounded by a huge heat bath. I mean to say, the level scheme suffices: it is not necessary to assume that each oscillator is at one of its levels, which is absurd from any point of view. The same holds for all thermodynamical equilibria. I have actually given a general proof of this in the last of my “Collected Papers” (English version: Blackie and Son, Glasgow 1928). A better presentation is added as an appendix to the forthcoming 2nd impression of “Statistical Thermodynamics” (Cambridge University Press).

Under (iv) we alluded to a vast range of phenomena purported to be conclusive evidence for the transfer of whole quanta. But I do not think they are, provided only that one holds on to the wave aspect throughout the whole process. One must, of course, give up thinking of e.g. an electron as of a tiny speck of something moving within the wave train along a mysterious unknowable path. One must regard the “observation of an electron” as an event that occurs within a train of de Broglie waves when a contraption is interposed in it which by its very nature cannot but answer by discrete responses: a photographic emulsion, a luminescent screen, a Geiger counter. And one must, to repeat this, hold on to the wave aspect throughout. This includes, that the equations between frequencies and frequency differences, expressing the resonance condition that governs wave mechanics throughout, must not be multiplied by Planck’s constant h and then interpreted as tiny energy balances of microscopic processes between tiny specks of something that have, to say the least, no permanent existence.

This situation calls for a revision of the current interpretation, which involves computing transition probabilities from level to level, and disregards the fact that the wave equation, with few exceptions if any, indicates nothing of the sort, but leads each of the reacting systems into a state composed of a wide spread of energy eigenstates. To assume that the system actually leaps into just one of them which is selected by “playing dice”, as it were, is not only gratuitous, but as was pointed out above, contradicts in most cases even the current interpretation. These inconsistencies will be avoided by returning to a wave theory that is not continually abrogated by dice-miracles; not of course to the naive wave theory of yore, but to a more sophisticated one, based on second quantization and the non-individuality of “particles”. Originating from contraptions that by their very nature cannot but give a discrete, discontinuous response, the probability aspect has unduly entered the fundamental concepts and has domineeringly dictated the basic structure of the present theory.

In giving it up we must no longer be afraid of losing time-honoured atomism. It has its counterpart in the level-scheme (of second quantization) and nowhere else. It may be trusted to give atomism its due, without being aided by dice-playing.

To point here to the general failure of the present theory to obtain finite transition probabilities and finite values of the apparent mass and charge, might seem a cheap argument and a dangerous one at that. The obvious retort would be: Can you do better, sir? Let me frankly avow that I cannot. Still I beg to plead that I am at the moment groping for my way almost single-handed, as against a host of clever people doing their best along the recognized lines of thought.

But let me still draw attention to a point that is seldom spoken of. I called the probability interpretation a scheme of admirable logical consistency. Indeed it gives us a set of minute prescriptions, not liable ever to be involved in contradiction, for computing the probability of a particular outcome of any intended measurement, given the wave function and the hermitian operator associated with that particular measuring device. But, of course, an abstract mathematical theory cannot possibly indicate the rules for this association between operators and measuring devices. To describe one of the latter is a long and circumstantial task for the experimentalist. Whether the device which he recommends really corresponds to the operator set up by the theorist, is not easy to decide. Yet this is of paramount importance. For a measuring appliance means now much more than it did before the advent of quantum mechanics and of its interpretation which I am opposing here. It has a physical influence on the object; it is deemed to press it infallibly into one of the eigenstates of the associated operator. If it fails to put it in an eigenstate belonging to the value resulting from the measurement, the  latter is quantum-mechanically not repeatable. I cannot help feeling that the precariousness of the said association makes that beautiful, logically consistent theoretical scheme rather void. At any rate its contact with actual laboratory work is very different from what one would expect from its fundamental enunciations.

A further discussion of the points raised in this paper can be found in a forthcoming longer (but equally non-mathematical) essay in the British Journal for the Philosophy of Science.

(Dublin Institute for Advanced Studies)

Blog Memory Hole Rescue and Lost Papers that Were Really Lost

There is more than one path to classical mechanics.

So much to do, so little time.  My own lost-paper work (i.e. the translation of some of Hilbert’s old papers that are not available in English) is commencing at a snail’s pace, but at Kingsley Jones’ blog we can learn about some papers that were truly lost and misplaced, and that he only recovered because, throughout all the moves and ups and downs of life, his parents had been hanging on to copies of the unpublished pre-prints.  Kingsley’s post affected me on a deeper level than the usual blog fare, because this is such a parent thing to do.  Having (young) kids myself, I know exactly the emotional tug to never throw away anything they produce, even if they have seemingly moved on and abandoned it.  On the other hand, his recollection of how he found these papers when going through his parents’ belongings after they passed away brings into sharp relief the fact that I have already begun this process for my father, who has Alzheimer’s.  So many of his things (such as his piano sheet music) are now just stark reminders of all the things he can no longer do.

On a more upbeat note: the content of these fortuitously recovered papers is quite remarkable.  They expand on a formalism that Steven Weinberg developed, one that essentially allows you to continuously deform quantum mechanics, making it ever less quantum.  In the limit, you end up with a wave equation that is equivalent to the Hamiltonian extremal principle, i.e. you recover classical mechanics and have a “Schrödinger equation” that always fully satisfies the Ehrenfest theorem. In this sense, this mechanism is another route to Hamiltonian mechanics. The anecdote of Weinberg’s reaction when he learned about this news is priceless.
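For readers who want at least a rough idea of the starting point, here is a minimal sketch of the kind of nonlinear framework Weinberg proposed, as I recall it from his 1989 work; the specific lambda-parametrized deformation used in Kingsley’s papers is something I am only gesturing at by analogy, not quoting:

$ \displaystyle i\hbar\,\frac{d\psi_{k}}{dt} = \frac{\partial h(\psi,\psi^{*})}{\partial \psi_{k}^{*}} $

Ordinary quantum mechanics corresponds to the bilinear choice $ h(\psi,\psi^{*}) = \sum_{kl} \psi_{k}^{*} H_{kl}\, \psi_{l} $; adding non-bilinear terms to h deforms the theory away from the standard linear Schrödinger equation while keeping a Hamiltonian-style structure.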

Ehrenfest’s theorem is, in a manner of speaking, supposed to be common sense mathematically formulated: the QM expectation values of a system should obey classical mechanics in the classical limit.  Within the normal QM framework this usually works, but the problem is that sometimes it does not, as every QM textbook will point out (e.g. these lecture notes).  Ironically, at the time of writing, the Wikipedia entry on the Ehrenfest theorem does not contain this key fact, which makes it kind of miss the point (just another example that one cannot blindly trust Wikipedia content). The above-linked lecture notes illustrate this with a simple cubic-potential example and make this observation:

“…. according to Ehrenfest’s theorem, the expectation values of position for this cubic potential will only agree with the classical behaviour insofar as the dispersion in position is negligible (for all time) in the chosen state.”

So, in a sense, this is what this “classical Schrödinger equation” accomplishes: a wave equation that always enforces this necessary constraint on the dispersion.  Another way to think about this is by invoking the analogy between Feynman’s path integral and the classical extremal principle.  Essentially, as the parameter lambda shrinks in Kingsley’s generalized Schrödinger equation, the paths are forced ever closer to the classically allowed extremal trajectory.
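To make the dispersion caveat concrete, here is the textbook statement of Ehrenfest’s theorem for a particle in a potential V (standard material, nothing specific to Kingsley’s papers):

$ \displaystyle \frac{d}{dt}\langle \hat{x} \rangle = \frac{\langle \hat{p} \rangle}{m}, \qquad \frac{d}{dt}\langle \hat{p} \rangle = -\big\langle V'(\hat{x}) \big\rangle $

The expectation values only trace out a classical trajectory if $ \langle V'(\hat{x}) \rangle \approx V'(\langle \hat{x} \rangle) $, which holds exactly for potentials of at most quadratic order but fails for, e.g., the cubic potential of the quoted lecture notes unless the dispersion in position remains negligible.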

A succinct summary of the key math behind these papers can currently be found on Wikipedia, but you had better hurry, as the article has been marked for deletion by editors following rather mechanistic notability criteria, i.e. simply counting how many times the underlying papers have been cited.

Unfortunately, the sheer number of citations is not a reliable measure with which to judge quality. A good example of this is the Quantum Discord research that is quite en vogue these days; it has recently been taken to task on R.R. Tucci’s blog. Ironically, amongst many other aspects, it seems to me that Kingsley’s approach may be rather promising for better understanding decoherence, and possibly even for putting some substance behind the Quantum Discord metric.

Explaining Quantum Computing – Blogroll Memory Hole Rescue

So what is quantum computing?

This is the most dreaded question for anybody involved with this field if posed by a friend or relative without a physics background.  When I am too tired or busy to make an honest effort, I usually answer by quoting Feynman’s quip on quantum mechanics (click here to listen to the man himself – the quote appears about 6:45 min into the lecture):

“A lot of people understand the theory of relativity in some way or other.  (..) On the other hand, I think I can safely say that nobody understands quantum mechanics.”

The crux of the matter is that Quantum Computing derives its power from precisely the same attributes that make this realm of physics so alien to us.

It’s small comfort that greater minds than mine have been mulling this conundrum. For instance, a while back Scott Aaronson described his struggle to write a column for the New York Times that describes Quantum Computing.

Then there is Michael Nielsen’s take on it, a brilliant write-up illustrating why there is really no reason to expect a simple explanation.

But if this hasn’t utterly discouraged you then I have this little treat from D-Wave’s blog. You need to be willing to tolerate a little math, understanding that an expression like this

$ \displaystyle \sum_{i} x_{i} $

means you are summing over a bunch of variables x indexed by i:

$ \displaystyle \sum_{i} x_{i} = x_{0}+x_{1}+x_{2}+ \dots $

Other than that it just requires you to contemplate Schroedinger’s light switches.  Just as his cat can be thought of as dead and alive at the same time, his light switches are in a superposition of On and Off.  Strictly speaking, D-Wave’s description is specific to their particular adiabatic quantum chip design, but nevertheless, if you get your head around this, you will have a pretty good idea why a Quantum Computer’s abilities go beyond the means of a classical Turing machine.
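For the mathematically curious: the kind of problem those superposed light switches end up solving on an adiabatic annealer can be written as an Ising energy function. This is the generic textbook form, not a verbatim transcription of D-Wave’s own notation:

$ \displaystyle E(s_{1},\dots,s_{N}) = \sum_{i} h_{i}\, s_{i} + \sum_{i<j} J_{ij}\, s_{i}\, s_{j}, \qquad s_{i} \in \{-1,+1\} $

Each light switch corresponds to a spin s_i, the coefficients h_i and J_ij encode the problem you want solved, and the machine tries to settle into the switch configuration of lowest energy.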

About Time – Blogroll Memory Hole Rescue

One of the most fascinating aspects of quantum information research is that it sheds light on the connections between informational and thermodynamic entropy, as well as how time factors into quantum dynamics.
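As a quick reminder of just how close those two notions of entropy are, here are the textbook definitions (nothing below is specific to the posts linked in this section):

$ \displaystyle H = -\sum_{i} p_{i} \log_{2} p_{i}, \qquad S = -k_{B} \sum_{i} p_{i} \ln p_{i} = (k_{B} \ln 2)\, H $

Shannon’s information entropy H and the Gibbs entropy S of the same probability distribution differ only by the conversion factor k_B ln 2, the very factor that reappears in Landauer’s bound further down this page.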

I.e. the Schroedinger equation and the Heisenberg picture are equivalent: in the former the wave function changes with time, in the latter the operators do. Yet we don’t actually have any experimental insight into when the changes under adiabatic development are actually realized, since by its very nature we only have discrete observations to work with. This opens up room for various speculations, such as that the “passage of time” is actually an unphysical notion for an isolated quantum system between measurements (i.e. as expressed by Ulrich Mohrhoff in this paper).
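For concreteness, the equivalence invoked above is just the standard textbook statement that the two pictures are related by a unitary change of vantage point (nothing here is specific to Mohrhoff’s paper; the operator form assumes no explicit time dependence):

$ \displaystyle i\hbar\,\frac{d}{dt}\,|\psi(t)\rangle = \hat{H}\,|\psi(t)\rangle \qquad\Longleftrightarrow\qquad \frac{d\hat{A}_{H}(t)}{dt} = \frac{i}{\hbar}\,[\hat{H},\hat{A}_{H}(t)], \quad \hat{A}_{H}(t)=\hat{U}^{\dagger}(t)\,\hat{A}\,\hat{U}(t) $

Since every expectation value comes out the same in both pictures, no experiment can tell them apart, which is exactly what gives the speculation about the “passage of time” between measurements its room to breathe.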

Lots of material there for future posts. But before going there, it’s a good idea to revisit the oldest paradox on time with this fresh take on it by Perry Hooker.


Information is Physical

Even when the headlines are not gut-wrenching

Information processing is seldom that physical.

One of the most astounding theoretical predictions of the late 20th century was Landauer’s discovery that erasing memory is linked to entropy, i.e. heat is produced whenever a bit is fully and irrevocably erased.  As far as theoretical work goes, this is even somewhat intuitively understandable: after all, increasing entropy essentially means moving to a less ordered state (technically, a macrostate that corresponds to a larger, less special ensemble of microstates). And what could possibly be more ordered than a computer memory register?
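Quantitatively, Landauer’s bound for irreversibly erasing a single bit in an environment at temperature T is (standard textbook result, quoted here from memory rather than from the paper behind the experiment below):

$ \displaystyle E_{\mathrm{min}} = k_{B}\, T \ln 2 \;\approx\; 3 \times 10^{-21}\,\mathrm{J} \quad \text{at room temperature} $

A minuscule amount per bit, which is why it took until recently to resolve it experimentally, yet strictly unavoidable for any logically irreversible operation.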

Recently this prediction has been confirmed by a very clever experiment.  Reason enough to celebrate this with another “blog memory-hole rescue”:

If you ever wondered what the term “adiabatic” means in conjunction with quantum computing, Perry Hooker provides the answer in this succinct explanation. His logic gate discussion shows why Landauer’s principle has implications far beyond memory chips, and in a sense undermines the entire foundation of classical information processing.
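To see the connection without clicking through, here is a small Python sketch of my own (not taken from Perry Hooker’s post) that illustrates why a conventional gate like AND necessarily erases information, while a reversible gate like the Toffoli gate does not:

from itertools import product

def and_gate(a, b):
    """Ordinary AND: two bits in, one bit out -- information is lost."""
    return a & b

def toffoli(a, b, c):
    """Toffoli (CCNOT): flips the target c only if both controls are 1."""
    return a, b, c ^ (a & b)

# AND is many-to-one: (0,0), (0,1) and (1,0) all map to 0.
and_map = {bits: and_gate(*bits) for bits in product((0, 1), repeat=2)}
print(and_map)  # 4 distinct inputs -> only 2 distinct outputs

# Toffoli is one-to-one: every 3-bit input maps to a unique 3-bit output,
# so no information is discarded along the way.
toffoli_map = {bits: toffoli(*bits) for bits in product((0, 1), repeat=3)}
assert len(set(toffoli_map.values())) == 8  # a permutation of all 8 states

Because AND maps four input states onto two outputs, running it forward discards one bit, and Landauer’s principle says that this discarded bit must eventually be paid for in heat; a circuit built entirely out of Toffoli-style bijections carries no such obligation.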

Truly required reading if you want to appreciate why quantum computing matters.

Rescued From the Blogroll Memory Hole

During the week my professional life leaves me no time to create original content.  Yet, there is a lot of excellent material out there pertinent to the nascent quantum information industry. So to fill the inter-week void I think it is very worthwhile to try to rescue recent blogroll posts from obscurity.

Very relevant to the surprise that Scott Aaronson came around on D-Wave is Robert Tucci’s great technical review of D-Wave’s recent Nature paper.  If you are not afraid of some math and are tired of the empty verbiage that passes for popular science journalism, then this is for you.