Tag Archives: Kingsley Jones

The Meaning of Wave Mechanics and the Mongol Physics Project

Mongols knew that a horse was either dead or alive, but never in a state of superposition between the twain.

Kingsley Jones, an Australian theoretical physicist turned entrepreneur, recently introduced what he dubs Mongol physics, a bold undertaking to “fix” QM and QED.

The name is aptly chosen, because if he succeeds in this, most of academic physics will be as taken by surprise as Europe was when the Golden Horde arrived. After all, mainstream physics doesn’t perceive these theories as defective, despite the enduring confusion as to which QM interpretation makes the most sense.

Kingsley dubs Erwin Schrödinger “Mongol #1”, and there is good reason for this. Having just received my copy of his collected papers, the first thing I came across was the little gem that I include below. The fact that it reads just as relevant 60 years later speaks volumes. The only thing that has changed since then is that clever ways were found to deal with the runaway infinities in QED, so that accurate numbers could be forced out of it. Schrödinger knew better than to hinge any of his arguments on these major technical challenges of the time. Rather, the article details his discomfort with the Copenhagen interpretation based on very fundamental considerations. It makes me wonder how he would feel about the fact that his cat in a box, which he made up to mock the status quo, entered popular culture as a supposedly valid illustration of quantum weirdness.

(Austrian copyright protection expires after 70 years, but since scans of the article are freely accessible at this University of Vienna site, I assume the text has already been placed in the public domain and is hence free for online reproduction. Please note this is not a translation: Schrödinger was fluent in several languages and originally penned this in English.)

THE MEANING OF WAVE MECHANICS
by Erwin Schrödinger
(For the July Colloquium, Dublin 1952)

Louis de Broglie’s great theoretical discovery of the wave phenomenon associated with the electron was followed within a few years, on the one hand by incontrovertible experimental evidence (based on interference patterns) of the reality of the de Broglie waves (Davisson and Germer, G. P. Thomson), and on the other hand by a vast generalization of his original ideas, which embraces the entire domain of physics and chemistry, and may be said to hold the field today along the whole line, albeit not precisely in the way de Broglie and his early followers had intended.

For it must have given to de Broglie the same shock and disappointment as it gave to me, when we learnt that a sort of transcendental, almost psychical interpretation of the wave phenomenon had been put forward, which was very soon hailed by the majority of leading theorists as the only one reconcilable with experiments, and which has now become the orthodox creed, accepted by almost everybody, with a few notable exceptions. Our disappointment consisted in the following. We had believed that the eigenfrequencies of the wave phenomenon, which were in exact numerical agreement with the, until then so called, energy levels, gave a rational understanding of the latter. We had confidence that the mysterious “fit and jerk theory” about the jump-like transition from one energy level to another was now ousted. Our wave equations could be expected to describe any changes of this kind as slow and actually describable processes. This hope was not informed by personal predilection for continuous description, but if anything, by the wish for any kind of description at all of these changes. It was a dire necessity. To produce a coherent train of light waves of 100 cm length and more, as is observed in fine spectral lines, takes a time comparable with the average interval between transitions. The transition must be coupled with the production of the wave train. Hence if one does not understand the transition, but only understands the “stationary states”, one understands nothing. For the emitting system is busy all the time in producing the trains of light waves, it has no time left to tarry in the cherished “stationary states”, except perhaps in the ground state.

Another disconcerting feature of the probability interpretation was and is that the wave function is deemed to change in two entirely distinct fashions; it is thought to be governed by the wave equation as long as no observer interferes with the system, but whenever an observer makes a measurement, it is deemed to change into an eigenfunction of that eigenvalue of the associated operator that he has measured. I know only of one timid attempt (J. von Neumann in his well known book) to put this “change by measurement” to the door of a perturbing operator introduced by the measurement, and thus to have it also controlled solely by the wave equation. But the idea was not pursued, partly because it seemed unnecessary to those who were prepared to swallow the orthodox tenet, partly because it could hardly be reconciled with it. For in many cases the alleged change involves an actio in distans, which would contradict a firmly established principle, if the change referred to a physical entity. The non-physical character of the wave function (which is sometimes said to embody merely our knowledge) is even more strongly emphasized by the fact that according to the orthodox view its change by measurement is dependent on the observer’s taking cognizance of the result. Moreover the change holds only for the observer who does. If you are present, but are not informed of the result, then for you even if you have the minutest knowledge both of the wave function before the measurement and of the appliances that were used, the changed wave function is irrelevant, not existing, as it were; for you there is, at best, a wave function referring to the measuring appliances plus the system under consideration, a wave function in which the one adopted by the knowing observer plays no distinguished role.

M. de Broglie, so I believe, disliked the probability interpretation of wave mechanics as much as I did. But very soon and for a long period one had to give up opposing it, and to accept it as an expedient interim solution. I shall point out some of the reasons why the originally contemplated alternative seemed deceptive and, after all, too naive. The points shall be numbered for later reference; the illustrating examples are representative of wide classes.

  • i) As long as a particle, an electron or proton etc., was still believed to be a permanent, individually identifiable entity, it could not adequately be pictured in our mind as a wave parcel. For as a rule, apart from artificially constructed and therefore irrelevant exceptions, no wave parcel can be indicated which does not eventually disperse into an ever increasing volume of space.
  • ii) The original wave-mechanical model of the hydrogen atom is not self-consistent. The electronic cloud effectively shields the nuclear charge towards outside, making up a neutral whole, but is inefficient inside; in computing its structure its own field that it will produce must not be taken into account, only the field of the nucleus.
  • iii) It seemed impossible to account for e.g. Planck’s radiation formula without assuming that a radiation oscillator (proper mode of the hohlraum) can only have energies nhν, with n an integer (or perhaps a half odd integer). Since this holds in all cases of thermodynamic equilibrium that do not follow the classical law of equipartition we are thrown back to the discrete energy states with abrupt transitions between them, and thus to the probability interpretation.
  • iv) Many non-equilibrium processes suggest even more strongly the “transfer of whole quanta”; the typical, often quoted example is the photoelectric effect, one of the pillars of Einstein’s hypothesis of light quanta in 1905.

All this was known 25 years ago, and abated the hopes of “naive” wave-mechanists. The now orthodox view about the wave function as “probability amplitude” was put forward and was worked out into a scheme of admirable logical consistency. Let us first review the situation after the state of knowledge we had then. The view suggested by (iii) and (iv), that radiation oscillators, electrons and similar constituents of observable systems always find themselves at one of their respective energy levels except when they change abruptly to another one handing the balance over to, or receiving it from, some other system, this view, so I maintain, is in glaring contradiction with the above mentioned scheme in spite of the admirable logical self-consistency of the latter. For one of the golden rules of this scheme is, that any observable is always found at one of its eigenvalues, when you measure it, but that you must not say that it has any value, if you do not measure it. To attribute sharp energy values to all those constituents, whose energies we could not even dream of measuring (except in a horrible nightmare), is not only gratuitous but strictly forbidden by this rule.

Now let us review the situation as it is today. Two new aspects have since arisen which I consider very relevant for reconsidering the interpretation. They are intimately connected. They have not turned up suddenly. Their roots lie far back, but their bearing was only very gradually recognized.

I mean first the recognition that the thing which has always been called a particle and, on the strength of habit, is still called by some such name is, whatever it may be, certainly not an individually identifiable entity. I have dwelt on this point at length elsewhere [“Endeavour”, Vol. IX, Number 35, July 1950; reprinted in the Smithsonian Institution Report for 1950, pp. 183–196; in German “Die Pyramide”, Jan. and Feb. 1951 (Austria)]. The second point is the paramount importance of what is sometimes called “second quantization”.

To begin with, if a particle is not a permanent entity, then of the four difficulties labelled above, (i) is removed. As regards (ii), the quantization of de Broglie’s waves around a nucleus welds into one comprehensive scheme all the 3n-dimensional representations that I had proposed for the n-body problems. It is not an easy scheme, but it is logically clear and it can be so framed that only the mutual Coulomb energies enter.

As regards (iii) – keeping to the example of black body radiation – the situation is this. If the radiation is quantized each radiation oscillator (proper mode) obtains the frequencies or levels nhν. This is sufficient to produce Planck’s formula for the radiation in a cavity surrounded by a huge heat bath. I mean to say, the level scheme suffices: it is not necessary to assume that each oscillator is at one of its levels, which is absurd from any point of view. The same holds for all thermodynamical equilibria. I have actually given a general proof of this in the last of my “Collected Papers” (English version: Blackie and Son, Glasgow 1928). A better presentation is added as an appendix to the forthcoming 2nd impression of “Statistical Thermodynamics” (Cambridge University Press).

Under (iv) we alluded to a vast range of phenomena purported to be conclusive evidence for the transfer of whole quanta. But I do not think they are, provided only that one holds on to the wave aspect throughout the whole process. One must, of course, give up thinking of e.g. an electron as of a tiny speck of something moving within the wave train along a mysterious unknowable path. One must regard the “observation of an electron” as an event that occurs within a train of de Broglie waves when a contraption is interposed in it which by its very nature cannot but answer by discrete responses: a photographic emulsion, a luminescent screen, a Geiger counter. And one must, to repeat this, hold on to the wave aspect throughout. This includes, that the equations between frequencies and frequency differences, expressing the resonance condition that governs wave mechanics throughout, must not be multiplied by Planck’s constant h and then interpreted as tiny energy balances of microscopic processes between tiny specks of something that have, to say the least, no permanent existence.

This situation calls for a revision of the current interpretation, which involves computing transition probabilities from level to level, and disregards the fact that the wave equation, with few exceptions if any, indicates nothing of the sort, but leads each of the reacting systems into a state composed of a wide spread of energy eigenstates. To assume that the system actually leaps into just one of them which is selected by “playing dice”, as it were, is not only gratuitous, but as was pointed out above, contradicts in most cases even the current interpretation. These inconsistencies will be avoided by returning to a wave theory that is not continually abrogated by dice-miracles; not of course to the naive wave theory of yore, but to a more sophisticated one, based on second quantization and the non-individuality of “particles”. Originating from contraptions that by their very nature cannot but give a discrete, discontinuous response, the probability aspect has unduly entered the fundamental concepts and has domineeringly dictated the basic structure of the present theory.

In giving it up we must no longer be afraid of losing time-honoured atomism. It has its counterpart in the level-scheme (of second quantization) and nowhere else. It may be trusted to give atomism its due, without being aided by dice-playing.

To point here to the general failure of the present theory to obtain finite transition probabilities and finite values of the apparent mass and charge, might seem a cheap argument and a dangerous one at that. The obvious retort would be: Can you do better, sir? Let me frankly avow that I cannot. Still I beg to plead that I am at the moment groping for my way almost single-handed, as against a host of clever people doing their best along the recognized lines of thought.

But let me still draw attention to a point that is seldom spoken of. I called the probability interpretation a scheme of admirable logical consistency. Indeed it gives us a set of minute prescriptions, not liable ever to be involved in contradiction, for computing the probability of a particular outcome of any intended measurement, given the wave function and the hermitian operator associated with that particular measuring device. But, of course, an abstract mathematical theory cannot possibly indicate the rules for this association between operators and measuring devices. To describe one of the latter is a long and circumstantial task for the experimentalist. Whether the device which he recommends really corresponds to the operator set up by the theorist, is not easy to decide. Yet this is of paramount importance. For a measuring appliance means now much more than it did before the advent of quantum mechanics and of its interpretation which I am opposing here. It has a physical influence on the object; it is deemed to press it infallibly into one of the eigenstates of the associated operator. If it fails to put it in an eigenstate belonging to the value resulting from the measurement, the  latter is quantum-mechanically not repeatable. I cannot help feeling that the precariousness of the said association makes that beautiful, logically consistent theoretical scheme rather void. At any rate its contact with actual laboratory work is very different from what one would expect from its fundamental enunciations.

A further discussion of the points raised in this paper can be found in a forthcoming longer (but equally non-mathematical) essay in the British Journal for the Philosophy of Science.

(Dublin Institute for Advanced Studies)

Blog Hole Memory Rescue and Lost Papers that were Really Lost

There is more than one path to classical mechanics.

So much to do, so little time. My own lost-paper work (i.e. the translation of some of Hilbert’s old papers that are not available in English) is proceeding at a snail’s pace, but at Kingsley Jones’ blog we can learn about some papers that were truly lost and misplaced, and that he only recovered because, throughout all the moves and ups and downs of life, his parents had been hanging on to copies of the unpublished pre-prints. Kingsley’s post affected me on a deeper level than the usual blog fare, because this is such a parent thing to do. Having (young) kids myself, I know exactly the emotional tug to never throw away anything they produce, even if they have seemingly moved on and abandoned it. On the other hand, the recollection of how he found these papers when going through his parents’ belongings after they passed away brings into sharp relief the fact that I have already begun this process for my father, who has Alzheimer’s. So many of his things (such as his piano sheet music) are now just stark reminders of all the things he can no longer do.

On a more upbeat note: the content of these fortuitously recovered papers is quite remarkable. They expand on a formalism that Steven Weinberg developed, one that essentially allows you to continuously deform quantum mechanics, making it ever less quantum. In the limit, you end up with a wave equation that is equivalent to the Hamiltonian extremal principle, i.e. you recover classical mechanics and have a “Schrödinger equation” that always fully satisfies the Ehrenfest theorem. In this sense, this mechanism is another route to Hamiltonian mechanics. The anecdote of Weinberg’s reaction when he learned about this news is priceless.

Ehrenfest’s theorem is, in a manner of speaking, common sense mathematically formulated: the QM expectation values of a system should obey classical mechanics in the classical limit. Within the normal QM framework this usually works, but the problem is that sometimes it does not, as every QM textbook will point out (e.g. these lecture notes). Ironically, at the time of writing, the Wikipedia entry on the Ehrenfest theorem does not contain this key fact, which makes it rather miss the point (just another example that one cannot blindly trust Wikipedia content). The above linked lecture notes illustrate this with a simple example (a cubic potential) and make this observation:

“…. according to Ehrenfest’s theorem, the expectation values of position for this cubic potential will only agree with the classical behaviour insofar as the dispersion in position is negligible (for all time) in the chosen state.”
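To make the issue concrete, here is the standard statement of the theorem (my own summary, not taken from the lecture notes), written for a single particle in a potential $V(x)$:

\[
\frac{d}{dt}\langle x \rangle \;=\; \frac{\langle p \rangle}{m},
\qquad
\frac{d}{dt}\langle p \rangle \;=\; -\,\langle V'(x) \rangle .
\]

The catch is that $\langle V'(x)\rangle$ is in general not the same thing as $V'(\langle x\rangle)$. For a cubic potential $V(x)=\alpha x^{3}$ one has $\langle V'(x)\rangle = 3\alpha\langle x^{2}\rangle = 3\alpha\left(\langle x\rangle^{2} + \sigma_x^{2}\right)$, so the expectation values only follow the classical equation of motion as long as the dispersion $\sigma_x^{2}$ stays negligible, which is exactly the caveat the lecture notes point out.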

So in a sense, this is what this “classical Schrödinger equation” accomplishes: a wave equation that always produces this necessary constraint on the dispersion. Another way to think about this is by invoking the analogy between Feynman’s path integral and the classical extremal principle. Essentially, as the parameter lambda shrinks in Kingsley’s generalized Schrödinger equation, the paths are forced ever closer to the classically allowed extremal trajectory.
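To sketch what such a lambda-deformation can look like (this is only an illustration of the general idea, not necessarily the exact form used in Kingsley’s papers): write the wave function in polar form $\psi = \sqrt{\rho}\,e^{iS/\hbar}$ and consider the nonlinear wave equation

\[
i\hbar\,\partial_t \psi
\;=\; -\frac{\hbar^{2}}{2m}\nabla^{2}\psi \;+\; V\psi
\;+\; (1-\lambda)\,\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}}\,\psi .
\]

Separating real and imaginary parts gives the usual continuity equation for $\rho$ together with

\[
\partial_t S + \frac{(\nabla S)^{2}}{2m} + V + \lambda\,Q = 0,
\qquad
Q = -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}\sqrt{\rho}}{\sqrt{\rho}} .
\]

For $\lambda = 1$ the extra term vanishes and we are back to ordinary quantum mechanics; as $\lambda \to 0$ the quantum potential $Q$ is switched off and what remains is the classical Hamilton-Jacobi equation, i.e. an ensemble of particles confined to the classically allowed extremal trajectories.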

A succinct summary of the key math behind these papers can currently be found on Wikipedia, but you had better hurry, as the article has been marked for deletion by editors following rather mechanistic notability criteria, simply counting how many times the underlying papers were cited.

Unfortunately, the sheer number of citations is not a reliable measure with which to judge quality. A good example of this is the Quantum Discord research that is quite en vogue these days; it has recently been taken to task on R.R. Tucci’s blog. Ironically, it seems to me that, among other things, Kingsley’s approach may be rather promising for better understanding decoherence, and possibly even for putting some substance behind the Quantum Discord metric.

The Unbearable Lightness of Quantum Mechanics

Updated below.

Gravity and Quantum Mechanics don’t play nice together. Since Einstein’s time, we have had two towering theories that have defied all attempts by some very smart people to reconcile them. The Standard Model, built on the foundations of quantum mechanics, has been spectacularly successful. It allows the treatment of masses acquired from binding energies and, if the Higgs boson confirmation pans out, accounts for the elementary rest masses – but it does not capture gravity. (The current mass-generation models that involve gravity are all rather speculative at this point.)

Einstein’s General Relativity has been equally successful in explaining gravity as an innate geometric attribute of space and time itself. It has survived every conceivable test and made spectacular predictions (such as gravitational lensing).

On the surface, this dysfunctional non-relationship between the two major generally accepted theoretical frameworks seems very puzzling. But it turns out that the nature of this conundrum can be described without recourse to higher math (or Star Trek-like animations with a mythical soundtrack).

Much has been written about the origin of this schism: the historic struggle over the interpretation of Quantum Mechanics, with Einstein and Bohr as the figureheads of the divided physics community at the time. Mendel Sachs (who, sadly, passed away recently) drew the following distinction between the philosophies of the two factions:

[The Copenhagen Interpretation views] Schroedinger’s matter waves as [complex] waves of probability. The probability was then tied to quantum mechanics as a theory of measurement – made by macro observers on micro-matter. This view was then in line with the positivistic philosophy, whereby the elements of matter are defined subjectively in terms of the measurements of their properties, expressed with a probability calculus. […] Einstein’s idea [was] that the formal expression of the probability calculus that is called quantum mechanics is an incomplete theory of matter that originates in a complete [unified] continuous field theory of matter wherein all of the variables of matter are ‘predetermined’.

(From Quantum Mechanics and Gravity)

These days, the Copenhagen Interpretation no longer reigns supreme, but has some serious competition: one crazy new kid on the block, for example, is the Many-Worlds Interpretation. (For an insightful take on MWI I highly recommend this recent blog post from Scott Aaronson.)

But the issue goes deeper than that. No matter what interpretation you favor, one fact remains immutable: quantum states always superpose additively; mathematically, the theory behaves in a linear fashion. This, despite its interpretative oddities, makes Quantum Mechanics fairly easy to work with. General relativity, on the other hand, is an intrinsically non-linear theory. It describes a closed system in which the field, generated by gravitating masses, propagates with finite speed and, in the general non-equilibrium picture, dynamically affects these masses, in turn rearranging the overall field configuration. (Little wonder Einstein’s field equations only yield analytical solutions for drastically simplified scenarios.)
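To put the contrast in the simplest terms (schematically, glossing over all the fine print): the Schrödinger equation is linear in the state, whereas Einstein’s field equations are non-linear in the metric. If $\psi_1$ and $\psi_2$ are solutions, so is any superposition:

\[
i\hbar\,\partial_t\!\left(\alpha\psi_1 + \beta\psi_2\right)
= \hat H\!\left(\alpha\psi_1 + \beta\psi_2\right)
= \alpha\,\hat H\psi_1 + \beta\,\hat H\psi_2 .
\]

By contrast, the Einstein tensor in

\[
G_{\mu\nu}[g] = \frac{8\pi G}{c^{4}}\,T_{\mu\nu}
\]

depends non-linearly on the metric $g_{\mu\nu}$, so the sum of two solutions is in general not a solution.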

There is no obvious way to fit Quantum Mechanics, this linear peg, into this decidedly non-linear hole.

Einstein considered Quantum Mechanics a theory that would prove to be an approximation of a fully unified field theory. He spent his last years chasing after this goal, but never achieved it. Mendel Sachs claims to have succeeded where he failed, and indeed presents some impressive accomplishments, including a way to derive the quantum mechanical structure from extended General Relativity field equations. What always struck me as odd is how little resonance this generated, although this clearly seems to be an experience shared by other theoretical physicists who work off the beaten path. For instance, Kingsley Jones approaches this same conundrum from a completely different angle in his original paper on Newtonian Quantum Gravity. Yet the citation statistics show that there was little uptake.

One could probably dedicate an entire blog to speculating on why this kind of research does not break into the mainstream, but I would rather end this with the optimistic notion that, in the end, new experimental data will hopefully rectify the situation. Although the experiment on a neutral-particle Bose-Einstein condensate proposed in Kingsley Jones’ paper has little chance of being performed unless it garners some more attention, other experiments probing the domain where gravity and quantum mechanics intersect get a loftier treatment: for instance, this paper was featured in Nature although its premise is probably incorrect. (Sabine Hossenfelder took Nature and the authors to task on her blog – things get a bit acrimonious in the comment section.)

Nevertheless, it is encouraging to see such high-profile interest in these kinds of experiments; chances are we will get it right eventually.

Update

Kingsley Jones (whose 1995 paper I referenced above) has a new blog entry that reflects on the historical trajectory and current state of quantum mechanics. I think it’s fair to say that he does not subscribe to the Many-Worlds Interpretation.