All posts by Henning Dekant

Time Crystal – A New Take on Perpetual Motion

Update: Here’s the link to Wilczek’s time crystal paper

Not a time crystal but perpetually moving at room temperature. (Illustration of Nitrogen-inversion).

It is a given that at room temperature there is plenty of chaotic and truly perpetual motion to be had.  And sometimes this motion takes on more organized forms, as is the case with nitrogen inversion.

Also it is well established that unexpected movements can occur close to absolute zero, when for instance superfluid liquids climb up the walls of their containment.

In general, unperturbed quantum systems evolve in a unitary manner (i.e. a kind of movement), and will do so perpetually until measured.

In the case of super-sized Rydberg atoms you can also approach an almost classical orbit (and that should hold at very low temperatures as well).  But sustained, detectable perpetual motion in the ground state of a system at absolute zero would be a new quality.

That is what “Time Crystals” might be adding to the quantum cabinet of oddities.  The idea that led to this theoretical prediction, formulated by Frank Wilczek, is indeed quite clever:

“I was thinking about the classification of crystals, and then it just occurred to me that it’s natural to think about space and time together, (…) So if you think about crystals in space, it’s very natural also to think about the classification of crystalline behavior in time.”

It’ll be up to some creative experimentalist to determine if the resulting theory holds water.  If so, this may open up an interesting new avenue to tackle the frustrating problem of getting General Relativity (where space and time are a combined entity) and QM to play together.

So You Want to Learn About Quantum Computing?

“Students will learn by inhabiting an alternate history where Alan Turing and Richard Feynman meet during World War II and must invent quantum computers to defeat Nazi Germany. As a final project, they will get to program a D-Wave One machine and interpret its results.”

If you are based in Seattle then you want to keep an eye out for when Paul Pham next teaches the Quantum Computing for Beginners course that follows the exciting narrative outlined above.

For everybody else, there is edX’s CS191x Quantum Mechanics and Quantum Computation course.  I very much hope this course will be a regular offering.  Although it lacks the unique dramatic arc of P. Pham’s story line, this course is nevertheless thoroughly enjoyable.

When I signed up for this course, I didn’t know what to expect.  Mostly, I decided to check it out because I was curious to see how the subject would be taught, and because I wanted to experience how well a web-based platform could support academic teaching.

This course fell during an extremely busy time, not only because of a large professional workload, but also because the deteriorating health of my father required me to fly twice from Toronto to my parents in Germany.  Despite this, the time required for this course proved to be entirely manageable.  If you have an advanced degree in math, physics or engineering, and want to learn about quantum computing, you shouldn’t shy away from taking this course as long as you have an hour to spare each week.  It helps that you can accelerate the video lectures to 1.5× normal speed (although this made Prof. Umesh Vazirani sound a bit like he had inhaled helium).

Prof. Vazirani is a very competent presenter, and you can tell that a lot of thought went into how to approach the subject, i.e. how to ease into the strangeness of quantum mechanics for those who are new to it. I was suspicious of the claim made at the outset that the required mathematics would be introduced and developed as needed during the course, but it seems to me that this was executed quite well. (Having already been familiar with the required math, I don’t really know if it’ll work for somebody completely new to it, but it seems that the only real prerequisite was a familiarity with linear algebra.)

It is interesting to see discussions posted by individuals who took the course and were apparently subjected to QM for the first time.  One such thread started this way:

“I got 100. It was really a fun. Did I understand anything? I would say I understood nothing.”

To me this illuminates the fact that you simply cannot avoid discussing the interpretation of quantum mechanics.  Obviously this subject is still very contentious, and Prof. Vazirani touched on it when discussing the Bell inequalities in a very concise and easy to understand manner.  Yet, judging from the confusion of these ‘straight A’ students, I think there needs to be more of it.  It is not enough to assert that Einstein probably would have reconsidered his stance had he known about these results.  Yes, he would have given up on a conventional local hidden variable approach, but I am quite certain his preference would then have shifted to finding a topological non-local field theory.
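Since the Bell inequalities carry the whole argument here, a minimal numerical sketch may help (my own illustration, not course material; the angles are the standard CHSH-optimal settings for a spin singlet):

```python
import math

def E(a, b):
    """Singlet-state correlation of spin measurements along angles a and b."""
    return -math.cos(a - b)

# Any local hidden variable theory must satisfy |S| <= 2 (the CHSH bound).
a1, a2 = 0.0, math.pi / 2              # Alice's two measurement angles
b1, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement angles

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828: quantum mechanics violates the bound
```

The violation of the classical bound of 2 is exactly the kind of experimental discriminator I am arguing the interpretation debate still lacks.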

Of course, there is only so much that can be covered given the course’s duration. Other aspects were missing: quantum error correction, topological and adiabatic quantum computing, and especially quantum annealing.  The latter was probably the most glaring omission, since it is the only technology in this space that is already commercially available.

Generally, I found that everything that was covered, was covered very well.  For instance, if you ever wondered how exactly Grover’s and Shor’s algorithms work, you will have learned this after taking the course. I found the homework assignments to be wonderful brain teasers that helped me take my mind off more worrisome issues at hand; I think I will miss them. They consisted of well thought out exercises, and as with any science course, it is really the exercises that help you understand and learn the material.
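For readers curious what such an algorithm looks like in practice, here is a toy state-vector simulation of Grover search over eight items (my own sketch, not code from the course):

```python
import math

def grover(n_qubits, marked):
    """Simulate Grover search, returning the probability of measuring
    the marked index after the optimal number of iterations."""
    n = 2 ** n_qubits
    amp = [1 / math.sqrt(n)] * n               # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amp[marked] = -amp[marked]             # oracle: flip the marked amplitude
        mean = sum(amp) / n                    # diffusion: reflect about the mean
        amp = [2 * m for m in [mean] * n]      # 2*mean ...
        amp = [a - b for a, b in zip(amp, amp)]  # (placeholder, replaced below)
    return amp[marked] ** 2
```

Apologies, the diffusion step deserves to be written cleanly; the working version is:

```python
import math

def grover(n_qubits, marked):
    """Simulate Grover search, returning the probability of measuring
    the marked index after the optimal number of iterations."""
    n = 2 ** n_qubits
    amp = [1 / math.sqrt(n)] * n               # uniform superposition
    iterations = int(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amp[marked] = -amp[marked]             # oracle: flip the marked amplitude
        mean = sum(amp) / n
        amp = [2 * mean - a for a in amp]      # diffusion: reflect about the mean
    return amp[marked] ** 2

print(grover(3, marked=5))  # ~0.945 after just 2 iterations,
                            # vs. 2/8 for two classical random queries
</imports>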

On the other hand, the assignments and exams also highlighted the strengths and weaknesses of the technology underlying the courseware.  Generally, entering formulas worked fine, but sometimes the solver acted up, and it wasn’t always entirely clear why (e.g. how many digits were required for a numerical answer, or certain algebraically equivalent terms were not recognized properly).  While this presented the occasional obstacle, on the upside you get the immediate gratification of instant feedback, and a very nice progress tracker that allows you to see exactly how you are doing. The following is a screenshot of my final tally. The final fell during a week in which I was especially hard pressed for time, and so I slacked off, just guesstimating the last couple of answers (with mixed results).  In comparison to a conventional class, knowing exactly when you have already achieved a passing score via the tracking graph makes this a risk- and stress-free strategy.

(Screenshot: final score tally)

A common criticism of online learning in comparison to the established ways of doing things is the missing classroom experience and interaction with the professor and teaching staff.  To counter this, discussion boards were linked to all assignments, and discussion of the taught material was encouraged.  Unfortunately, since my time was at a premium, I couldn’t participate as much as I would have liked, but I was positively surprised by how responsively the teaching assistants answered questions that were put to them (even over the weekends).

This is all the more impressive given the number of students that were enrolled in this course:

The geographic reach was no less impressive:

Having been sceptical going into this, I’ve since become a convert.  Just as Khan Academy is revolutionizing K-12 education, edX and similar platforms like Coursera represent the future of academic teaching.


Stretching Quantum Computing Credulity

Update: Corrected text (h/t Geordie)

My interest in D-Wave prompted me to start this blog, and it is no secret that I expect the company to deliver products that will have a significant impact on the IT market. Yet, to this day, I encounter the occasional low-information poster in various online forums who dismisses the company, smugly asserting that they are fraudulent and only milk their investors.

It would be one thing if you’d only encounter some hold-outs on Scott Aaronson’s blog, which originally emerged as D-Wave’s most prominent arch-nemesis before he moderated his stance.  But it’s actually an international phenomenon, as I just came across such a specimen in a German IT forum.

To understand where these individuals are coming from, it is important to consider how people usually go about identifying a “high tech” investment scam.  The following list makes no claim to be complete, but is a good example of the hierarchy of filters for forming a quick judgment (h/t John Milstone):

  1. Claims of discovering some new physics that has been overlooked by the entire scientific world for centuries. (For each example of this actually happening, there are hundreds or thousands of con men using this line).
  2. Eagerness to produce “demos” of the device, but refusal to allow any independent testing. In particular, any refusal to even do the demos anywhere other than his own facilities is a clear warning sign (indicating that the facilities are somehow “rigged”).
  3. Demos that only work when the audience doesn’t contain competent skeptics.
  4. Demos that never demonstrate the real claims of the “inventor”.
  5. Lying about business relationships in order to “borrow” credibility from those other organizations.
  6. Failing to deliver on promises.
  7. Continually announcing “improvements” without ever delivering on the previous promises. This keeps the suckers pacified, even though the con man is never actually delivering.

One fateful day, when D-Wave gave an initial presentation to an IT audience, they inadvertently set a chain in motion that triggered several of these criteria in the minds of a skeptical audience.

Of course D-Wave never claimed new physics, but it irritated theoretical computer scientists when claiming that its computer can take on a Sudoku puzzle (generalized Sudoku is known to be NP hard) (#1). [Ed. Changed wording to make clear that D-Wave didn’t explicitly claim to efficiently solve NP-hard Sudoku.]

At the time, D-Wave was still not ready to allow independent testing (#2), and the audience did not contain theoretical computer scientists who would have scrutinized the company’s claims (#3).

Subsequently, critics questioned how much the quantum computing chip was actually engaged in solving the demonstrated Sudoku puzzle, since a normal computer was also in the mix.  Scott Aaronson also pointed out that there was no way of knowing if actual entanglement was happening on the chip, and as such the demo wasn’t proving D-Wave’s central claim (#4).

To my knowledge, D-Wave never misrepresented any business relationships, but touting their relationship with Google may have inadvertently triggered criterion #5 in some people’s minds.

Although D-Wave has been rapidly increasing their chip’s integration density, and are now shipping a product that I expect to outperform conventional hardware, they didn’t deliver as quickly as initially anticipated (#6).

Criterion #7 held until they shipped the D-Wave One to Lockheed, and this marked the turning point after which the pattern rapidly unraveled.  Only people who haven’t paid attention could still hold on to the “investment fraud” canard:

  • D-Wave published internals of their machine in Nature and co-authored several papers that utilize their machine for research as diverse as Ramsey number calculations and protein folding.
  • Independent testers are now able to test the machine.  I can verify that the one tester I am corresponding with is a top notch academic from one of the best engineering and science faculties this world has to offer.  He is also fiercely independent, believing that he can outperform the D-Wave machine with hand-optimized code on a conventional chip.
  • The central claim that their chip is a true quantum chip leveraging massive qubit entanglement has been proven.

It’s time for the IT audience to come to terms with this.

Quantum computing has arrived.  It’s real. Better get used to it.


If a Fighter Writes a Paper to go for the Kill …

You don’t want to take on this man in the ring:

And you don’t want to take on his namesake in the scientific realm.

In my last post I wrote about the Kish cipher protocol, and was wondering about its potential to supplant quantum cryptography.

The very same day, as if custom ordered, this fighter’s namesake, none other than Charles Bennett himself, published this pre-print paper (h/t Alessandro F.).

It is not kind to the Kish cipher protocol, and that’s putting it mildly.  To quote from the abstract (emphasis mine):

We point out that arguments for the security of Kish’s noise-based cryptographic protocol have relied on an unphysical no-wave limit, which if taken seriously would prevent any correlation from developing between the users. We introduce a noiseless version of the protocol, also having illusory security in the no-wave limit, to show that noise and thermodynamics play no essential role. Then we prove generally that classical electromagnetic protocols cannot establish a secret key between two parties separated by a spacetime region perfectly monitored by an eavesdropper. We note that the original protocol of Kish is vulnerable to passive time-correlation attacks even in the quasi-static limit.

Ouch.

The ref’s counting …

Quantum Cryptography Made Obsolete?

The background story.

Electrical engineering is often overshadowed by other STEM fields. Computer science is cooler, and physics has the aura of the Faustian quest for the most fundamental truths science can uncover.  Yet, this discipline produced a quite remarkable bit of research with profound implications for Quantum Information Science.  It is not very well publicized. Maybe that is because it’s a bit embarrassing to the physicists and computer scientists who are heavily invested in quantum cryptography?

After all, the typical, one-two punch elevator-pitch for QIS is entirely undermined by it. To recap, the argument goes like this:

  1. Universal Quantum Computing will destroy all effective cryptography as we know it.
  2. Fear not, for Quantum Cryptography will come to your rescue.

Significant funds went into the latter.  And it’s not like there isn’t some significant progress, but what if all this effort proved futile because an equally strong encryption could be had with far more robust methods?  This is exactly what the Kish cipher protocol promises. It has been around for several years, and in a recent paper, Laszlo Bela Kish discusses several variations of his protocol, which he modestly calls the Kirchhoff-Law-Johnson-(like)-Noise (KLJN) secure key exchange – although otherwise it goes by his name in the literature. A 2012 paper that describes the principle behind it can be found here.  The abstract of the latter makes no qualms about the challenge to Quantum Information Science:

It has been shown recently that the use of two pairs of resistors with enhanced Johnson-noise and a Kirchhoff-loop—i.e., a Kirchhoff-Law-Johnson-Noise (KLJN) protocol—for secure key distribution leads to information theoretic security levels superior to those of a quantum key distribution, including a natural immunity against a man-in-the-middle attack. This issue is becoming particularly timely because of the recent full cracks of practical quantum communicators, as shown in numerous peer-reviewed publications.

There are some commonalities between quantum cryptography and this alternative, inherently safe, protocol.  The obvious one is that they are both key exchange schemes; the more interesting one is that they both leverage fundamental physical properties of the systems they employ.  In one case, it is the specific quantum correlations of entangled qubits; in the other, the correlations in classical thermodynamic noise (i.e. the former picks out the specific quantum entanglement correlations of the system’s density matrix, the latter only requires the classical entries that remain after decoherence and tracing of the density matrix).

Since this protocol works in the classical regime, it shouldn’t come as a surprise that it is much easier to implement than a scheme that has to create and preserve an entangled state. The following schematic illustrates the underlying principle:

Core of the KLJN secure key exchange system. Alice encodes her message by connecting her two resistors to the wire in the required sequence. Bob, on the other hand, connects his resistors to the wire at random.

The recipient (Bob) connects his resistors to the wire at random, in predefined synchronicity with the sender (Alice).  The actual current and voltage through the wire is random, ideally Johnson noise. The resistors determine the characteristics of this voltage; Bob can determine which resistor Alice used because he knows which one he connected, but the fluctuation-dissipation theorem ensures that wire-tapping by an attacker (Eve) is futile.  The noise characteristics of the signal ensure that no information can be extracted from it.
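The exchange can be illustrated with a toy simulation (an idealized, lossless model; the resistor values and the noise-level units are arbitrary assumptions of mine, not from Kish’s papers):

```python
import random

R_LOW, R_HIGH = 1e3, 1e4   # the two resistor values (arbitrary for this toy)

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

def noise_level(ra, rb):
    # The Johnson-noise spectral density on the wire scales with 4kT times the
    # parallel resistance; the constant factor is irrelevant here, so we drop it.
    return parallel(ra, rb)

random.seed(42)
mixed_level = noise_level(R_LOW, R_HIGH)   # (LOW,HIGH) and (HIGH,LOW) coincide
key_alice, key_bob = [], []
for _ in range(1000):
    a, b = random.randrange(2), random.randrange(2)  # private resistor choices
    level = noise_level((R_LOW, R_HIGH)[a], (R_LOW, R_HIGH)[b])
    if abs(level - mixed_level) < 1e-9:
        # Eve measures the identical level in both mixed cases and learns
        # nothing; Bob, knowing his own choice, deduces Alice's bit.
        key_alice.append(a)
        key_bob.append(1 - b)

assert key_alice == key_bob   # shared secret key
print(len(key_alice))         # roughly half of the 1000 exchanges survive
```

In this idealized picture the matching cases (LOW,LOW) and (HIGH,HIGH) are discarded because Eve can read them too; only the indistinguishable mixed cases contribute key bits.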

Given that the amount of effort and funding that goes into quantum cryptography is substantial (some even mock it as a distraction from the ultimate prize, which is quantum computing), the fact that classic thermodynamic resources allow for similar inherent security should give one pause.  After all, this line of research may provide a much more robust approach to the next-generation, “Shor-safe”, post-quantum encryption infrastructure.

The Dark Horse of Quantum Computing

Updated below.

Recently, Science magazine prominently featured Quantum Information Processing on their cover:


The periodical has a great track record in publishing on QIS, and this is the main reason why I subscribe to it.

Unfortunately, reading this issue yet again drove home what a dark-horse enterprise D-Wave is. And this despite some recent prominent news that D-Wave effortlessly passed a test devised to check the quality of entanglement realized on their chip. There is hardly any lingering doubt that they have managed to leverage real quantum annealing; yet neither their approach, nor adiabatic quantum computing in general, is featured at all in this issue of Science.  In the editorializing introduction to the cover story, dubbed “the Future of Quantum Information Processing”, these fields aren’t even mentioned in passing.  Are we to conclude that there is no future for adiabatic quantum computing?

This I found so puzzling that it prompted me to write my first ever letter to the editors of Science:

The Science journal has been invaluable in the past in advancing the QIS field, publishing an impressive roster of ground breaking papers. Yet, it seems to me the editorializing introduction of the March 8th cover story by Jelena Stajic dropped the ball.

If QIS is prominently featured on the cover of your journal shouldn’t the reader at least expect a cursory exposition of all prominent developments in the entire field? There is nothing wrong with the authors of the paper on the superconducting Josephson junctions approach to QC, restricting themselves to universal gate based architectures. Nevertheless, at least in the accompanying editorial, I would have expected a nod towards adiabatic quantum computing, and approaches utilizing quantum annealing. This oversight seems all the more glaring as the latter already resulted in a commercial offering.

Sincerely,

Henning Dekant

Disclaimer: I am not affiliated with the company D-Wave, which ships the first commercial quantum computing device, just puzzled that an exemplary publication like Science doesn’t feature the entire spectrum of approaches towards quantum computing.

My bet with a sceptical QC and CIS expert is still outstanding, and in my exchange with him, he mentioned that he didn’t expect D-Wave to pass this first entanglement hurdle. The next one to pass now is the matter of actual benchmarking against established chip architectures.

If the D-Wave One cannot outperform a conventional single-threaded architecture I’ll lose 4 gallons of maple syrup, but even if that were to come to pass, it wouldn’t spell the end for D-Wave, as it’ll just be a matter of increasing the qubit density until a quantum annealing chip surpasses conventional hardware. The latter only improves linearly with the integration density, while a quantum chip’s performance grows exponentially with the number of qubits it can bring to bear.
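The scaling intuition invoked above can be made concrete with a back-of-the-envelope sketch (a toy model of brute-force search cost, not an actual D-Wave benchmark):

```python
import math

def max_brute_force_n(ops_per_second, seconds=3600):
    """Largest n for which 2**n brute-force steps fit in the time budget."""
    return int(math.log2(ops_per_second * seconds))

# Doubling a classical machine's speed buys just one more problem variable
# within a fixed one-hour budget...
print(max_brute_force_n(2e9) - max_brute_force_n(1e9))   # 1
# ...whereas each additional qubit doubles the state space available
# to an annealer in a single run.
```

This is of course only the optimist’s framing; whether quantum annealing actually realizes such scaling on real problems is exactly what the benchmarking bet is about.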

Update:

Without further comment, here is the answer that I received from Science:

Thank you for your feedback regarding the introductory page to our recent QIP special section. I appreciate the point you are making, and agree that quantum annealing is an important concept. Let me, however, clarify my original reasoning. The Introduction was aimed at the general reader of Science, which, as you are aware, has a very broad audience. It was not meant to be an exhaustive account, or to complement the reviews in the issue, but rather to serve as a motivation for covering the topic, and hopefully to induce a non-specialist reader to delve into the reviews, while introducing only a minimal set of new concepts.

I hope that this is helpful, and once again, I am grateful for your feedback.
Best regards,
Jelena Stajic
Associate Editor

Fun Stuff: When Shakespeare meets Schrödinger

Shakespeares_cat

In the associated LinkedIn discussion to my previous post, commenters had some fun with the Shakespeare inspired headline. Clearly, if Shakespeare had known quantum mechanics and the superposition that holds Schrödinger’s cat in limbo, some of the classic pieces would have sounded slightly different. Dr. Simon J.D. Phoenix had this brilliant take on it:

“To be, or not to be, or maybe both

–that is the question:
Whether ’tis nobler in the mind to calculate
The slings and arrows of outrageous quanta
Or to take arms against a sea of interpretations
And by opposing end them.
To sleep, to wake —
No more, but both –and by a sleep to say we end
The headache, and the thousand natural shocks
That Bohr bequeathed. ‘Tis a consummation
Devoutly to be wished. To wake, to sleep–
To sleep–perchance to dream: ay, there’s the rub,
For in that sleep of Copenhagen what dreams may come
When we have shuffled all our mortal calculations,
Must give us pause. There’s the Aspect
That makes calamity of so entangled a life.
For who would bear the Bells and Wittens of time,
Th’ position’s wrong, the proud momentum’s contumely
The pangs of despised theory, the quantal law’s decay,
The insolence of academic office, and the spurns
That patient merit of th’ unworthy unlearned takes,
When he himself might his quietus make
With a bare bra-ket? Who would fardels bear,
To grunt and sweat under a weary state vector,
But that the dread of something not quite real,
The undiscovered counterfactual, from whose bourn
No traveller returns, puzzles the will,
And makes us rather bear those classical ills we have
Than fly to others that we know not of?
Thus common sense does make cowards of us all,
And thus the native hue of resolution
Is sicklied o’er with the pale cast of Heisenberg,
And enterprise of great position and momentum
With this regard their currents turn awry
And lose the name of action. — Soft you now,
The fair Dirac — noble and precise, in thy orisons
Be all my spins remembered.”

Nobel Laureates on the QM Interpretation Mess

Update:  Perusing the web I noticed that John Preskill [not yet a Nobel laureate 🙂 ] also blogged on the same survey.  Certainly another prominent voice to add to the mix.

~~~

In the LinkedIn discussion to my earlier blog post that was lamenting the disparate landscape of QM interpretation, I had Nobel laureate Gerard ‘t Hooft weighing in:

Don’t worry, there’s nothing rotten. The point is that we all agree about how to calculate something in qm. The interpretation question is something like: what language should one use to express what it is that is going on? The answer to the question has no effect on the calculations you would do with qm, and thus no effect on our predictions what the outcome of an experiment should be. The only thing is that the language might become important when we try to answer some of the questions that are still wide open: how do we quantize gravity? And: where does the Cosmological Constant come from? And a few more. It is conceivable that the answer(s) here might be easy to phrase in one language but difficult in another. Since no-one has answered these difficult questions, the issue about the proper language is still open.

His name certainly seemed familiar, yet due to some very long hours I am currently working, it was not until now that I realized that it was that ‘t Hooft.  So I answered with this, in hindsight, slightly cocky response:

Beg to differ, the interpretations are not mere language, but try to answer what constitutes the measurement process. Or, with apologies to Ken, what “collapses the wave function”: The latter is obviously a physical process. There has been some yeoman’s work to better understand decoherence, but ultimately what I want to highlight is that this state of affairs of competing QM interpretations should be considered unsatisfactory. IMHO there should be an emphasis on trying to find ways to decide experimentally between them.

My point is we need another John Bell.  And I am happy to see papers like this that may allow us to rule out some many-worlds interpretations that rely on classical probabilities.

So why does this matter?  It is one thing to argue that there can be only one correct QM interpretation, and that it is important to identify that one in order to be able to develop a better intuition for the quantum realities (if such a thing is possible at all).

But I think there are wider implications, and so I want to quote yet another Nobel laureate, Julian Schwinger, to give testament to how this haunted us when the effective theory of quantum electrodynamics was first developed (preface to Selected Papers on QED, 1956):

Thus also the starting point of the theory is the independent assignment of properties to the two fields, they can never be disengaged to give those properties immediate observational significance. It seems that we have reached the limits of the quantum theory of measurement, which asserts the possibility of instantaneous observations, without reference to specific agencies.  The localization of charge with indefinite precision requires for its realization a coupling with the electromagnetic field that can attain arbitrarily large magnitudes. The resulting appearance of divergences, and contradictions, serves to deny the basic measurement hypothesis.

John Bell never got one of these, because of his untimely death.

Something is Rotten in the State of Physics.

How else to explain that, almost a century after the most successful theory of modern physics was formulated, leading experts in the field still cannot agree on how to interpret it?

Exhibit (A): this bar chart from a survey taken at a quantum foundations meeting.  It has been called the most embarrassing graph of modern physics (and rightly so).

(Survey bar chart: participants’ favorite interpretation of quantum mechanics)

Unsurprisingly, my favorite interpretation of QM, Ulrich Mohrhoff’s Pondicherry Interpretation, is such a dark horse candidate that it did not even make the list.

In accordance with this main confusion, the view on the role of the observer is also all over the map:

(Survey bar chart: the role of the observer)

The majority settles on a statement that, no matter how I try to parse it, doesn’t make any sense to me:  If our formalism describes nature correctly, and the observer plays a fundamental role in the latter, how is it supposed not to occupy a distinguished physical role? The cognitive dissonance required to take this stance is dizzying. At least the quantum hippie choice of option (d) has some internal consistency.

So it shouldn’t come as a surprise that with regard to quantum computing these experts are as ignorant as the public at large, and completely ignore that D-Wave is already shipping a quantum computer (if the phrasing had been about a universal quantum computer these results would have been easier to tolerate).  Invited to opine on the availability of the first working and useful quantum computer, this was the verdict:

(Survey chart: when will a working and useful quantum computer be available?)

The paper contains another graph that could almost pass as a work of art: it visualizes the medium-to-strong correlations between the survey answers.  To me it is the perfect illustration of the current state of physics with regard to the interpretation of quantum mechanics:

It is a mess.

Given this state of affairs it’s small wonder that one of my heroes, Carver Mead, recently described the QM revolution that started early in the last century as an aborted one. It is indeed time to kick-start it again.

My Fringe Science Problem

Updated below.

Cold Fusion. It should have been that simple.

It is past time to come clean. I have an addiction problem. It is said that the best way to confront this is to admit it: No matter how hard I try, I cannot stop paying attention to fringe science.

Of course I could blame early childhood experiences. Back when I was a teen, about three decades ago, the free energy crowd was already quite active; “Tachyon energy” was then their favorite pseudoscience to justify how their fantastic contraptions could work. One of the free energy bulletins that I read credited an electrical engineer who just happened to be a very distant relative of mine. So I managed to get in touch with him. He was indeed engaging in some far-fetched research, but it had nothing to do with free energy, and he had never heard of the people who misrepresented him (at the time, he was researching whether common radar microwave radiation played any role in the forest die-off that was widely attributed to acid rain, and to that end built some strange test machinery).

This is a pattern that I’ve seen repeated many times since then. The approach generally seems to follow these steps:

  1. Find some far fetched fringe research.
  2. Claim some tremendous break-through and purport that just a little bit of additional research will result in a fantastic product.
  3. Based on this, collect investment money and then retire early after the R&D effort unfortunately doesn’t work out.

The latest fringe science to receive this treatment is the textbook example of pathological science: cold fusion. It has since been rebranded LENR, for Low Energy Nuclear Reaction (it also goes by some other acronyms, but this seems to be the most common one).

One story that fits this pattern perfectly is that of the marvellous E-Cat (short for Energy Catalyzer). It sprang onto the scene about two years ago with a tantalizing, if not independently supervised, demonstration that was convincing enough to bamboozle two Swedish physics profs into proclaiming that no chemical energy source could have the necessary energy density to produce the observed heat (conversion of water to steam). Over time this generated some mainstream news stories, and a bunch of blogs and forums sprang up to follow the story. One such blog followed an interesting trajectory: Paul Story, the maintainer of ecatnews.com, started out quite optimistic about the device, and even banned some critical voices from the comment section of his blog. But then he was approached by the Australian billionaire Dick Smith, who offered a prize of $1 million to anyone who could prove a usable 1 kW LENR device. Nobody came forward to claim the money, although several commercial entities claimed to have just such prototypes. But this changed the tone at ecatnews.com and made it one of the few uncensored places where adherents and sceptics of this field could discuss (sometimes raucously) without the fear of being censored.

But Paul closed shop after he came to the conclusion that the E-Cat is just a scam. And this is where my addiction problem comes in. His blog was where I got my daily LENR dose, and the other blogs that still cover this story are far less open and critical, so they are no adequate substitute. So in order to manage my addiction I have created a sub-blog, called Wavewatching Fringe Edition. This new addition is by no means supposed to take away the focus of this main blog, but rather to help manage my fringe science problem, and possibly serve as a resource to warn people to double check before investing in fringe science projects.

Be warned though, fringe science is addictive, it offers stories taller and more imaginative than any soap opera. If you want to stay clean, stay clear of the fringe.

Update

After losing a FB “Like” I feel like clarifying what I classify as “fringe science”.  To have an objective criterion, I lump everything into this category that doesn’t flow from efforts published in reputable peer-reviewed journals (creating new journals in order to get published doesn’t qualify). Since everything performed by humans is far from infallible, peer review can miss interesting things, but the signal-to-noise ratio in the fringe category will be much lower.

As with my “Lost Papers” section, I will try to focus on aspects that maybe shouldn’t be overlooked. But there is also the additional aspect that I focused on above: old Hilbert papers make a very bad basis for soliciting investment funds; on the other hand, many of the hotter fringe science topics virtually spawn their own industries (that usually go nowhere).  If somebody researches these topics because they’ve been approached for investment funds, then I hope the fringe section will paint a critical and realistic picture.

Of course it would be great if something as controversial as LENR could get to the point where repeatable, scalable experiments with a proper theoretical underpinning bring it back to physics’ forefront.  Some LENR proponents feverishly believe that this is already the case.  Obviously I disagree, but I am not entirely ruling out that it could happen.