Blog Memory Hole Rescue – The Fun is Real

It seems that work and life are conspiring to leave me no time to finish my write-up on my General Fusion visit. I started it weeks ago, but I am still not ready to hit the publish button on this piece.


In the meantime, I highly recommend the following blog that I came across. It covers topics very similar to the ones here and shares a similar outlook. For instance, this article beautifully sums up why I never warmed up to Everett's Multiverse interpretation (although I have to admit that reading Julian Barbour's End of Time softened my stance a bit - more on this later).

The 'Fun Is Real' blog is a cornucopia of good physics writing and should provide many hours of thought-provoking reading material to bridge the current lull in my posting schedule.

On a side note, given that it goes to the core of what I write about on this blog, the following news should not go unmentioned: Australian researchers have reportedly created a cluster state of 10,000 entangled photonic qubits (h/t Raptis T.).

This is orders of magnitude more than has been previously reported. Now, if they manage to apply some quantum gates to these qubits, we'll really be getting somewhere.

Posted in Blogroll Rescue, Popular Science, Quantum Computing, Quantum Mechanics | 7 Comments

Quantum Computing Interrupted

This is the second part of my write-up on my recent visit to D-Wave. The first one can be found here.


The recent shutdown of the US government had widespread repercussions. One of the side effects was that NASA had to stop all non-essential activities, and this included quantum computing. So the venture that jointly operates a D-Wave machine in cooperation with Google was left in limbo for a while. Fortunately, the interruption was short-lived enough that it hopefully won't have any lasting adverse effects. At any rate, maybe it freed up some time to produce a QC mod for Minecraft and the following high-level and very artsy Google video that 'explains' why they want to do quantum computing in the first place.

If you haven't been raised on MTV music videos and find rapid-fire sub-second cuts migraine-inducing (at about the 2:30 mark things settle down a bit), you may want to skip it. So here's the synopsis (spoiler alert). The short version of what motivates Google in this endeavor, to paraphrase their own words: We research quantum computing because we must.

In other news, D-Wave recently transferred its foundry process to a new location, partnering with Cypress Semiconductor Corp - a reminder that D-Wave has firmly raised the production of superconducting niobium circuitry to an industrial scale. Given these new capabilities, it may not be a coincidence that the NSA has recently announced its intention to fund research into superconducting computing. Depending on how they define "small-scale", the D-Wave machine should already fit the description in the solicitation, which aspires to the following ...

"... to demonstrate a small-scale computer based on superconducting logic and cryogenic memory that is energy efficient, scalable, and able to solve interesting problems."

... although it is fair to assume this program is aimed at classical computing. Prototypes for such chips have already been researched and look rather impressive (direct link to paper). They use the same chip material and circuitry (Josephson junctions) as D-Wave, so it is reasonable to expect that industrial-scale production of these more conventional chips could immediately benefit from the foundry process know-how that D-Wave has accumulated. It doesn't seem too much of a stretch to imagine D-Wave expanding into this market space.

When I put the question to D-Wave's CTO Geordie Rose, he certainly took some pride in his company's manufacturing expertise. He stressed that, before D-Wave, nobody was able to scale superconducting VLSI chip production, so this now opens up many additional opportunities. He pointed out that one could, for instance, make an immediate business case for a high-throughput router based on this technology, but given the many avenues open for growth he stressed the need to choose wisely.

The D-Wave fridges certainly have enough capacity to accommodate more superconducting hardware. Starting with the Vesuvius chip generation, measurement heat is generated far away from the chip, so having several chips in close proximity should not disturb the thermal equilibrium at the core. Geordie is considering deploying stacks of quantum chips so that thousands could work in parallel - currently, a lot of perfectly good chips that come off a wafer are simply thrown away. This may eventually necessitate larger cooling units than the current ones, which draw 16 kW. The approach certainly could make a lot of sense for a hosting model in which processing time is rented out to several customers in parallel.

One attractive feature that I pointed out was that having classical logic within the box would eliminate a potential bottleneck when rapid reinitialization and read-out of the quantum chip is required, and it would also open the possibility of direct optical interconnects between chips. Geordie seemed to like this idea. One of the challenges in making the current wired design work was designing high-efficiency low-pass filters to bring the noise level in these connectors down to an acceptable level. So, in a sense, an optical interconnect could reduce complexity, but it would clearly also require some additional research effort to bring down the heat signature of such an optical transmission.

This triggered an interesting, and somewhat ironic, observation on the challenges of managing an extremely creative group of people. Geordie pointed out that he has to think carefully about what to focus his team on, because an attractive side project, e.g. 'adiabatic' optical interconnects, could prove so interesting to many team members that they'd gravitate towards working on it rather than keeping their focus on the work at hand.

Other managerial headaches stem from the rapid development cycles. For instance, Geordie would like to develop a training program that would allow a customer's technical staff to be quickly brought up to speed. But by the time such a program is fully developed, chances are a new chip generation will be ready and necessitate a rewrite of the training material.

Some of D-Wave's challenges are typical for high-tech start-ups, others are specific to D-Wave. My next, and final, installment will focus on Geordie's approach to managing these growing pains.

Posted in D-Wave | 13 Comments

Blog Round-Up

Lots of travel last week delayed the second installment of my D-Wave visit write-up, but I came across some worthy re-blog material to bridge the gap.

I am usually very hard on poorly written popular science articles, which is all the more reason to point to some outstanding material in this area. One writer, Brian Dodson at the Gizmag site, consistently delivers excellent content; thanks to his science background, he brings an unusual depth of understanding to his writing. His latest pieces are on General Relativity-compatible alternatives to dark energy and a theoretical quantum black hole study that puts the loop quantum gravity approach to good use. The latter is a good example of why I am much more inclined toward Loop Quantum Gravity than the ephemeral String theory: the former at least delivers some predictions.

Another constant topic of this blog is the unsatisfying situation with regard to the foundational interpretations of Quantum Mechanics. Lack of progress in this area can in no small measure be attributed to the 'Shut up and calculate' doctrine, a famous quip commonly attributed to Feynman (though actually coined by David Mermin) that has since been enshrined as an almost iron rule.

To get a taste of how prohibitive this attitude has become throughout the physics community, this arXiv paper/rant is a must-read. From the abstract:

If you have a restless intellect, it is very likely that you have played at some point with the idea of investigating the meaning and conceptual foundations of quantum mechanics. It is also probable (albeit not certain) that your intentions have been stopped in their tracks by an encounter with some version of the “Shut up and calculate!” command. You may have heard that everything is already understood. That understanding is not your job. Or, if it is, it is either impossible or very difficult. Maybe somebody explained to you that physics is concerned with “hows” and not with “whys”; that whys are the business of “philosophy” -you know, that dirty word. That what you call “understanding” is just being Newtonian; which of course you cannot ask quantum mechanics to be. Perhaps they also complemented this useful advice with some norms: The important thing a theory must do is predict; a theory must only talk about measurable quantities. It may also be the case that you almost asked “OK, and why is that?”, but you finally bit your tongue. If you persisted in your intentions and the debate got a little heated up, it is even possible that it was suggested that you suffered of some type of moral or epistemic weakness that tends to disappear as you grow up. Maybe you received some job advice such as “Don’t work in that if you ever want to own a house”.

At least, if this blog post is any indication, the times seem to be changing and becoming more permissive.

Posted in Popular Science, Quantum Mechanics | Tagged , | 1 Comment

The D-Wave Phenomenon

This is my first installment of the write-up on my recent visit to D-Wave in Burnaby, BC.

No matter where you stand on the merits of D-Wave's technology, there is no doubt that they have already made computing history. Transistors had been the sole basis for our rapidly improving information technology since the last vacuum tube computer was sold in the early sixties - that is, until D-Wave started to ship their first system. Having won business from the likes of NASA and Google, this company is now playing in a different league. D-Wave now gets to present at high-profile events such as the HPC IDP conference, and I strongly suspect that they have caught the eye of the bigger players in this market.

The entire multi-billion-dollar IT industry is attached at the hip to the existing computing paradigm, and abhors cannibalizing existing revenue streams. This is why I am quite certain that, as I write this, SWOT assessments and talking points on D-Wave are being typed up in some nondescript Fortune 500 office buildings (relying on corporate research papers like this to give them substance). After all, ignoring them is no longer an option. Large companies like to milk cash cows as long as possible. An innovative powerhouse like IBM, for instance, often follows the pattern of investing in R&D up to the point of productization, but is prone to holding back even superior technology if it might jeopardize existing lines of business. Rather, they just wait until a competitor forces their hand, and then rely on their size and market depth, in combination with their quietly acquired IP, to squash or co-opt the challenger. They excel at this game and seldom lose it (it took somebody as nimble and ruthless as Bill Gates to beat them once).

This challenging competitive landscape weighed on my mind when I recently had an opportunity to sit down with D-Wave founder and CTO Geordie Rose, and so our talk first focused on D-Wave's competitive position. I expected that patent protection and technological barriers to entry would dominate this part of our conversation, and was very surprised by Geordie's stance, which certainly defied conventional thinking.


Geordie Rose, founder and CTO of D-Wave, in one of the Tardis-sized boxes that host his quantum chips. The interior is cooled close to absolute zero when in operation. If you subscribe to the multiverse interpretation of quantum mechanics, one may argue that it is in fact bigger on the inside. After all, Hilbert space is a big place.

While he acknowledged the usefulness of the over 100 patents that D-Wave holds, he only considers them effectively enforceable in geographies like North America. Overall, he does not regard them as an effective edge to keep out competition, but was rather sanguine that the fate of any computing hardware is to eventually become commoditized. He asserted that the academic community misjudged how hard it would be to produce a device like the D-Wave machine. Now that D-Wave has paved the way, he considers cloning and reverse engineering of this technology to be fairly straightforward. One possible scenario would be a government-funded QC effort in another geography to incubate this new kind of information processing. In such a case, patent litigation would be expensive, and may ultimately be futile. Yet he doesn't expect this kind of competitive effort unless D-Wave's technology has further matured and proven its viability in the marketplace.

I submitted that the academic push-back, which spreads some FUD with regard to the machine's capabilities, may actually help in this respect. This prompted a short exchange on the disconnect with some of the academic QC community. D-Wave will continue to make its case with additional publications to demonstrate entanglement and the true quantum nature of their processor. But ultimately this is a side-show; the research focus is customer-driven, and to the extent that this means deep learning (e.g. for pattern recognition), the use case of the D-Wave chip is evolving. Rather than using it only as an optimization engine, Geordie explained how multiple solution runs can be used to sample the problem space of a learning problem and facilitate more robust learning.
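To make the sampling idea concrete, here is a minimal classical sketch of that workflow, with a plain simulated annealer standing in for the hardware. The toy Ising energy, the cooling schedule, and the averaging step are all illustrative choices of mine and have nothing to do with D-Wave's actual API:

import math
import random

# Toy Ising energy E(s) = sum_i h_i*s_i + sum_(i,j) J_ij*s_i*s_j with s_i in {-1,+1}.
# The couplings below are arbitrary illustrative values.
h = [0.1, -0.2, 0.3, -0.1]
J = {(0, 1): -0.5, (1, 2): 0.4, (2, 3): -0.3, (0, 3): 0.2}

def energy(s):
    e = sum(h[i] * si for i, si in enumerate(s))
    return e + sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def anneal_once(n, steps=2000, t_start=2.0, t_end=0.05):
    # One simulated-annealing run -- a classical stand-in for a single hardware anneal.
    s = [random.choice([-1, 1]) for _ in range(n)]
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling schedule
        i = random.randrange(n)
        candidate = s[:]
        candidate[i] = -candidate[i]
        dE = energy(candidate) - energy(s)
        if dE <= 0 or random.random() < math.exp(-dE / t):
            s = candidate
    return s

# Instead of keeping only the single best configuration, collect many low-energy samples ...
samples = [anneal_once(len(h)) for _ in range(200)]
best = min(samples, key=energy)

# ... and let the learning algorithm consume statistics of the whole sample,
# e.g. the estimated expectation value of each spin.
means = [sum(s[i] for s in samples) / len(samples) for i in range(len(h))]

print("best energy found:", energy(best))
print("<s_i> over all samples:", [round(m, 2) for m in means])

The point is simply that a training loop can be fed the whole distribution of low-energy samples rather than just the single lowest-energy answer, which is what makes the resulting learning more robust.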

It is the speed of customer driven innovation that Geordie relies on giving D-Wave a sustainable edge, and ultimately he expects that software and services for his platform will prove to be the key to a sustainable business.  The current preferred mode of customer engagement is what D-Wave calls a deep partnership, i.e. working in very close collaboration with the customer's staff. Yet, as the customer base grows, more management challenges appear, since clear lines have to be drawn to mark where the customer's intellectual property ends and D-Wave's begins. The company has to be able to re-sell solutions tied to its architecture.

D-Wave experiences some typical growing pains of a successful organization, and some unique high tech challenges in managing growth. How Geordie envisions tackling those will be the subject of the next installment.

Posted in D-Wave, Quantum Computing | 9 Comments

Science – Don’t Ask What it Can Do for You.

The most important non-scientific book about science that you will ever read is Michael Nielsen's Reinventing Discovery: The New Era of Networked Science.

It lays out how the current scientific publishing process is a holdover from the 19th century and passionately makes the case for Open Science. The latter is mostly understood to be synonymous with Open Access, i.e. no more hiding of scientific results in prohibitively expensive journals, especially when tax-funded grants or institutions paid for the research.

But Michael has a more expansive view. He makes the case that science can be measurably enriched by coming out of the ivory tower and engaging the public via well-designed crowdsourcing efforts such as Galaxy Zoo.

On this blog, I have written many times about the shortcomings of science media large and small, as well as the unsatisfying status quo in theoretical physics. Readers may be justified in wondering why this should matter to them. The answer is straightforward: science is too important to be left to the scientists. Our society is shaped by science and technology, and to the extent that we've all learned about the scientific method, everybody has the capacity to raise valid questions. Science, like any other major endeavor, benefits from a critical public, and that is why the fairytale science that I wrote about in my last post is a dangerous development. It lulls interested observers into believing that they are clearly out of their depth, incapable of even formulating some probing questions. This can in fact be turned into a criterion for bad science: if a reasonably intelligent and educated person cannot follow up with some questions after a science presentation, it's a pretty good indication that the latter is either very poorly done or deals in fairytale science (the only problem with this criterion is that usually everybody considers themselves reasonably intelligent).

The antidote to this pathological development is Open Science as described by Michael Nielsen, and Citizen Science. The latter, I expect, will have no less of an impact on the way we do science than the Open Source movement had on the way we do computing. Never have the means to do quality science been as affordable as they are today: a simple smartphone is already a pretty close match to the fabled Star Trek tricorder and can easily be turned into a precision instrument. Science labs used to require skilled craftsmen to build scientific rigs, but 3D printers will level the playing field there as well. This means that experiments that would have required major funding just two decades ago are now within the means of high school students.

So, don't ask what science can do for you, but what you can do for science.*

*In this spirit, I decided to step up this blog's content and didn't shy away from the expense of engaging in some original reporting. Last week I took a trip to Canada's high-tech wonderland, which happens to be Burnaby, BC, just outside Vancouver. Stay tuned for some upcoming first-hand reporting on D-Wave and General Fusion.

Posted in Popular Science | 12 Comments

Just Say No to Fairytale Science

Until recently, there was no contest if I was asked which popular science book I'd recommend to a non-physicist. It was always Terry Pratchett's The Science of Discworld. It comes with a healthy dose of humor and makes no bones about the fact that any popularized version of modern physics essentially boils down to "lies to children".

But there is now a new contender, one that I can highly recommend: Farewell to Reality: How Modern Physics Has Betrayed the Search for Scientific Truth. This book does an excellent job of retelling how we got to the current state of theoretical physics, which quantum computing theorist Scott Aaronson described this way:

 

ROTFL! Have you looked recently at beyond-Standard-Model theoretical physics? It’s a teetering tower of conjectures (which is not to say, of course, that that’s inherently bad, or that I can do better). However, one obvious difference is that the physicists don’t call them conjectures, as mathematicians or computer scientists would. Instead they call them amazing discoveries, striking dualities, remarkable coincidences, tantalizing hints … once again, lots of good PR lessons for us! :-)

This was in a comment on his recent blog post where he has some fun with Nima Arkani-Hamed's Amplituhedron. The latter is actually one of the more interesting results I have seen come out of mainstream theoretical physics, because it allows us to calculate something in a much more straightforward manner than before. That it is currently restricted to the scope of an unphysical toy theory is, unfortunately, all you need to know to understand how far current theoretical physics has ventured from actual verifiability by experiment.

For those who want to dig deeper and understand where to draw the line between currently established physics and fairytale science, Jim Baggott's book is a one-stop shop. It is written in a very accessible manner and does a surprisingly good job of explaining what has been driving theoretical physics, without recourse to any math.

At the beginning, the author describes what prompted him to write the book: one too many of those fancifully produced science shows, with lots of CGI and dramatic music, that present String theory as established fact. Catching himself yelling at the TV (I've been there), he decided to do something about it, and his book is the pleasant result. I am confident it will inoculate any alert reader against the pitfalls of fairytale science and equip him (or her) with a critical framework to ascertain what truthiness to assign to various theoretical physics conjectures (in popular science fare they are, of course, never referred to as such, as Scott correctly observed).

This isn't the first book to address this issue. Peter Woit's Not Even Wrong took it on at a time when calling out String theory was a rather unpopular stance, but the timing for another book in this vein aimed at a broad lay public is excellent. As Baggott wrote his book, it was already apparent that CERN's LHC had not picked up any signs in support of SUSY and string theory. Theorists have long been in denial about these elaborately constructed castles in the sky, but reality seems to be slowly seeping in.

The point is that the scientific mechanism for self-correction needs to reassert itself. It's not that SUSY and String theory didn't produce some remarkable mathematical results. They just didn't produce actual physics (although in unanticipated ways the Amplituhedron may get there). Trying to spin away this failure does science a massive disservice. Let's hope theoretical physicists take a cue from the quantum computing theorists and clearly mark their conjectures. It'll be a start.

Alternatively, they could always present the theory as it is done in this viral video.  At least then it will be abundantly clear that this is more art than science (h/t Rolf D.):

Posted in Popular Science | 14 Comments

Science Media in a Bubble – Ready to Implode?

An ongoing theme of this blog is the media coverage that science receives. Unsurprisingly, given that most journalists have little STEM background, the public is often treated to heedless rewording of press releases, e.g. this example from the QC realm. Sensationalist science news is also hardly ever put into context - the story of the faster-than-light CERN neutrinos is a perfect example of the latter.

What is more surprising is when dedicated publication powerhouses such as Nature or Science get it wrong, either by omission, such as covering quantum computing while completely ignoring the adiabatic approach that D-Wave is following, or by short-circuiting the peer review process. The latter may have set back sonoluminescence research by decades.

Sonoluminescence is the name for a peculiar effect whereby cavitation in a liquid can be stimulated by sound waves to the point where the small gaseous bubbles implode so rapidly that a plasma forms and produces a telltale light signal. The following video is a nice demonstration of the effect (full screen mode recommended):

 

Since there is plasma involved, the idea that this could be used as yet another means to accomplish fusion was first patented as early as 1982.

In itself, the phenomenon is remarkable enough, and not well understood, giving ample justification for basic research of the effect.  After all, it is quite extraordinary that sound waves suffice to create such extreme conditions in a liquid.

But it is still quite a stretch to get from there to the necessary conditions for a fusion reaction. The nuclear energy barrier is orders of magnitude larger than the energies involved in the chemical domain, let alone the typical energy density of sound waves. The following cartoon puts this nicely into perspective.
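In raw numbers, the gap looks roughly like this (ballpark figures of my own): a chemical bond is worth a few eV, while the Coulomb barrier that two hydrogen nuclei have to overcome before they can fuse is of order

$$ V \approx \frac{e^2}{4\pi\varepsilon_0\, r} \approx \frac{1.44\ \mathrm{MeV\,fm}}{\text{a few fm}} \approx \text{a few hundred keV}, $$

i.e. roughly five orders of magnitude higher - and the acoustic energy available per molecule in the driving sound field is smaller still.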

That is why this approach to fusion always seemed rather far-fetched, and not very practical, to me. So when a Science article about ten years ago claimed fusion evidence, I was skeptical, and I wasn't surprised that it was later contradicted by reports that portrayed the earlier results as ambiguous at best. I had no reason to question the Science reporting. I took the news at face value and paid little attention to this area of research until a recent report by Steven Krivit. He brings investigative journalism to the domain of science reporting, and the results are not pretty:

  1. The rebuttal to the original peer-reviewed article first appeared on the Science blog without going through the usual review process.
  2. Contrary to what was reported, the scientists undermining the original research did not work independently on reproducing the results, but only performed auxiliary measurements on the same experimental set-up.
  3. The detector they used was known not to be ideally suited to the neutron spectrum that was to be measured, and was too large to be ideally placed.
  4. The criticism relied on an ad-hoc coincidence criterion for the photon and neutron genesis that ignored the multi-bubble cavitation design of the original experiment.

The full investigative report is behind a pay-wall.  It is rather devastating.

To add insult to injury, the Science journalist instrumental in causing this mess, the one who promoted the rebuttal without peer review, later went on to teach journalism.

A casual and cynical observer may wonder why Steven makes such a fuss about this. After all, American mainstream journalism outside the realm of science is also a rather poor and sordid affair.  He-said-she-said reporting is equated with objectivity, and journalists are mostly reduced to being stenographers and water carriers of the political actors that they are supposed to cover (the few journalists who buck this trend I hold in the highest regard).

One may also argue that there wasn't all that much damage done, because the critics, even if they didn't work as advertised, may have gotten it right: the BBC, a couple of years later, sponsored an attempt at reproduction and also came up empty.

But there is one rather big and important difference: journals such as Science are not just media that report to the public at large. Rather, they are the gatekeepers for what is accepted as scientific research, and must therefore be held to a higher standard. Research that doesn't get published in peer-reviewed journals may as well not exist (unless it is privately financed applied R&D that can be immediately commercialized and is therefore deliberately kept proprietary).

The more reputable a peer-reviewed journal, the higher the impact (calculating the impact factor is a science in itself). But arguably, it is worse to get work published in a reputable journal only to have the results then demolished by the same outfit, especially if the deck is stacked against you.
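(For reference, the standard two-year impact factor of a journal for a year Y is simply

$$ \mathrm{IF}_Y \ =\ \frac{\text{citations received in } Y \text{ to items published in } Y-1 \text{ and } Y-2}{\text{number of citable items published in } Y-1 \text{ and } Y-2}, $$

though much of the "science" lies in deciding what counts as a citable item.)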

To me, this story raises a lot of questions and drives home that investigative science journalism is sorely lacking and badly needed. Who else is there to guard the gatekeepers?

Posted in Popular Science | 5 Comments

Out of the AI Winter and into the Cold


A logarithmic scale doesn't have the appropriate visual impact to convey how extraordinarily cold 20 mK is.

Any quantum computer using superconducting Josephson junctions will have to be operated at extremely low temperatures. The D-Wave machine, for instance, runs at about 20 mK, which is much colder than anything in nature (including deep space). A logarithmic scale like the chart to the right, while technically correct, doesn't really do this justice.  This animated one from D-Wave's blog shows this much more drastically when scaled linearly (the first link goes to an SVG file that should play in all modern browsers, but it takes ten seconds to get started).

Given that D-Wave's most prominent use case is the field of machine learning, a casual observer may be misled into thinking that the term "AI winter" refers to the propensity of artificial neural networks to blossom in this frigid environment. But what the term actually stands for is the brutal hype cycle that ravaged this field of computer science.

One of the first casualties of the collapse of artificial intelligence research in 1969 was the ancestor of the kind of learning algorithms that are now often implemented on D-Wave's machines. This incident is referred to as the XOR affair, and the story that circulates goes like this: "Marvin Minsky, being a proponent of structured AI, killed off the connectionism approach when he co-authored the now classic tome, Perceptrons. This was accomplished by mathematically proving that a single-layer perceptron is so limited it cannot even be used (or trained, for that matter) to emulate an XOR gate. Although this does not hold for multi-layer perceptrons, his word was taken as gospel, and smothered this promising field in its infancy."

Marvin Minsky begs to differ, arguing that he of course knew about the capabilities of artificial neural networks with more than one layer, and that if anything, only the proof that working with local neurons comes at the cost of some universality should have had any bearing. It seems impossible to untangle the exact dynamics that led to this most unfortunate AI winter, yet in hindsight it appears completely misguided and avoidable, given that a learning algorithm (backpropagation) that allowed for the efficient training of multi-layer perceptrons had already been published a year prior - although at the time it received very little attention.
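For the record, the network at the heart of the XOR affair is tiny. Here is a minimal multi-layer perceptron trained with plain backpropagation on XOR - a toy numpy sketch of my own, purely to illustrate that a single hidden layer is all it takes:

import numpy as np

rng = np.random.default_rng(1)

# XOR truth table: not linearly separable, so a single-layer perceptron must fail here.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One hidden layer with four sigmoid units, one sigmoid output unit.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))

lr = 0.5
for _ in range(20000):
    # forward pass
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)
    # backpropagation of the squared error through both layers
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should end up very close to [[0], [1], [1], [0]]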

The fact is, after Perceptrons was published, symbolic AI flourished and connectionism was almost dead for a decade. Given what the authors wrote in the foreword to the revised 1989 edition, there is little doubt how Minsky felt about this:

"Some readers may be shocked to hear it said that little of significance has happened in this field [since the first edition twenty year earlier]. Have not perceptron-like networks under the new name connectionism - become a major subject of discussion at gatherings of psychologists and computer scientists? Has not there been a "connectionist revolution?" Certainly yes, in that there is a great deal of interest and discussion. Possibly yes, in the sense that discoveries have been made that may, in time, turn out to be of fundamental importance. But certainly no, in that there has been little clear-cut change in the conceptual basis of the field. The issues that give rise to excitement today seem much the same as those that were responsible for previous rounds of excitement. The issues that were then obscure remain obscure today because no one yet knows how to tell which of the present discoveries are fundamental and which are superficial. Our position remains what it was when we wrote the book: We believe this realm of work to be immensely important and rich, but we expect its growth to require a degree of critical analysis that its more romantic advocates have always been reluctant to pursue - perhaps because the spirit of connectionism seems itself to go somewhat against the grain of analytic rigor." [Emphasis added by the blog author]

Fast-forwarding to 2013 and the reception that D-Wave receives from some academic quarters, this feels like déjà vu all over again. Geordie Rose, founder and current CTO of D-Wave, unabashedly muses about spiritual machines, although he doesn't strike me as a particularly romantic fellow. But he is very interested in using his amazing hardware for better machine learning, very much in "the spirit of connectionism". He published an excellent mini-series on this at D-Wave's blog (part 1, 2, 3, 4, 5, 6, 7). It would be interesting to learn whether Minsky would find fault with the analytic rigor on display there (he is now 86, but I hope he is still going as strong as he was ten years ago when this TED talk was recorded).

So, if we cast Geordie in the role of the 21st-century version of Frank Rosenblatt (the inventor of the original perceptron), then we surely must pick Scott Aaronson as the modern-day version of Marvin Minsky. Except that the argument this time is not about AI, but about how 'quantum' D-Wave's device truly is. The dynamic feels very similar: on one side, the theoretical computer scientist, equipped with boatloads of mathematical rigor, strongly prefers the gate model of quantum computing. On the other, the pragmatist, whose focus is on building something usable within the constraints of what chip foundries can produce at this time.

But the ultimate irony, it seems, at least in Scott Aaronson's mind, is that the AI winter serves as the cautionary parable for his case (as was pointed out by an anonymous poster on his blog). That is, he thinks the D-Wave marketing hype can be equated to the over-promises of past AI research. Scott fears that if the company cannot deliver, the baby (i.e. quantum computing) will be thrown out with the bathwater, and so he blogged:

“I predict that the very same people now hyping D-Wave will turn around and—without the slightest acknowledgment of error on their part—declare that the entire field of quantum computing has now been unmasked as a mirage, a scam, and a chimera.”

A statement that, of course, didn't go unchallenged in the comment section (Scott is exemplary in allowing this kind of frankness on his blog).

I don't pretend to have any deeper conclusion to draw from these observations, and will leave it at this sobering thought: While we expect science to be conducted in an eminently rational fashion, history gives ample examples of how the progress of science happens in fits and starts and is anything but rational.

Posted in D-Wave, Popular Science | Tagged , , , , , , | 23 Comments

The Other Kind of Cold Fusion


Nature clearly favours hot fusion, no matter how cold the light. The cold glow in this image stems from a blue supergiant that is believed to orbit a black hole in the Cygnus X-1 system.

If you lived through the eighties, there are certain things you could not miss. Since this is a science blog, I am of course not referring to fashion aberrations like mullets and shoulder pads, but rather to what is widely regarded as one of the most notorious science scandals to date: Fleischmann and Pons cold fusion, the claim of tapping the ultimate energy source within a simple electrochemical cell.


Proof that this blog's author lived through the eighties. Since this driver's licence picture was taken the same year as the Fleischmann and Pons disaster, a half smile was all I could muster.

For a short time it felt like humanity's prayers to be delivered from fossil fuel had been answered (at least to those who believe in that sort of thing). Of course, paying the ever-increasing price at the gas pump is a constant (painful) reminder that this euphoric moment at the end of the eighties was but a short-lived aberration. But back then it felt so real. After all, there already existed a well-known process that allowed for nuclear fusion at room temperature, catalyzed by the enigmatic muons. One of the first scientific articles that I read in English was on that phenomenon, and it was published just a couple of years earlier. So initial speculation abounded that maybe muons from cosmic rays could somehow help trigger the reported reaction (although no explanation was given as to how this low muon flux density could possibly accomplish this). While my fringe blog focuses on the intrepid researchers who, despite the enormous blowback, still work on Fleischmann-Pons-style research, this post is about the other kind: the oft-forgotten muon-catalyzed fusion.

It is a beautiful nuclear reaction, highlighting two of the most basic peculiarities of quantum mechanics: quantum tunnelling and the Heisenberg uncertainty principle. Both are direct consequences of the manifest wave properties of matter at this scale. The former allows matter to seep into what should be impenetrable barriers, and the latter describes how a bound point particle is always "smeared out" over a volume - as if points were an abstraction that nature abhors. Last but not least, the reaction showcases the mysterious muon, a particle that seems to be identical to the electron in every way but mass and stability (about 207 times the mass, and a - by particle standards - long mean lifetime of about 2.2 μs). Because it behaves just like a heavier twin of the electron, it can substitute for the latter in atoms and molecules.

The Heisenberg uncertainty principle states that the product of the momentum (mass times velocity) and position uncertainties has a lower bound. Usually the term uncertainty is simply interpreted probabilistically, in terms of the standard deviation from the expectation value. But this view, while formally entirely correct, obscures the very real physical implication of trying to squeeze a particle into a small space: the momentum uncertainty then becomes a very real physical effect of quantum matter. The particle's velocity distribution will become ever broader, forcing the matter outwards and creating an orbital 'cloud' (e.g. the spherical hydrogen s-orbital). There is really no good analogy in our everyday experience; they all sound silly. My version is that of a slippery bar of soap in a spherical sink: the harder you try to grasp it, the more powerfully you send it flying. If you were to map all trajectories of the soap over time, you would find that on average it was anywhere in the sink, with the probability decreasing towards the rim (that is, unless you squeeze it so hard that it acquires enough speed to jump out of the sink - I guess that would be the analog of ionization). In the atomic and chemical realm, on the other hand, the very concept of a trajectory doesn't hold up (unless you are dealing with Rydberg atoms). You may as well think of electron orbitals as charge distributions (as this is exactly how they behave in the chemical domain).
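For the formula-inclined, the relation behind the slippery-soap picture is just the textbook inequality

$$ \Delta x \, \Delta p \ \geq\ \frac{\hbar}{2}, $$

so confining a particle of mass $m$ to a region of size $a$ forces a momentum spread of order $\hbar/a$, and with it a kinetic energy of order $\hbar^2/(2 m a^2)$ that pushes back against the confinement.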

Because the momentum, rather than the velocity, enters into the equation, the orbitals for a heavier version of the electron will be considerably smaller - about 200 times smaller for the muon, as this is the factor by which the particle's velocity can be reduced while still giving the same momentum. So muonic hydrogen is much smaller than the electronic version. That is already all that is needed to get fusion going, because if two heavy hydrogen nuclei are bound in a muonic μH2 molecule, they are far too close for comfort. Usually the repulsive electrostatic Coulomb potential is enough to keep them apart, but the quantum tunnel effect allows them to penetrate the 'forbidden' region. And at this distance, the probability that both nuclei occupy the same space becomes large enough to produce measurable incidents of nuclear fusion, i.e. the muonic hydrogen molecule fuses into (muonic) helium.
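Balancing the confinement energy from above against the Coulomb attraction $-e^2/(4\pi\varepsilon_0 a)$ and minimizing over $a$ reproduces the Bohr radius and makes the mass scaling explicit (back-of-the-envelope only):

$$ a \ \approx\ \frac{4\pi\varepsilon_0 \hbar^2}{m e^2} \ \propto\ \frac{1}{m}, \qquad \frac{a_\mu}{a_e} \ \approx\ \frac{m_e}{m_\mu} \ \approx\ \frac{1}{207}. $$

(Using the proper reduced mass of muonic hydrogen brings the factor to about 186 - hence 'about 200 times smaller'.)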

The hydrogen used in the experimental realization is not the usual kind; as with other approaches to fusion, the heavier hydrogen isotopes deuterium and tritium are required, and since there is only one muon in the mix, the d-t molecule is ionized, so that the equation looks more like this: (d-μ-t)+ → n + α + μ (with n indicating a fast neutron, α a helium-4 nucleus, and the muon usually released to catalyze the next fusion).

The alpha particle causes a lot of trouble, as the muon 'sticks' to it with about a 1% probability (making it a muonic helium ion). When this happens, the muon is no longer available to catalyze further fusion events. This, in combination with the limited lifetime of the muons and the 'set-up' time the muons require to bind to the hydrogen isotopes, is the limiting factor of this reaction.

Without a constant, massive resupply of muons the reaction tapers off quickly. Despite decades of research this problem could never be surmounted. It takes pions to make muons, and pions are only produced in high-energy particle collisions. This costs significantly more energy than the cold muon-catalyzed fusion can recoup.
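The commonly quoted ballpark numbers show how large the shortfall is (rough figures, not a precise accounting): each d-t fusion releases about 17.6 MeV, and with an alpha-sticking probability of roughly 1% a single muon catalyzes on the order of 100-150 fusions before it is lost or decays, so the energy recouped per muon is roughly

$$ N \cdot E_{dt} \ \approx\ 150 \times 17.6\ \mathrm{MeV} \ \approx\ 2.6\ \mathrm{GeV}, $$

while producing that one muon via pion production at an accelerator is typically estimated to cost several GeV of beam energy. That unclosed gap is the reason muon-catalyzed fusion never became an energy source.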

But there is one Australian company that claims to have found a new, less costly way to make pions. They are certainly a very interesting 'cold fusion' start-up, and at first glance seem far more credible than the outfits that my fringe blog covers. On the other hand, this company treats its proprietary pion production process with a level of secrecy that is reminiscent of the worst players in the LENR world. I could not find any hint of how this process is supposed to work, nor why it could supposedly produce sufficient amounts of muons to be commercially exploitable. (Pions can also be generated in two-photon processes, but this would require even more input energy.) So on a second read, the claims of Australia's Star Scientific don't really sound any less fantastic than the boasting of any other cold fusion outfit.

Any comments that could illuminate this mystery are more than welcome. Preliminary google searches on this company are certainly not encouraging.

Posted in Popular Science, Quantum Mechanics | Tagged , | 4 Comments

Coming Up Swinging

The top political news of the day (the Snowden leak) brings into sharp relief why encryption, and the capability to break it, receives so much attention.

It puts into context why a single algorithm (Shor's) accounts for most of quantum computing's notoriety and why quantum encryption receives so much funding.

It is all the more surprising, then, that Laszlo Kish's work has received comparatively little attention. After all, it poses a direct challenge to the field's claim to offer the only communication channels with tamper-resistance baked into the very protocol. Yet, with Charles Bennett et al. going for a knock-out, this decidedly changed. Now this exciting 'match' goes to the next round with the Kish et al. follow-up paper, and it is quite the comeback. From the abstract:

Recently, Bennett and Riedel (BR) argued that thermodynamics is not essential in the Kirchhoff-law–Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to prove this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR’s scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. Furthermore we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each of the attacks on the BR scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

The original post on this subject resulted in a high quality follow-up discussion on LinkedIn that I hope may get triggered again.  After all, science is more fun as a spectator sport with well-informed running commentary.

Posted in Quantum Cryptography | Tagged , , , , | 4 Comments