
Taking it to the Next Level

The summit of the Computing Pyramid is no longer just an academic specter.

It is no secret that I’ve always been impressed with D-Wave. Sure, they “only” perform quantum annealing on their chip, and there is no guarantee that there is any “quantum supremacy” to be had, but with their machine, qubits can be harnessed for the first time to do something useful. That is, D-Wave now offers enough computational power to hold its own against the established computing architecture. It’s unrealistic to expect it to be a silver bullet for all optimization problems, but it is real and it is available, and as soon as something is actually useful you can put a price on it.

My company is in the business of developing Open Source software in the QC space, and of offering advice and consulting to customers who want to assess and explore the possibilities of this new frontier of Information Technology. Our software can already talk to the IBM Quantum Experience chip, a universal gate-based chip that is an impressive proof of concept but not yet of any practical use. It did not sit well with me that we could not address the D-Wave API in a similar manner.

That’s why I picked up the phone and reached out to D-Wave to establish a relationship that will allow my company, artiste-qb.net, to do just that.

So while I will always strive for full transparency when discussing quantum information technologies, when I write about D-Wave in the future it will no longer be from the vantage point of an unaffiliated observer, but rather from the perspective of someone who will work actively to help them succeed on the merits of their technology.

Canadian PM Justin Trudeau talks Quantum Computing

He is already fluently bilingual, but he also speaks pretty good Quantum.  This isn’t half bad for a head of state:

If you also want to impress your friends like this, I recommend Sabine Hossenfelder’s Quantum lingo crash course.

This bodes well for the prospects of seeing some federal initiatives for the emerging Canadian QC industry in the not too distant future.

 

We need Big Data where it will actually make a difference

Destroyed temples in Kathmandu.

Another earthquake took the lives of many thousands. As I am writing this blog post, scores of survivors will still be trapped underneath debris and rubble.

It will take weeks, if not months, before the damage done to Nepal becomes fully apparent, in terms of life and limb but also economically and spiritually.

The world’s poorest regions are often hardest hit because resilient structures that can withstand quakes of this magnitude are expensive.

Governments look to science to provide better earthquake warnings, but progress in geophysical modeling is hampered by the lack of high-quality data.

In this context, pushing the limits of remote sensing with new technologies such as Quantum Gravimeters becomes a matter of life and death, and it should make apparent that striving for ever more precise quantum clocks is anything but a vanity chase. After all, we are just now closing in on the level of accuracy needed to perform relativistic geodesy.

It goes without saying that the resource extraction industry will be among the first to profit from these new techniques.  While this industry has an image problem due to its less than stellar environmental track record, there’s no denying that anything that drives the rapid and ongoing productization of these technologies is a net positive if it makes them affordable and widely accessible to geophysicists who study the dynamics of active fault lines. Acquiring this kind of big data is our only chance of ever reaching a future in which our planet no longer shocks us with its deadly geological force.

Dumbing Down for Smartphones

Google changed its site ranking: if a site is not mobile friendly, it will now be heavily penalized. I was quite fond of my old design, but it failed the Google Mobile test miserably.  Hence a hasty redesign based on a newer WordPress theme was in order.

Goodbye my beloved theme, Google and that nasty smartphone killed you.

 

 

Je me souviens

Usually I don’t post anything political here.  This time I make an exception.  I hope it will remain the only one.

 

The Google-Martinis Chip Will Perform Quantum Annealing

Ever since the news broke that John M. Martinis will join Google to develop a chip based on the work performed at UCSB, speculation has abounded as to what kind of quantum architecture this chip will implement.  According to this report, it is now clear that it will be adiabatic quantum computing:

But examining the D-Wave results led to the Google partnership. D-Wave uses a process called quantum annealing. Annealing translates the problem into a set of peaks and valleys, and uses a property called quantum tunneling to drill through the hills to find the lowest valley. The approach limits the device to solving certain kinds of optimization problems rather than being a generalized computer, but it could also speed up progress toward a commercial machine. Martinis was intrigued by what might be possible if the group combined some of the annealing in the D-Wave machine with his own group’s advances in error correction and coherence time.
“There are some indications they’re not going to get a quantum speed up, and there are some indications they are. It’s still kind of an open question, but it’s definitely an interesting question,” Martinis said. “Looking at that, we decided it would be really interesting to start another hardware approach, looking at the quantum annealer but basing it on our fabrication technology, where we have qubits with very long memory times.”

This leads to the next question: will this Google chip indeed be restricted to implementing the Ising model, like D-Wave’s, or will it strive for more universal adiabatic quantum computation? The latter has been shown theoretically to be computationally equivalent to gate-based QC. It seems odd to aim for just a marginal improvement of the existing architecture, as this article implies.
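For readers unfamiliar with the “peaks and valleys” picture, here is a minimal classical simulated-annealing sketch on a tiny Ising instance, the kind of optimization problem an annealer targets. To be clear, this is only the classical analogue (thermal hill-climbing, no quantum tunneling), and every function name and parameter here is illustrative, not any vendor’s API:

```python
import math
import random

def ising_energy(spins, J, h):
    """Ising energy E = -sum_ij J[i][j]*s_i*s_j - sum_i h[i]*s_i (J upper-triangular)."""
    e = 0.0
    n = len(spins)
    for i in range(n):
        e -= h[i] * spins[i]
        for j in range(i + 1, n):
            e -= J[i][j] * spins[i] * spins[j]
    return e

def anneal(J, h, steps=5000, t_start=2.0, t_end=0.01, seed=42):
    """Simulated annealing: flip random spins, accept uphill moves with Boltzmann probability."""
    rng = random.Random(seed)
    n = len(h)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J, h)
    for step in range(steps):
        # geometric cooling schedule from t_start down to t_end
        t = t_start * (t_end / t_start) ** (step / steps)
        i = rng.randrange(n)
        spins[i] *= -1                      # trial flip
        new_energy = ising_energy(spins, J, h)
        delta = new_energy - energy
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            energy = new_energy             # accept the move
        else:
            spins[i] *= -1                  # reject: flip back
    return spins, energy

if __name__ == "__main__":
    # Ferromagnetic 4-spin chain: coupling 1 between neighbours, no external field.
    J = [[0, 1, 0, 0],
         [0, 0, 1, 0],
         [0, 0, 0, 1],
         [0, 0, 0, 0]]
    h = [0.0] * 4
    spins, energy = anneal(J, h)
    print(spins, energy)  # the chain's ground states are the two aligned configurations, energy -3
```

The pitch for quantum annealing is precisely that tunneling may escape the local minima that trap this kind of classical descent on harder, frustrated instances.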

At any rate, D-Wave may retain the lead in qubit numbers for the foreseeable future if it sticks with no, or less costly, error correction schemes (leaving it to the coders to create their own). It will be interesting to eventually compare which approach offers more practical benefits.

So You Want to Learn About Quantum Computing?

“Students will learn by inhabiting an alternate history where Alan Turing and Richard Feynman meet during World War II and must invent quantum computers to defeat Nazi Germany. As a final project, they will get to program a D-Wave One machine and interpret its results.”

If you are based in Seattle then you want to keep an eye out for when Paul Pham next teaches the Quantum Computing for Beginners course that follows the exciting narrative outlined above.

For everybody else, there is EdX‘s CS191x Quantum Mechanics and Quantum Computation course.  I very much hope this course will be a regular offering.  Although it lacks the unique dramatic arc of P. Pham’s story line, this course is nevertheless thoroughly enjoyable.

When I signed up for this course, I didn’t know what to expect.  Mostly, I decided to check it out because I was curious to see how the subject would be taught, and because I wanted to experience how well a web-based platform could support academic teaching.

This course fell during an extremely busy time, not only because of a large professional workload, but also because the deteriorating health of my father required me to fly twice from Toronto to my parents in Germany.  Despite this, the time required for this course proved to be entirely manageable.  If you have an advanced degree in math, physics or engineering, and want to learn about Quantum Computing, you shouldn’t shy away from taking this course as long as you have an hour to spare each week.  It helps that you can accelerate the video lectures to 1.5× normal speed (although this made Prof. Umesh Vazirani sound a bit like he had inhaled helium).

Prof. Vazirani is a very competent presenter, and you can tell that a lot of thought went into how to approach the subject, i.e. how to ease those who are new to it into the strangeness of Quantum Mechanics. I was suspicious of the claim made at the outset that the required mathematics would be introduced and developed as needed during the course, but it seems to me that this was executed quite well. (Having already been familiar with the required math, I don’t really know if it’ll work for somebody completely new to it, but it seems to me that the only prerequisite is indeed a familiarity with linear algebra.)

It is interesting to see discussions posted by individuals who took the course and were apparently subjected to QM for the first time.  One such thread started this way:

“I got 100. It was really a fun. Did I understand anything? I would say I understood nothing.”

To me this illuminates the fact that you simply cannot avoid the discussion of the interpretation of quantum mechanics.  Obviously this subject is still very contentious, and Prof. Vazirani touched on it when discussing the Bell inequalities in a very concise and easy to understand manner.  Yet, I think judging from the confusion of these ‘straight A’ students there needs to be more of it.  It is not enough to assert that Einstein probably would have reconsidered his stance if he knew about these results.  Yes, he would have given up on a conventional local hidden variable approach, but I am quite certain his preference would have then shifted to finding a topological non-local field theory.
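As a quick illustration of the Bell-inequality material the course covers: in the CHSH form, any local hidden variable theory is bounded by |S| ≤ 2, whereas quantum mechanics on a singlet state reaches 2√2. A minimal sketch, assuming the standard singlet correlation E(a,b) = −cos(a−b) and the textbook-optimal angle choices (names here are my own, not the course’s):

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements at angles a, b on a singlet pair."""
    return -math.cos(a - b)

# CHSH combination: S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)

print(abs(S))  # 2*sqrt(2) ≈ 2.828, exceeding the classical bound of 2
```

No local hidden variable assignment of ±1 outcomes can reproduce this value, which is exactly the point the lectures make with far more care.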

Of course, there is only so much that can be covered in the course’s duration. Other aspects were missing: Quantum Error Correction, Topological and Adiabatic Quantum Computing, and especially Quantum Annealing.  The latter was probably the most glaring omission, since it is the only technology in this space that is already commercially available.

Generally, I found that everything that was covered was covered very well.  For instance, if you ever wondered how exactly Grover’s and Shor’s algorithms work, you will have learned it after taking this course. I especially found the homework assignments to be wonderful brain teasers that helped me take my mind off more worrisome issues at hand.  I think I will miss them. They comprised well-thought-out exercises, and as with any science course, it is really the exercises that help you understand and learn the material.

On the other hand, the assignments and exams also highlighted the strengths and weaknesses of the technology underlying the courseware.  Generally, entering formulas worked fine, but sometimes the solver acted up and it wasn’t always entirely clear why (e.g. how many digits were required when giving a numerical answer, or certain algebraically equivalent terms were not recognized properly).  While this presented the occasional obstacle, on the upside you get the immediate gratification of instant feedback and very nice progress tracking that allows you to see exactly how you are doing. The following is a screenshot of my final tally. The final fell during a week in which I was especially hard pressed for time, so I slacked off, just guesstimating the last couple of answers (with mixed results).  In comparison to a conventional class, knowing exactly when you have achieved a passing score via the tracking graph makes this a risk- and stress-free strategy.

A common criticism of online learning in comparison to the established ways of doing things is the missing classroom experience and interaction with the professor and teaching staff.  To counter this, discussion boards were linked to all assignments, and discussion of the taught material was encouraged.  Unfortunately, since my time was at a premium, I couldn’t participate as much as I would have liked, but I was positively surprised by how responsively the teaching assistants answered questions put to them (even over the weekends).

This is all the more impressive given the numbers of students that were enrolled in this course:

The geographic reach was no less impressive:

Having been sceptical going into this, I’ve since become a convert.  Just as Khan Academy is revolutionizing K-12 education, EdX and similar platforms like Coursera represent the future of academic teaching.

 

My Fringe Science Problem

Updated below.

Cold Fusion. It should have been that simple.

It is past time to come clean. I have an addiction problem. It is said that the best way to confront this is to admit it: No matter how hard I try, I cannot stop paying attention to fringe science.

Of course I could blame early childhood experiences. Back when I was a teen, about three decades ago, the free energy crowd was already quite active; “Tachyon energy” was then their favorite pseudoscience to justify how their fantastic contraptions could work. One of the free energy bulletins that I read credited an electrical engineer who just happened to be a very distant relative of mine, so I managed to get in touch with him. He was indeed engaging in some far-fetched research, but it had nothing to do with free energy, and he had never heard of the people who misrepresented him (at the time, he was researching whether common radar microwave radiation played any role in the forest die-off that was widely attributed to acid rain, and to that end had built some strange test machinery).

This is a pattern that I’ve seen repeated many times since then. The approach generally seems to follow these steps:

  1. Find some far-fetched fringe research.
  2. Claim some tremendous break-through and purport that just a little bit of additional research will result in a fantastic product.
  3. Based on this, collect investment money and then retire early after the R&D effort unfortunately doesn’t work out.

The latest fringe science to receive this treatment is the textbook example of pathological science: cold fusion. It has since been rebranded LENR, for Low Energy Nuclear Reaction (it also goes by some other acronyms, but this seems to be the most common one).

One story that fits this pattern perfectly is that of the marvellous E-Cat (short for Energy Catalyzer). It sprang onto the scene about two years ago with a tantalizing, though not independently supervised, demonstration that was convincing enough to bamboozle two Swedish physics profs into proclaiming that no chemical energy source could have the energy density necessary to produce the observed heat (conversion of water to steam). Over time this generated some mainstream news stories, and a bunch of blogs and forums sprang up to follow the story. One such blog followed an interesting trajectory: Paul Story, the maintainer of ecatnews.com, started out quite optimistic about the device, and even banned some critical voices from the comment section of his blog. But then he was approached by the Australian billionaire Dick Smith, who offered a prize of $1 million to anyone who could prove a usable 1 kW LENR device. Nobody came forward to claim the money, although several commercial entities claimed to have just such prototypes. This changed the tone at ecatnews.com and made it one of the few places where adherents and sceptics of this field could discuss (sometimes raucously) without fear of being censored.

But Paul closed shop after he came to the conclusion that the E-Cat is just a scam. And this is where my addiction problem comes in: his blog was where I got my daily LENR dose, and the other blogs that still cover this story are far too closed and uncritical to be an adequate substitute. So in order to manage my addiction I have created a sub-blog, called Wavewatching Fringe Edition. This new addition is by no means meant to take the focus away from this main blog, but rather to help manage my fringe science problem, and possibly to serve as a resource that warns people to double-check before investing in fringe science projects.

Be warned though: fringe science is addictive, offering stories taller and more imaginative than any soap opera. If you want to stay clean, steer clear of the fringe.

Update

After losing a FB “Like” I feel I should clarify what I classify as “fringe science”.  To have an objective criterion, I lump into this category everything that doesn’t flow from efforts published in reputable peer-reviewed journals (creating new journals in order to get published doesn’t qualify). Since everything performed by humans is far from infallible, peer review can miss interesting things, but the signal-to-noise ratio in the fringe category will be much lower.

As with my “Lost Papers” section, I will try to focus on aspects that maybe shouldn’t be overlooked. But there is also the additional aspect that I focused on above: old Hilbert papers make a very bad basis for soliciting investment funds; on the other hand, many of the hotter fringe science topics virtually spawn their own industries (which usually go nowhere).  If somebody researches these topics because they’ve been approached for investment funds, then I hope the fringe section will paint a critical and realistic picture.

Of course it would be great if something as controversial as LENR could get to the point where repeatable, scalable experiments with a proper theoretical underpinning bring it back to physics’ forefront.  Some LENR proponents feverishly believe that this is already the case.  Obviously I disagree, but I am not entirely ruling out that it could happen.

 

 

The Wave Particle Duality – A Deadly Divide

A particle and its associated wave function.

The curious fact that matter can exhibit wave-like properties (or should it rather be waves acting like particles?) is now referred to as the wave particle duality.  In old times it was often believed that there was some magic in giving something a name, and that naming it would take away some of the christened’s power. Here’s hoping that there may be some truth to this, as this obvious incompatibility has claimed at least one prominent life.

It was Einstein who first made this two-faced character of matter explicit when publishing on the photoelectric effect, assigning particle-like characteristics to light, which up to this point had been firmly understood to be an electromagnetic wave phenomenon.

But just like the question of the true nature of reality, the source of this dichotomy is almost as old as science itself, and arguably already inherent in the very idea of atomism as the opposite extreme of an all-encompassing holism. The latter is often regarded as the philosophical consequence of Schroedinger’s wave mechanics, since a wave phenomenon has no sharp and clear boundaries, and in this sense is often presented as connecting the entirety of the material world. Taken to the extreme, this holistic view finds its perfect expression in Everett’s universal wavefunction (an interpretation happily embraced by Quantum Hippies of all ages), which gave rise to the now quite popular many worlds interpretation of quantum mechanics.

While atomism proved to be extremely fruitful in the development of physics, it was never popular with religious authorities.  You can find echoes of this to this day if you look up this term at the Catholic Encyclopaedia:

Scholastic philosophy finds nothing in the scientific theory of atomism which it cannot harmonize with its principles, though it must reject the mechanical explanation, often proposed in the name of science, …

Or at this site of religious physicists:

Atomism is incompatible with Judeo-Christian principles because atomism views matter as independent of God, …

Religion of course really doesn’t have a choice in the matter, as it can hardly maintain doctrine without some holistic principle.  It is no coincidence that physics only progressed after the cultural revolution of the Renaissance loosened the church’s dominance over the sheeple’s minds. But history never moves in a straight line.  With Romanticism, for instance, the pendulum swung back with a vengeance. It was at the height of this period that Ludwig Boltzmann achieved the greatest scientific breakthrough of atomism, developing statistical mechanics as the proper foundation of thermodynamics. It was not received well. With James Clerk Maxwell having seemingly established a holistic ether that explained all radiation as a wave phenomenon, atomism had thoroughly fallen out of favour.  Boltzmann vigorously defended his work and was no stranger to polemic exchanges to make his point, yet he was beset by clinical depression and feared in the end that his life’s work was for naught. He committed suicide while on a summer retreat that was supposed to help his ailing health.

He must have missed the significance of Einstein’s publication on Brownian motion just a year earlier.  It is the least famous of his Annus Mirabilis papers, but it laid the foundation for experimentalists to settle the debate once and for all in Boltzmann’s favor, just a few years after his tragic death.

Thermodynamics made no sense to me before I learned statistical mechanics, and it is befitting that his most elegant equation for the entropy of a system graces the memorial at his grave site (the k denoting the Boltzmann constant).
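For reference, the inscription on the memorial is the Boltzmann entropy formula, relating the entropy S of a macrostate to the number W of microstates compatible with it:

```latex
S = k \log W
```

A system with more accessible microstates has higher entropy; this single line is what finally made thermodynamics click for me.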

A physicist can't ask for more to be remembered by than his most fundamental equation.
Ludwig Boltzmann Tombstone in Vienna.

Quantum Computing Hype Cycle and a Bet of my Own

The year 2013 started turbulently for the quantum computing field, with a valiant effort by the long-time skeptic and distinguished physicist Michel I. Dyakonov to relegate it to the status of a pathological science akin to cold fusion (he does not use the term in his paper but later stated: “The terms ‘failed, pathological’ are not mine, but the general sense is correct.”).

Scott Aaronson took on this paper in his unique style (it’s a long read, but well worth it). There really isn’t much to add to his arguments, but there is another angle that intrigues my inner “armchair psychologist”:  what exactly is it about this field that so provokes some physicists?  Is it that …

  • … Computer Scientists of all people are committing Quantum Mechanics?
  • … these pesky IT nerds have the audacity to actually take the axioms of Quantum Mechanics so seriously as to regard them as a resource for computational engineering?
  • … this rabble band of academics is breeding papers at a rabbit’s pace, so that no one can possibly keep up and read them all?
  • … quantum information science turned the EPR paradox on its head and transformed it into something potentially very useful?
  • … this novel science sucks up all sorts of grant money?

The answer is probably all of the above, to some extent.  But this still doesn’t feel quite right.  It seems to me the animosity goes deeper.  Fortunately, Kingsley Jones (whom I greatly admire) blogged about similar sentiments, and he is much more clear-eyed about what is causing them.

It seems to me that the crux of this discomfort stems from the fact that many physicists have long harbored unease with Quantum Mechanics’ intractabilities, which were plastered over with the Copenhagen Interpretation (which caused all sorts of unintended side effects).  It’s really a misnomer; it should have been called the ostrich interpretation, as its mantra was to ignore the inconsistencies and just shut up and calculate. It is the distinct merit of Quantum Information science to have dragged this skeleton out of the closet and made it dance.

The quantum information scientists are agnostic on the various interpretations, and even joke about it.  Obviously, if you believe there is a truth to be found, there can be only one, but you first need to acknowledge the cognitive dissonance if there’s to be any chance of making progress on this front. (My favorite QM interpretation was suggested by Ulrich Mohrhoff, and I have yet to find the inspiration to blog about it in a manner that does it justice – ironically, where he thinks of it as an endpoint, I regard it as allowing for a fresh start.)

Meanwhile, in the here and now, the first commercial quantum computing device, the D‑Wave One, has to overcome its own challenges (or be relegated to a computing curiosity akin to analog neural VLSI).  2013 will be the year for it to prove its merits in comparison to conventional hardware. I’ve been in touch with a distinguished academic in the field (not Scott A.) who is convinced that optimization on a single conventional CPU will always outperform the D-Wave machines – even the next generation chip. So I proposed a bet, albeit not a monetary one: I will gladly ship a gallon of maple syrup to him if he is proven right and our dark horse Canadian trail blazer doesn’t make the finish line. The results should be unambiguous and will be based on published research, but just in case there should be any disagreement, we have settled on Scott Aaronson as a mutually acceptable arbiter.  Scott is blissfully unaware of this, but as he is also the betting kind (the really big ones), I hope he’d be so kind as to help us sort this out if need be. After all, I figure, he will be following the D-Wave performance tests and will already have formed an informed opinion on the matter by then.

The year 2013 started off with plenty of QIS drama and may very well turn out to be the crucial one for determining whether the field has crossed the Rubicon.   It’s going to be a fun one.