All posts by Henning Dekant

Analog VLSI for Neural Networks – A Cautionary Tale for Adiabatic Quantum Computing

Update: This research is now again generating some mainstream headlines. It will be interesting to see if this hybrid chip paradigm has more staying power than previous analog computing approaches.

###

Fifteen years ago I attempted to find an efficient randomized training algorithm for simple artificial neural networks suitable for implementation on a specialized hardware chip. The latter's design only allowed feed-forward connections, i.e. back-propagation on the chip was not an option. The idea was that, given the massive acceleration of the network's execution on the chip, some sort of random walk search might be at least as efficient as optimized backprop algorithms on general purpose computers.
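To make this concrete, here is a minimal sketch of what such a random walk search could look like on a toy feed-forward network. The network, the hill-climbing scheme, and all parameters are my illustrative choices for this post, not the actual chip interface from back then:

```python
import numpy as np

rng = np.random.default_rng(42)

def forward(weights, x):
    """Toy two-layer feed-forward network with tanh activations."""
    w1, w2 = weights
    return np.tanh(np.tanh(x @ w1) @ w2)

def loss(weights, x, y):
    return np.mean((forward(weights, x) - y) ** 2)

def random_walk_train(x, y, steps=10_000, sigma=0.05):
    """Hill-climb by random weight perturbation: keep a candidate only
    if it lowers the loss. Each step needs nothing but forward passes,
    the one operation a feed-forward-only chip accelerates."""
    weights = [rng.normal(0, 0.5, (x.shape[1], 8)),
               rng.normal(0, 0.5, (8, 1))]
    best = loss(weights, x, y)
    for _ in range(steps):
        candidate = [w + rng.normal(0, sigma, w.shape) for w in weights]
        c_loss = loss(candidate, x, y)
        if c_loss < best:
            weights, best = candidate, c_loss
    return weights, best

# Usage: learn XOR, the classic toy problem backprop is usually shown on.
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
_, final_loss = random_walk_train(x, y)
print(f"final loss: {final_loss:.4f}")
```

The appeal of the scheme is that every candidate evaluation consists of forward passes only, which is exactly what the specialized hardware was built to do fast.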

My research group followed a fairly conventional digital design, whereas at the time analog VLSI was all the rage, a field (like so many others) pioneered by Carver Mead. On the face of it this makes sense, given that biological neurons obviously work with analog signals, yet attain remarkable robustness (robustness being the typical problem with any sort of analog computing). Then again, it is this very robustness that makes the "infinite" precision, the primary benefit of analog computing, somewhat superfluous.

Looking back at this, I expected the analog VLSI approach to have been a bit of an engineering fad, as I wasn't aware of any commercial systems ever hitting the market. Of course, I could have easily missed a commercial offering if it followed a similar trajectory as the inspiring but ultimately ill-fated transputer. In the end the latter was just as much a fad as the Furby toy of yesteryear, yet arguably much more inspiring.

To my surprise and ultimate delight, a quick look at the publication count for analog neural VLSI proves me wrong, and there is still some interesting science happening:

Google Scholar Publication Count for Neural Analog VLSI

So why are there no widespread commercial neuromorphic products on the market? Where is the neural analog VLSI co-processor to make my laptop more empathic and fault tolerant? I think the answer comes down simply to Moore's law. A flagship neuromorphic chip currently designed at MIT boasts a measly 400 transistors. I don't want to dispute its scientific usefulness: having a detailed synapse model in silicon will certainly have its uses in medical research (and the Humane Society will surely approve if it cuts down on the demise of guinea pigs and other critters). On the other hand, the Blue Brain Project claims it has already successfully simulated an entire rat cortical column on its supercomputer, and its goal is nothing less than a complete simulation of the rodent's brain.

So what does this have to do with adiabatic quantum computing? Just like neuromorphic VLSI technology, its main competition for the foreseeable future is conventional hardware. This is the reason why I was badgering D-Wave when I thought the company didn't make enough marketing noise about the Ramsey number research performed with their machine. Analog neural VLSI technology may find a niche in medical applications, but so far there is no obvious market niche for adiabatic quantum computing. Scott Aaronson argued that the "coolness" of quantum computing will sell machines. While this label has some marketing value, not least due to some of his inspired stunts, this alone will not do. In the end, adiabatic quantum computing has to prove its mettle in raw computational performance per dollar spent.

(h/t to Thomas Edwards, who posted a comment a while back in the LinkedIn Quantum Information Science group that inspired this post)

Keep Walking – No Quantum Computing Jobs on Offer

UPDATE: Every so often I get a search engine click on this post, presumably from QIS job seekers. So despite the dreary title I can now give you a link to this job portal.

When promoting my last blog post in the LinkedIn "Quantum Computing and Quantum Information" group, a strange thing happened. Rather than staying in the main discussions section of that group, my item was moved to the jobs section. So let's be clear about this: I don't have jobs to offer.

My last entry at this LinkedIn group is now the only one amidst a sea of white, fertile, untouched HTML background in the jobs section. Quite a contrast to the High Performance Computing LinkedIn groups that have dozens of jobs posted.

This made me ponder the Quantum Information Science (QIS) job market. While I expect that in the not so distant future there will be plenty of highly qualified jobs to be had, we are certainly living in a very different reality as of now. There is a reason that the byline of this blog is "Observations on the nascent quantum computing industry".

With the exception of D-Wave the few other commercial players (i.e. the IBMs of the world) are still in basic research mode. Outside these companies, and a very small Quantum Key Distribution sector, there are essentially no QIS jobs in the private sector. It seems to me the job market still looks very much the same as described by Robert Tucci 1 ½ years ago.

Don’t peruse this blog because you are looking for a job in this area. But by all means please come here, and check back regularly, if you are intrinsically interested in Quantum Computing.

Update: Todd Brun was so kind to point out his QIS job site as a worthwhile destination for the weary job seeker who wandered over here looking for QC related work.

A Brief History of Quantum Computing

An attempted apotheosis.

In the beginning there was Feynman. And Feynman was with the scientific establishment. Feynman was a god. Or at least as close to it as a mortal physicist can ever be. He tamed the light with his QED (pesky renormalization notwithstanding) and gave the lesser folks the Feynman diagrams, so that they could trade pictograms like baseball cards and feel that they partook. But he saw that not all was good. So Feynman said: "Quantum mechanical systems cannot be efficiently simulated on conventional computing hardware." Of course he did that using quite a few more words and a bit more math, and published it in a first-class journal, as was befitting.

And then he was entirely ignored.

Feynman walked upon earth.

Four grueling years went by without follow-up publications on quantum computing. That is, until David Deutsch got in on the act and laid the foundation for the entire field with his seminal paper. His approach was motivated by how quantum physics might affect information processing, including the kind that happens in the human mind. So how to experiment on this? Obviously you cannot put a real brain into a controlled state of quantum superposition (i.e. a kind of "Schrödinger's brain"), but a computer, on the other hand, won't feel any pain.

Let's rather put a computer into Schrödinger's box.
Less messy.

So was Feynman finally vindicated? No, not really. Deutsch wrote:

Although [Feynman’s ‘universal quantum simulator’] can surely simulate any system with a finite-dimensional state space (I do not understand why Feynman doubts that it can simulate fermion systems), it is not a computing machine in the sense of this article.

The old god just couldn't get any respect anymore. His discontent with string theory didn't help, and in addition he experienced another inconvenience: he was dying. Way too young. His brilliant mind was extinguished at just 69. Like most people, he did not relish the experience; his last words were famously reported as: "I'd hate to die twice. It's so boring."

The only upside to his death was that he didn't have to suffer through Penrose's book "The Emperor's New Mind", which was released the following year. While it is a great read for a layperson who wants to learn about the thermodynamics of black holes as well as bathroom tile designs, none of the physics would have been news to Feynman. If he wasn't already dead, Feynman probably would have died of boredom before he made it to Penrose's anti-climactic last chapter. There, Penrose finally puts forward his main thesis, which can be simply distilled as "the human mind is just different". This insight comes after Penrose's short paragraph on quantum computing, where the author concludes that his mind is just too beautiful to be a quantum computer, although the latter can clearly solve some problems believed to be classically intractable, such as factoring, in polynomial time (quantum computers are not known to crack NP-complete problems, h/t Joshua Holden). Not good enough for him.

Physics, meanwhile, has been shown to be an NP-hard sport, but more important for the advancement of quantum computing was the cracking of another problem in NP: Shor's algorithm showed that a quantum computer could factorize large numbers in polynomial time. Now the latter might sound about as exciting as the recent Ramsey number discussion on this blog, but governments certainly paid attention to this news, as the strong encryption algorithms currently in wide use can be cracked if you accomplish this feat. Of course quantum information science offers a remedy for this as well, in the form of (already commercially available) quantum encryption, but I digress.
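The heavy lifting in Shor's algorithm is quantum period finding; the wrapper that turns a period into factors is plain classical number theory. Here is a minimal sketch of that classical reduction, with a brute-force order finder standing in for the quantum subroutine (which is the only part that needs a quantum computer for large numbers):

```python
from math import gcd

def order(a, n):
    """Multiplicative order of a mod n. For large n this is the step a
    quantum computer speeds up; here we brute-force it classically."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical_part(n, a):
    """Given a guess a, try to extract a factor of n from the order r
    of a mod n (works for roughly half of the random choices of a)."""
    if gcd(a, n) != 1:
        return gcd(a, n)          # lucky guess: already shares a factor
    r = order(a, n)
    if r % 2 == 1:
        return None               # odd order: pick another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None               # trivial square root: pick another a
    return gcd(y - 1, n)

print(shor_classical_part(15, 7))   # -> 3, since 7 has order 4 mod 15
```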

With the kind of attention that Shor's algorithm engendered, the field exploded, not least due to computer science taking an interest. In fact Michael Nielsen, one of the co-authors of the leading textbook on the subject, is a computer scientist by trade, and if you want a gentle introduction to the subject I can highly recommend his video lectures (I just wish he'd get around to finishing them).

Of course, if you were stuck in the wrong place at the time that all this exciting development took place, you would have never known. My theoretical physics professors at the University of Bayreuth lived and died by the Copenhagen omertà, which in their minds decreed that you could not possibly get any useful information out of the phase factor of a wave function. I was taught this as an undergraduate physics student in the early nineties (in fairness, probably before Shor's algorithm was published, but long after Deutsch's paper). This nicely illustrates the inertia in the system and how long it takes before new scientific insights become widespread.

The fresh and unencumbered viewpoint that the computer scientists brought to quantum mechanics is more than welcome and has already helped immensely in softening up the long-established dogma of the Copenhagen interpretation. Nowadays Shor can easily quip (as he did on Scott Aaronson's blog):

Interpretations of quantum mechanics, unlike Gods, are not jealous, and thus it is safe to believe in more than one at the same time. So if the many-worlds interpretation makes it easier to think about the research you’re doing in April, and the Copenhagen interpretation makes it easier to think about the research you’re doing in June, the Copenhagen interpretation is not going to smite you for praying to the many-worlds interpretation. At least I hope it won’t, because otherwise I’m in big trouble.

I think Feynman would approve.

Information is Physical

Even when the headlines are not gut-wrenching

Information processing is seldom that physical.

One of the most astounding theoretical predictions of the late 20th century was Landauer's discovery that erasing memory is linked to entropy, i.e. heat is produced whenever a bit is fully and irrevocably erased. As far as theoretical work goes, this is even somewhat intuitively understandable: after all, increasing entropy essentially means moving to a less ordered state (technically, a macro-state that corresponds to more micro-states and is hence less special). And what could possibly be more ordered than a computer memory register?
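Plugging numbers into Landauer's bound, E = k_B T ln 2 per erased bit, shows just how tiny it is. A quick back-of-the-envelope computation:

```python
from math import log

k_B = 1.380649e-23          # Boltzmann constant, J/K

def landauer_limit(temperature_kelvin):
    """Minimum heat dissipated by irreversibly erasing one bit."""
    return k_B * temperature_kelvin * log(2)

print(f"{landauer_limit(300):.3e} J per bit")   # ~2.87e-21 J at room temp
```

Real logic gates dissipate many orders of magnitude more than this per bit operation, which is why the bound remained a purely theoretical curiosity for so long.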

Recently this prediction has been confirmed by a very clever experiment.  Reason enough to celebrate this with another “blog memory-hole rescue”:

If you ever wondered what the term "adiabatic" in conjunction with quantum computing means, Perry Hooker provides the answer in this succinct explanation. His logic gate discussion shows why Landauer's principle has implications far beyond memory chips and, in a sense, undermines the entire foundation of classical information processing.

Truly required reading if you want to appreciate why quantum computing matters.

Quantum Computing for the Rest of Us? – UPDATED

<Go to update>

Everybody who is following the quantum computing story will have heard by now about IBM's new chip. This certainly gives credence to the assumption that superconducting Josephson junction-based technology will win the race. This may seem like bad news for everybody who was hoping to be able to tuck away a quantum computer under his or her desk some day.

Commercial liquid helium cooling has come a long way, but shrinking it in price and size to fit into a PC tower form factor is a long way off. Compare and contrast this to liquid nitrogen cooling: with a high temperature superconductor and some liquid nitrogen poured from a flask, any high school lab can demonstrate this neat Meissner effect:

Cooled with liquid nitrogen (a perfectly harmless fluid unless you get it on your clothes)

Good luck trying this with liquid helium (for one thing it may try to climb up the walls of your container – but that’s a different story).

Nitrogen cooling, on the other hand, is quite affordable (cheap enough to make the extreme CPU overclocking scene quite fond of it).

So why then are IBM and D-Wave making their chips from the classic low temperature superconductor niobium? In part the answer is simple, pragmatic engineering: this metal allows you to adopt fabrication methods developed for silicon.

The more fundamental reason for the low temperature is to prevent decoherence or, to put it positively, to preserve pure entangled qubit states. The latter is especially critical for the IBM chip.

The interesting aspect of D-Wave's more natural, but less universal, adiabatic quantum computing approach is that dedicated control of the entanglement of qubits is not necessary.

It would be quite instructive to know how the D-Wave chip performs as a function of temperature below Tc (~9 K). If the performance doesn't degrade too much then maybe, just maybe, high temperature superconductors are suitable for this design as well. After all, it has been shown that they can be used to realize Josephson junctions of high enough quality for SQUIDs. On the other hand, the new class of iron-based superconductors shows promise for easier manufacturing, but has a dramatically different energy band structure, so that at this point all bets seem to be off on this new material.

So, not all is bleak (even without invoking the vision of topological QC, which deserves its own post). There is a sliver of hope for us quantum computing aficionados that even the superconducting approach might lead to an affordable machine one of these days.

If anybody with a more solid solid-state physics background than mine wants to either feed the flames of this hope or dash it, please make your mark in the comment section.

(h/t to Noel V. from the LinkedIn Quantum Computing Technology group and to Mahdi Rezaei for getting me thinking about this.)

Update:

Elena Tolkacheva from D-Wave was so kind to alert me to the fact that my key question has been answered in a paper that the company published in the summer of 2010. It contains graphs that illustrate how the chip performs at three different temperatures: (a) T = 20 mK, (b) T = 35 mK, and (c) T = 50 mK. Note: these temperatures are far below niobium's critical temperature of 9.2 K.

The probability that the system finds the ground state (i.e. the "correct" answer) clearly degrades with higher temperature, though interestingly not quite as badly as simulated. This puts a severe damper on my earlier expressed hope that this architecture might be suitable for high temperature superconductors (HTS), as this degradation sets in so far below Tc.
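A back-of-the-envelope way to see why the operating temperature is so punishing: the equilibrium fraction of qubits thermally kicked out of the ground state scales with the Boltzmann factor exp(-ΔE/k_B T). The 5 GHz gap below is purely an illustrative figure of my choosing, not a D-Wave spec:

```python
from math import exp

k_B = 1.380649e-23   # Boltzmann constant, J/K
h   = 6.62607015e-34 # Planck constant, J*s

def excited_fraction(gap_ghz, temp_k):
    """Equilibrium occupation of the excited level of a two-level
    system with energy gap h*f, relative to the ground state."""
    gap_j = h * gap_ghz * 1e9
    return exp(-gap_j / (k_B * temp_k))

for t in (0.020, 0.050, 77.0):   # D-Wave operating points vs. liquid nitrogen
    print(f"T = {t:7.3f} K -> relative excited occupation {excited_fraction(5.0, t):.3g}")
```

At 20 mK a 5 GHz gap suppresses thermal excitation to the part-per-million level, whereas at liquid nitrogen's 77 K the two levels are almost equally populated.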

Kelly Loum, a member of the LinkedIn Quantum Physics group, helpfully pointed out that,

… you could gather results from many identical high temperature processors that were given the same input, and use a bit of statistical analysis and forward error correction to find the most likely output eigenstate of those that remained coherent long enough to complete the task.

… one problem here is that modern (classical, not quantum) FECs can fully correct errors only when the raw data has about a 1/50 bit error rate or better. So you’d need a LOT of processors (or runs) to integrate-out the noise down to a 1/50 BER, and then the statistical analysis on the classical side would be correspondingly massive. So my guess at the moment is you’d need liquid helium.
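To get a rough feel for Kelly's point, here is a toy calculation, simplified down to a single output bit decided by majority vote (the 55% per-run success rate is a hypothetical number, not a measurement):

```python
from math import comb

def majority_success(p_single, runs):
    """Probability that more than half of `runs` independent trials
    succeed, given per-trial success probability p_single."""
    return sum(comb(runs, k) * p_single**k * (1 - p_single)**(runs - k)
               for k in range(runs // 2 + 1, runs + 1))

# If a hot, decoherence-prone processor only returns the right answer
# 55% of the time, how many repetitions does majority voting need?
for n in (11, 101, 1001):
    print(f"{n:5d} runs -> {majority_success(0.55, n):.4f}")
```

The required run counts quickly climb into the hundreds or thousands as the per-run reliability drops, which illustrates why the classical post-processing side becomes so massive.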

The paper also discusses some sampling approaches along those lines, but clearly there is a limit to how far they can be taken. As much as I hate to admit it, I concur with Kelly's conclusion. Unless there is some unknown mechanism that makes the decoherence dynamics of HTS fundamentally different, the D-Wave design does not seem a likely candidate for the nitrogen cooling temperature regime. On the other hand, drastically changing the decoherence dynamics is the forte of topological computing, but that is a different story for a later post.

The Rise of the Quantum Hippies

… and why I blame Niels Bohr.

A satirical hyperbolic polemic

Recently there was a bit of a tempest in a teapot in the LinkedIn quantum physics group, because it is very much over-run by members whom I dub "Quantum Hippies", i.e. the kind of people who think they've read a quantum mechanics book after putting down Capra's The Tao of Physics. You have probably encountered the type.

So this raises the question: where did they spring from?

It certainly didn’t start with Capra, he was just a catalyst.

I blame this guy:

Niels Bohr stands accused.

If it wasn’t for him, and his side-kick Heisenberg, Bohr’s Copenhagen Interpretation would have never become the kind of dogma that it did.  We are still suffering the consequences.

Science is a competitive sport, even more so in the olden days when the myth of the lone genius reigned supreme. Most of the founding fathers of quantum mechanics lacked many things, but not ego. Much has been written about the struggle between Bohr and Einstein. The latter of course never stood a chance, as he was far removed from the development of the new theory. It didn't help that he was old at the time and easily painted as a relic. Other challengers to the Copenhagen Interpretation were dealt with in various ways:

  • It was helpful that David Bohm could be vilified as a communist and nobody listened to de Broglie anyway.
  • Schrödinger mocked the new dogma with his famous cat in a box thought experiment but did not have it in him to put up a real fight.
  • Max Planck fell into the same geezer category as Einstein, but was even easier to dismiss due to his far less prominent name recognition.
  • Karl Popper was “just a philosopher”.
  • Others like Landé weren’t much of a challenge, falling into the “Landé who?” category.

Hence the Copenhagen Interpretation reigned supreme, and much energy was now devoted to keeping its dirty little secret tucked away, in the closet, under the stairs, with the key thrown away.

Maybe some of the energy directed at defending it against the other interpretations was motivated by the thought that it would be easier to keep this problematic aspect of the new theory under wraps. For whatever reason, Bohr and Heisenberg gave birth to a new physics omertà, the "shut up and calculate" doctrine. This would have far-reaching consequences, way beyond the realm of physics.

The raison d’être of the hippie revolution was to challenge authority (that arguably was asking for it).

What a delightful gift Bohr had prepared for a counter-culture movement that was already high on half-understood, Asian-influenced mysticism and other more regulated substances. And so the Copenhagen Interpretation's dirty little secret was dragged out of the closet and subsequently prostituted. I am of course referring to the fact that the wave collapse, as originally conceived by Heisenberg, requires an observer, an observing mind. This was subsequently bastardized into the idea that "the conscious mind creates reality". Just as Einstein's Special and General Relativity entered popular culture as the spectacularly wrong premise that "everything is relative", carte blanche for magical thinking was brought to you courtesy of some of the greatest minds of the 20th century. A more spectacular blow-back is hard to imagine.

This was super-charged by Bell's theorem, which confirmed quantum mechanics' essential non-locality. This in turn was translated into the mystical certainty that "everything is instantaneously connected all the time". And so to this day you get spectacularly wrong pop science articles like this one. It completely ignores that these days entangled qubits (the essential ingredient in the thought experiment on which this article hinges) are very well understood as a quantum information resource, and that they cannot facilitate an instantaneous connection between distant events. The term "instantaneous" does not even have an absolute meaning once Special Relativity is taken into account. This is especially egregious considering that the article was published in the American Association of Physics Teachers' journal.
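The "no instantaneous connection" part is not a matter of interpretation; it is easy to check numerically. Whatever is measured on one half of a Bell pair, the locally observable statistics of the other half do not budge. A small NumPy sketch of this no-communication fact (my own illustration, not taken from the article in question):

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>)/sqrt(2); qubit A is "here",
# qubit B is arbitrarily far away.
phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)
rho = np.outer(phi_plus, phi_plus.conj())

def reduced_rho_A(rho_AB):
    """Partial trace over qubit B: everything A can locally measure."""
    r = rho_AB.reshape(2, 2, 2, 2)
    return np.trace(r, axis1=1, axis2=3)

def measure_B(rho_AB, angle):
    """State after B measures along an axis rotated by `angle` without
    communicating the outcome, i.e. a non-selective measurement."""
    c, s = np.cos(angle), np.sin(angle)
    p0 = np.array([[c * c, c * s], [c * s, s * s]])   # projector
    p1 = np.eye(2) - p0
    out = np.zeros_like(rho_AB)
    for p in (p0, p1):
        pb = np.kron(np.eye(2), p)
        out += pb @ rho_AB @ pb
    return out

# Whatever B does - measure at 0, at 45 degrees, or nothing at all -
# A's reduced state stays the maximally mixed I/2: no signal.
print(reduced_rho_A(rho))
print(reduced_rho_A(measure_B(rho, 0.0)))
print(reduced_rho_A(measure_B(rho, np.pi / 4)))
```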

Although it is well established that the American public education system ranks near the bottom of the developed world, I still would have expected better.

The Flower Power movement has generally been associated with the left of the political spectrum, but it is in the nature of such powerful memes to eventually permeate all mainstream thinking. Hence American journalists subscribe to a "he said, she said" school of journalistic "objectivity" (after all, everything's relative), and political operatives of all colors feel fully justified in subscribing to a "Triumph of the Will" line of thinking.

When Ron Suskind interviewed inside staff from the Bush Jr. administration and questioned how they thought they could just remake Iraq with the limited resources committed, one staffer famously answered: "… when we act, we create our own reality".

Yes, I blame Niels Bohr for that too.

Rescued From the Blogroll Memory Hole

During the week my professional life leaves me no time to create original content.  Yet, there is a lot of excellent material out there pertinent to the nascent quantum information industry. So to fill the inter-week void I think it is very worthwhile to try to rescue recent blogroll posts from obscurity.

Very relevant to the surprise that Scott Aaronson came around on D-Wave is Robert Tucci's great technical review of D-Wave's recent Nature paper. If you are not afraid of some math and are tired of the vacuous verbiage that passes for popular science journalism, then this is for you.

Scott Aaronson Visits D-Wave – No Casualties Reported

Those following the quantum computing story and D-Wave are well aware of the controversial scuffle between Scott and the company. So it's quite newsworthy that, by Scott's own account, the hatchet has been buried. No more comparing the D-Wave One to a roast beef sandwich (hopefully the BLT is out of the picture too).

Scott is still taking on D-Wave's pointy-haired bosses, though. He wants them to open the purse strings to determine more clearly how "quantum" the Rainier chip really is.

Unfortunately, I think he overlooks that pointy-haired bosses are only interested in the bottom line. At this point there is nothing stopping D-Wave from selling their system as a quantum computer. Not a universal one, but who's counting. Any closer inquiry into the nature of their qubits only carries the danger of adding qualifiers to this claim. So why should they bother?

In terms of marketing, the label "quantum computer" is just a nifty term signifying to potential customers that this is a shiny, cool new device. Something different. It is supposed to serve as a door opener for sales. Afterwards it just comes down to the price/performance ratio.

At this point, D-Wave One sales won't benefit from further clarifying how much entanglement is occurring in their system. I only see this changing once there is actual competition in this space.

Update: I finally don't have to develop a cognitive dissonance about adding both Scott's and D-Wave's blogs to my blogroll. So to celebrate this historic peace accord, this is my next step in the development of this most important website. BTW, if you still need an argument for why it is worthwhile to pay attention to Scott, treat yourself to his TEDx talk (h/t to Perry Hooker).

Where Buzzwords Go to Die

It is a pretty sure sign that a buzzword is near the end of its life cycle when the academic world uses it for promotional purposes. Ever more scientific research comes with its own version of marketing hype. What makes this such a sad affair is that it is usually done pretty badly.

So why is spouting that quantum computing makes for perfect cloud computing really, really bad marketing?

"Cloud computing" is the latest buzzword iteration of "computing as a service", and as far as buzzwords go it has served its purpose well. It is still in wide circulation, but the time is nigh that it will be put out to pasture and replaced with something that sounds more shiny while signifying the very same thing.

Quantum computing, on the other hand, is not a buzzword. It is a revolution in the making. To hitch it to the transitory cloud computing term is bad marketing in its own right, but the way it is done in this case is even more damaging. There is already one class of quantum information devices commercially available: Quantum Key Distribution systems. They are almost tailor-made to secure current cloud infrastructures and alleviate the security concerns that are holding this business model back (especially in Europe).

But you'd never know it from reading the sorry news stories about the (otherwise quite remarkable) experiment to demonstrate blind quantum computing. To the contrary, an uninformed reader will come away with the impression that you won't have acceptable privacy in the cloud unless full-scale quantum computing becomes a reality.

Compare and contrast this to this exquisite quantum computing marketing stunt. While the latter brings attention and confidence to the field at zero cost, this bought and paid for marketing couldn't be further off the mark. It is almost as if it were designed to hold the entire industry back. Simply pitiful.

Quantum Computing Micro Poll

Although the sample is extremely small, I think the results from this poll may still be instructive, because I only advertised it within the LinkedIn Quantum Information Science group. I feel reasonably confident that the two dozen individuals, out of the roughly 1000 members of this group, who bothered to vote are pretty well informed on the subject matter. The results indicate that the race is still wide open when asked which technology will first allow for more than 100 quantum gates:

Click on the image to go to the live poll

Another fun fact (although not statistically significant) is the comparison of the average age of the voter demographics: the classic approach to realizing qubits (trapped ions) also has the highest average voter age at 37.5 years, while the youngest average age, 29 years, is recorded for the photonic approach.

Unfortunately LinkedIn polls only allow for five choices, so I had to pick what I think are the front-runners. I would love to learn which QC realizations the three votes for "something else" are referring to.