Category Archives: Quantum Computing

Where Buzzwords Go to Die

It is a pretty sure sign that a buzzword is nearing the end of its life cycle when the academic world uses it for promotional purposes. Ever more scientific research comes with its own version of marketing hype. What makes this such a sad affair is that it is usually done pretty badly.

So why is spouting that quantum computing makes for perfect cloud computing really, really bad marketing?

“Cloud computing” is the latest buzzword iteration of “computing as a service”, and as far as buzzwords go it has served its purpose well. It is still in wide circulation, but the time is nigh for it to be put out to pasture and replaced with something that sounds shinier – while signifying the very same thing.

Quantum computing, on the other hand, is not a buzzword. It is a revolution in the making. To hitch it to the transitory term “cloud computing” is bad marketing in its own right, but the way it is done in this case is even more damaging. There is already one class of quantum information devices commercially available: Quantum Key Distribution systems. They are almost tailor-made to secure current cloud infrastructures and to alleviate the security concerns that are holding this business model back (especially in Europe).

But you’d never know it from reading the sorry news stories about the (otherwise quite remarkable) experiment that demonstrated blind quantum computing. To the contrary, an uninformed reader will come away with the impression that you won’t have acceptable privacy in the cloud unless full-scale quantum computing becomes a reality.

Compare and contrast this to this exquisite quantum computing marketing stunt. While the latter brings attention and confidence to the field at zero cost, this bought-and-paid-for marketing couldn’t be further off the mark. It is almost as if it were designed to hold the entire industry back. Simply pitiful.

Quantum Computing Micro Poll

Although the data basis is extremely small, I think the results from this poll may still be instructive, because I only advertised it within the LinkedIn Quantum Information Science Group. I feel reasonably confident that the two dozen individuals, out of the roughly 1000 members of this group, who bothered to vote are pretty well informed on the subject matter. The results indicate that the race is still wide open when it comes to which technology will first allow for more than 100 quantum gates:

Click on the image to go to the live poll

Another fun little fact (although not statistically significant) is the comparison of the average ages of the voter demographics: the classic route to quantum computing (trapped ions) also has the highest average voter age, at 37.5 years, while the youngest average age, 29 years, is recorded for the photonic approach.

Unfortunately, LinkedIn polls only allow for five choices, so I had to pick what I think are the front-runners. I would love to learn which QC realizations the three votes for “something else” are referring to.

Quantum Computing Bounty

If you can hunt down this cat and kill quantum computing for good, then there will be a mighty big reward waiting for you. Scott Aaronson has put up a bounty of $100,000. All you have to do is prove that universal quantum computing is impossible in the real world.

Cartoon drawing by Becky Thorn

On the surface there are a couple of surprises here: Scott doesn’t actually hate quantum computing – he is basing his academic career on it. And herein lies the rub for the even bigger surprise: this academic really knows how to pull off one heck of a marketing stunt. His blog was flooded after Slashdot reported on this, and more media are now jumping on the bandwagon. That is an awful lot of free publicity for a marketing budget of exactly zero dimes (and this cost includes the net present value of the bounty money).

Hats off!

Dust to dust – Science for Science

No, this is not an obituary for D-Wave.

But the reporting of the latest news connected to D-Wave just doesn’t sit well with me.

Ever tried to strike up a conversation about Ramsey numbers around the water cooler, or just before a business meeting started? No? I wouldn’t think so.

I don’t mean to denigrate the scientific feat of calculating Ramsey numbers on D-Wave’s machine, but the way this news is reported is entirely science for science’s sake.

It puts D-Wave squarely into the ghetto of specialized scientific computation. Although I am convinced that quantum computing will be fantastic for science – and having a physics background I am quite excited about this – I nevertheless strongly believe that this is not a big enough market for D-Wave.

It is one thing to point to the calculation of numbers that fewer than one in ten CIOs will ever have heard of. It is another matter entirely not to milk this achievement for every drop of marketing value.

In all the news articles I perused, it is simply stated that calculating Ramsey numbers is notoriously difficult. What this exactly means is left to the reader’s imagination.

If your goal is to establish that you are making an entirely new type of supercomputer, then you need an actual comparison or benchmark. From Wikipedia we can learn the formula for how many graphs have to be searched to determine a Ramsey number.

For R(8,2), D-Wave’s machine required 270 milliseconds. This comes to more than 68,719 million search operations. On a conventional computer, one graph search takes multiple operations, depending on the size of the graph (the largest graphs have 8 nodes, requiring about 1277 operations). Assuming the graph complexity grows with O(2^n), I estimate about 800 operations on average.

Putting this together – assuming I calculated this correctly – the D-Wave machine performs at the equivalent of about 55 million MIPS. For comparison: this is more than what a cluster of 300 Intel i7 hex-core CPUs could deliver.
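For transparency, here is the same back-of-envelope arithmetic as a short Python sketch. Every input is a figure assumed in the text above (the ~68,719 million graph searches, the ~800 classical operations per search, the 270 ms runtime), not a measured benchmark:

```python
# Back-of-envelope check of the MIPS estimate above. All inputs are the
# post's assumptions, not measured benchmarks.

searches = 68_719_000_000   # ~68,719 million graph searches (post's figure)
ops_per_search = 800        # assumed average classical ops per graph search
runtime_s = 0.270           # quoted D-Wave runtime for R(8,2)

total_ops = searches * ops_per_search    # ~5.5e13 classical operations

# Reading the whole workload as one second's worth of instructions:
workload_mips = total_ops / 1e6 / 1e6    # in millions of MIPS
print(round(workload_mips))              # 55 (million MIPS)

# Dividing by the actual 270 ms runtime instead would roughly quadruple it:
peak_mips = total_ops / runtime_s / 1e6 / 1e6
print(round(peak_mips))                  # 204 (million MIPS)
```

So the ~55 million MIPS figure is the conservative end of the range; crediting the full 270 ms runtime pushes the equivalent rate to roughly 200 million MIPS.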

Certainly some serious computational clout. But why do I have to waste my spare time puzzling this out? At the time of writing, I cannot find a press release about this on the company’s web site. Why? This needs to be translated into something that your average CIO can comprehend, and then shouted from the rooftops.

D-Wave used to be good at performing marketing stunts and the company was harshly criticized for this from some academic quarters. Did these critics finally get under D-Wave’s skin?

…. I hope not.

Update: Courtesy of Geordie Rose from D-Wave (lifted from the comment section), here is a link to a very informative presentation on the Ramsey number paper. While you’re at it, you may also want to check out his talk. That one definitely makes for better water cooler conversation material – less steeped in technicalities, but with lots of apples and Netflix thrown in for good measure. Neat stuff.

Update 2: More videos from the same event now available on D-Wave’s blog.

The story patterns of your life – and what do they have to do with quantum computing?

Storytelling is an integral part of how we usually organize our world. How did you choose your line of work? How did you meet the love of your life – or if you haven’t yet, why not? There will be a story in the answers to these questions.

In our minds we constantly write the story arc of our lives. But there is a different view of the world that is no less valid. I once read the story of Goldilocks to a child who was diagnosed with a mild form of autism spectrum disorder. When asked to summarize the story, the child answered: “There was a pattern that repeated three times, then the bears came home.”

A brain like this is tuned to see patterns where other kids just perceive the story.

We humans like to share our stories and by doing this we make the patterns of our lives accessible.

A whole industry sprang from this, with its poster boy Facebook valued in the billions. The money, of course, is not in the stories but in the exposed patterns. Tantalizingly, an unparalleled depth of market research seems to be within reach. Corporations in the B2C space seem to be on the cusp of achieving a completely transparent view of their customers.

There are only two obstacles: customers’ concerns for privacy, and the sheer amount of data. Both bode well for quantum computing, and create opportunities for a customer-driven market that D-Wave can take advantage of.

Social networks currently thrive on sharing images. Let’s consider a concrete example: a report that breaks out, over time, how many teenagers are wearing Nike versus Adidas shoes in photos shared online – split by gender and ethnicity. This kind of data is worth a lot of money, but not enough to hire a small army of market analysts to compile it manually. On the other hand, fast, advanced image recognition is a good fit for D-Wave and one of their demo cases. If they are as good as they claim, this kind of customer intelligence will quickly amortize an investment in their hardware.
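To make the shape of such a report concrete, here is a minimal sketch of the aggregation step, assuming a hypothetical upstream image classifier has already emitted one record per detected shoe. The records and field names are invented for illustration; the hard, compute-hungry part is the recognition itself, which is exactly where the specialized hardware would come in, while the tally is trivial:

```python
from collections import Counter

# Invented output of a hypothetical image classifier: one record per
# shoe detected in shared photos. Real field names would differ.
detections = [
    {"month": "2012-01", "gender": "f", "ethnicity": "hispanic", "brand": "Nike"},
    {"month": "2012-01", "gender": "m", "ethnicity": "white",    "brand": "Adidas"},
    {"month": "2012-02", "gender": "m", "ethnicity": "white",    "brand": "Nike"},
]

# Brand counts broken out over time, by gender and ethnicity:
report = Counter(
    (d["month"], d["gender"], d["ethnicity"], d["brand"]) for d in detections
)
for key, count in sorted(report.items()):
    print(*key, count)
```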

Given the price tag and size of D-Wave’s machine, it is easy to jump to the conclusion that quantum computing is in its mainframe age, similar to the early days of information technology. So will quantum computing power be skewed to the 1% of big corporations? Not necessarily, because the analogy is flawed: the early mainframes were introduced into a pre-Internet world.

Consider the following business model:  Don’t want photos of you posted online without your knowledge? How about a web-based service that tracks any photos that show your face?

If enough individuals were willing to subscribe to such a service, it could harness the power of a D-Wave One system to sift through millions of photos a day.

The patterns in the plethora of image data that we leave online are just the tip of the iceberg. Apps like Google Goggles will ensure yet another data explosion in this area (and in due time they will be always online). Additionally, I expect ever more health-relevant data to be added to the pile. Eventually, some truly life-changing and life-enhancing patterns will be uncovered. If another product as deadly as cigarettes is ever introduced, one will be able to quickly connect the dots, and a cover-up will become unfeasible.

We may not see a quantum computing eharmony.com matching you on an infinite number of dimensions any time soon. But I am convinced that, unbeknownst to most of us, quantum computing will come to the masses sooner than generally anticipated.

Time to start investing in quantum computing software?

While the argument over whether D-Wave represents true quantum computing still rages, the company moves on and opens a software developer portal.

Assuming D-Wave succeeds in the marketplace (and I wish them all the best), they will own the term “quantum computing” for the time being. It is very unlikely that another vendor will introduce a competing quantum computing device in the intermediate future. Being the ultimate first mover in this nascent market, D-Wave has the unique chance to shape it to their advantage.

The (mostly academic) critics won’t cease to point out that D-Wave doesn’t implement a universal gate-programmable quantum computer – not that the company ever claimed such a thing – but assuming their device delivers superior computational power, the market won’t care.

But there is a catch: D-Wave’s optimization engine is versatile enough to help with many real-life business optimization cases, but the path to actual usefulness is all in the software.

Let’s first look at the B2B market (I’ll discuss applications in the consumer space in a later blog entry).

It is hard to get market share numbers that break out spending on IT solutions for optimization software. Currently this is mostly folded into the BI and CRM categories, both areas that pose plenty of business cases (e.g. channel, price, revenue, and supply chain optimization, to name just a few). According to Forrester, BI and CRM are the main drivers of IT spending in 2012.

Even if D-Wave could just dip a toe into this huge market, it would render them a commercial success.

But selling any optimization solution to businesses is no small feat. The potential customers don’t care about the underlying technology and won’t be interested in talking about hardware and algorithms.

Making a persuasive case is tricky because it requires a special mix of skills: domain expertise as well as an understanding of the parameters within which an optimal solution can be found. While it suffices to mention these as “boundary conditions” to those with a mathematical background, the term will draw blank stares from your average business crowd and is guaranteed to sink your sales pitch.

The magic realm of financial engineering may seem like a good fit for D-Wave. The quants who reside in this world will clearly understand where the company is coming from – many of these folks were lured away from careers in math and physics, and there is no doubt that they could utilize a quantum optimization device to its fullest potential. That is, if this magic kingdom weren’t on fire. Wall Street’s profits are up again, but on a shaky foundation, and faith in the reliability of financial modeling has certainly taken quite a hit. (For a great account of this I recommend the hilariously named book “Models.Behaving.Badly.”) This will certainly have reduced the appetite to gamble sizable amounts on – as of yet – untested technology (although “sizable” is of course rather relative when contrasted with the kind of volume that high-frequency trading employs).

In the end, D-Wave’s best option may be to lower the barrier to engaging with their tech, e.g. by developing a stack of accelerator software that allows programs written in SAS or R to execute without modification in a hosted environment. That means engineering a pre-processor that diverts invocations of proc nlp or optim() to an underlying D-Wave parser, which compiles the optimization problems for execution on their hardware.
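As a rough illustration of what such a dispatch layer looks like – with an entirely made-up backend probe, since nothing here is a real D-Wave API – the accelerator stack would expose the solver entry point callers already know, and route the call behind the scenes:

```python
# Hypothetical sketch of the pre-processor idea described above. Nothing
# here is a real D-Wave API; the point is only that callers keep using
# the entry point they already know, while dispatch happens beneath it.

def classical_minimize(fun, x0, step=0.01, iters=1000):
    """Toy stand-in for the existing classical solver (naive descent)."""
    x = list(x0)
    for _ in range(iters):
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                if fun(trial) < fun(x):
                    x = trial
    return x

def quantum_backend_available():
    """Stub: a real accelerator stack would probe for the vendor driver."""
    return False

def minimize(fun, x0):
    """Drop-in entry point: the caller never sees which backend ran."""
    if quantum_backend_available():
        raise NotImplementedError("compile the problem for the hardware here")
    return classical_minimize(fun, x0)

result = minimize(lambda v: (v[0] - 3.0) ** 2, [0.0])  # converges toward 3.0
```

The same pattern applies on the SAS side, where the pre-processor would rewrite proc nlp blocks before submission rather than intercepting a function call.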

“Build it and they will come” is rarely a winning business strategy, but if D-Wave can make it effortless enough to test-drive their optimization engine, then they stand a good chance of competing on the merits of their technology.

When they finally came ...

Nascent industry killed by birthing pains?

Networkworld.com has an instructive article on the state of quantum computing. While the headline is somewhat sensationalist (though not as much as mine), the article turns out to be well written and solidly sourced. It does a good job of highlighting what D-Wave has on offer while contrasting it with the universal quantum computing approach.

Dr. Suzanne Gildert from D-Wave is quoted extensively and, I think, succeeds in conveying why their product is relevant. It is also instructive that she envisions quantum computing as a kind of co-processing unit for cloud service providers. This very much meshes with my own expectation of where we will see the first wave of commercial applications of quantum computers (no pun intended).

Where the article appears somewhat disjointed is in not making clear that the presented argument against the rapid availability of quantum computing hardly applies to the approach D-Wave is taking. This is another testament to the fact that the waters have been considerably muddied as to what quantum computing actually means. Maybe D-Wave would be better served by consistently calling its method “Natural Quantum Computing”, as positioned by Dr. Gildert.

The gist of all the pessimism voiced in the article is the ever-present struggle to preserve the coherence of qubits. While this is the obvious Achilles’ heel of any quantum system, the critical voices quoted in the article seem oddly out of step with the current state of the art. The relatively new approach of cluster-state quantum computation significantly mitigates the decoherence problem. When it was first published in 2005, it prompted David Deutsch to shorten his expectation for the realization of the first universal quantum computer from many decades to within the decade.

These recent research trends in cluster-state/measurement-based quantum computation are nowhere reflected in the article, whereas the ultimate naysayer, Artur Ekert from Oxford University, is quoted as gleefully stating that “the best outcome of our research in this field would be to discover that we cannot build a quantum computer for some very fundamental reason, then maybe we would learn something new and something profound about the laws of nature.”

Makes you wonder if Ekert and his fellow faculty member Deutsch, who co-authored a couple of older papers with him, are still on the same page – or on speaking terms, for that matter.

Update: I should have realized that IT journalism subscribes to the same “he said, she said” sort of reporting that makes mainstream media so useless in the US. Ekert’s quote was probably taken out of context just to provide some semblance of faux balance in Networkworld’s article.

You keep using that word. I do not think it means what you think it means.

UCSB physicists claim to have implemented a von Neumann architecture for quantum computing.

Far be it from me to denigrate what they achieved but I have to take issue with the marketing.

They keep using that “von Neumann” word. I do not think it means what they think it means. In a strict sense, a quantum computer cannot possibly implement a von Neumann architecture. Peter Hines, a theoretical computer science researcher, wrote a nice paper about just that question not too long ago.

I think this is an example of a somewhat careless appropriation of a term that has a well-defined meaning in computer science.

In my mind this is rather problematic, because the sloppy adoption of classical CS terms does nothing to illuminate the fundamental difference of quantum computing and is destined to just add confusion.

Update:

The UCSB researchers talk about a “quantum von Neumann” architecture in their original paper. The “quantum”, unfortunately, got dropped in many news reports.

The purpose of this blog

Gartner underestimates an emerging technology, and I will use this humble blog to redress that. Unlike Gartner, I will do it for free, because this new and developing industry is utterly fascinating.

Gartner places quantum computing firmly in the early stages of what they call the “Hype Cycle”, classifying it as a technology that is more than 10 years away from maturity.

When googling for the latest publicly available report, I found that even in 2009 Gartner didn’t revisit this assessment.

This seems like a reasonable and conservative classification, taking into account how most quantum computing is performed at this time. The leading approach requires a roomful of high-tech lab equipment to create ultra-cold ions that are trapped with uncanny precision and manipulated by lasers to create the entangled states that represent qubits. At the time of writing, the record stands at 14 qubits that could potentially be used to perform quantum algorithms.

An impressive scientific feat but hardly with the potential to impact business computing any time soon.

Yet already in 2007 there was a small start-up company making some marketing noise, claiming that it could deliver a pre-packaged quantum computer on a scale that won’t crowd your average corporate data center.

Initially, they kept their technology a black box, and a demonstration in 2007 couldn’t convincingly outperform conventional computers. So Gartner’s stance back then was quite justifiable; they weren’t alone in dismissing this as vaporware.

But things have changed. Now the company, D-Wave, is selling their (big) black box, and has also found a first paying customer in Lockheed Martin.

The controversy over whether this is real quantum computing hasn’t gone away, but even one of D-Wave’s fiercest critics had to admit that the company finally demonstrated the use of quantum annealing on eight qubits when it published some of its results in Nature.

For the currently shipping system, they claim 128 qubits. If these could actually all be used as one entangled entity, then this new computational device would have some serious clout.

As my time allows, I will use this blog to cover this emerging technology and strive to make it a useful resource that helps to distinguish between the hype and the reality of quantum computing. I will delve into what makes this type of information processing so different from coding for good old-fashioned Turing machines, and explore the difference between the “conventional” quantum computing approach and what D-Wave is offering.

Some of my blog entries will be technical, and some will be lighter reading for the more business-minded crowd; I will mark them accordingly. In the end, I hope this blog will prove a useful resource for any professional confounded by the question of what quantum computing is and what it is good for.

Update:

D-Wave’s recent paper in Nature makes clear that they don’t use entangled qubit states. Rather, any extra power over a classical analog annealing scheme will stem from quantum tunneling.