Imagine a world before the advent of the steam engine that nevertheless imminently anticipates this marvelous machine’s arrival. Although no locomotive has been built, civil engineers are already busy discussing how to build rail-road bridges, architects try to determine the optimal layout of train stations, and the logistics of scheduling and maintaining passenger and freight traffic over the same tracks is heavily researched.

To some extent this seemingly absurd scenario is playing out in the world of quantum computing. For instance, take a look at this intriguing presentation by Rodney Van Meter:

While watching it I had to pinch myself a couple of times to make sure I wasn’t just hallucinating a beamed broadcast from the future. In fact, it is more than two years old. All this impressive infrastructure work is being performed while we are still years away from an actual scalable universal quantum computer.

Of course there is ample reason for all this activity, as has been documented on this humble blog. To recap: conventional computing is inevitably arriving at structure sizes where undesired quantum effects can no longer be ignored. Harnessing the peculiarities of quantum mechanics, on the other hand, will supercharge Moore’s law, enabling us to tackle problems that are too complex for conventional computing.

Specialized quantum computing devices such as D-Wave’s machine or NIST’s impressive ion-based quantum simulator already give us a glimpse of the potential that this new approach to computing will unleash. (By the way, the NIST article makes it sound as if a “crystal” were contained in the Penning trap. This of course is nonsense; what is meant is that the ions are arranged in a 2D crystal-like grid.)

It is encouraging that this core technology is so feverishly anticipated and that considerable efforts to lay the groundwork for it are in progress. After all, conventional programming techniques won’t cut it if the goal is to leverage the additional power of a quantum computer. It will be key to empower software engineers to program these devices without forcing them to go through a quantum mechanics boot camp.

When picking up a textbook on the subject, the reader will very quickly be confronted with diagrams, typically following the circuit model, in which every line corresponds to a qubit. Such as:

While this is useful to introduce a reader to the peculiarities of entanglement and how it can be leveraged as a computational resource, it is obviously of limited use once you have a meaningful device that offers hundreds of qubits. Even for a dedicated (Ising-model-solving) system such as D-Wave’s, you can no longer draw a complete graph (although it helps to introduce a matrix notation to the uninitiated).
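To make that matrix notation concrete, here is a minimal sketch, assuming a hypothetical four-spin instance with made-up fields and couplings, of the kind of Ising problem an annealer such as D-Wave’s is built to minimize (the brute-force search below just stands in for what the hardware does):

```python
import itertools
import numpy as np

# Hypothetical 4-spin Ising instance (illustrative values only):
# h holds the local fields, J the pairwise couplings (upper triangle).
h = np.array([1.0, -1.0, 0.5, 0.0])
J = np.zeros((4, 4))
J[0, 1], J[1, 2], J[2, 3] = -1.0, 0.5, -1.0

def energy(s):
    # E(s) = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j
    s = np.asarray(s, dtype=float)
    return float(h @ s + s @ J @ s)

# Exhaustive search over all spin configurations s_i in {-1, +1};
# an annealer approximates this minimization on far larger instances.
best = min(itertools.product([-1, 1], repeat=4), key=energy)
print(best, energy(best))
```

The whole problem specification is just the vector h and the matrix J, which is exactly why the matrix notation scales where circuit-style diagrams do not.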

A purist might stop there and observe that quantum computation just means working with density matrices, and hence brushing up on your linear algebra is what it takes. The conventional programming analog would be to observe that Boolean logic is all you need to program a conventional chip. Obviously, higher levels of abstraction serve us well in this area.
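The purist’s point can be made concrete in a few lines of linear algebra. A minimal sketch, using NumPy and the standard Hadamard gate, of a single qubit evolving as a density matrix:

```python
import numpy as np

# The pure state |0><0| written as a 2x2 density matrix.
rho = np.array([[1, 0], [0, 0]], dtype=complex)

# The Hadamard gate, a standard single-qubit unitary.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# A gate acts on a density matrix by conjugation: rho -> U rho U†.
rho = H @ rho @ H.conj().T

# Measurement probabilities in the computational basis are the
# diagonal entries of the density matrix.
print(np.real(np.diag(rho)))  # → [0.5 0.5]
```

Everything here is plain matrix arithmetic, which is the purist’s claim; the catch, of course, is that the matrices grow exponentially with the number of qubits, which is why higher abstractions are needed.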

The current state of affairs in quantum computing reminds me of the early days of visual programming research, long before UML arrived to provide a unified framework.

For instance, there is Robert Tucci’s remarkable work to extend Bayesian Network diagrams into the quantum computing realm. There are also considerable efforts underway to develop a universal visual Tensor Network “language”. Last but not least, there are some convincing arguments that topological quantum computers are most amenable to a schema dubbed “quantum picturalism“. A nice talk on this is also available online (courtesy of Microsoft’s research division).

As this industry matures, expect a process similar to the one that played out in the old world of visual programming. There is one important twist, though: although UML is an excellent way to approach coding in a structured way (one that actually deserves to be called engineering), its adoption has been lackluster, and sloppy coding still reigns supreme.

To the extent that pictorial languages are at the heart of quantum programming, maybe another beneficial side effect of the coming quantum computing age will be to accelerate the maturing of the computer industry’s approach to software development.

Bayesian Networks are better than social networks. We will crush Facebook.

Bayesian networks can facilitate statistically valid deductions; social graphs by definition do pretty much the exact opposite (follow the herd, a thousand flies cannot possibly be wrong … that kind of thing).
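To illustrate the kind of deduction meant here, a minimal sketch, with made-up probabilities, of exact inference in a two-node Bayesian network:

```python
# Hypothetical two-node Bayesian network: Rain -> WetGrass.
# Query: P(Rain | WetGrass observed) via Bayes' rule.
p_rain = 0.2                            # prior P(Rain)
p_wet_given = {True: 0.9, False: 0.1}   # P(WetGrass | Rain)

# Numerator: joint P(Rain, WetGrass); denominator: marginal P(WetGrass).
num = p_rain * p_wet_given[True]
den = num + (1 - p_rain) * p_wet_given[False]
posterior = num / den
print(posterior)  # ≈ 0.69
```

The posterior follows from the network’s structure and probabilities, not from how many neighbors happen to agree, which is exactly the contrast with a social graph.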

It’s like the age-old question of whether sanity can win out against peer pressure.

Yes. Social networks provide data and Bayesian networks can be used to analyze data. Google is currently the biggest player in Bayesian networks (and other statistical techniques) and Facebook is currently the biggest player in social networks. I think eventually both companies will try to achieve proficiency in both types of networks. Of course, Spiderman is the best at all types of networks. He’s only afraid of quantum computing man.