Observations on Quantum Computing & Physics – http://wavewatching.net

Canadian PM Justin Trudeau talks Quantum Computing
http://wavewatching.net/2016/04/15/canadian-pm-justin-trudeau-talkes-quantum-computing/ – Sat, 16 Apr 2016

He is already fluently bilingual, but he also speaks pretty good Quantum. This isn't half bad for a head of state:

If you also want to impress your friends like this, I recommend Sabine Hossenfelder's Quantum lingo crash course.

This bodes well for the prospects of seeing some federal initiatives for the emerging Canadian QC industry in the not too distant future.

 

Late Wave
http://wavewatching.net/2016/03/07/late-wave/ – Mon, 07 Mar 2016

It took only one scientist to predict them, but a thousand to get them confirmed (1,004 to be precise). I guess if the confirmation of gravitational waves couldn't draw me out of my blogging hiatus, nothing could, although I am obviously catching a very late wave. The advantage of this: I can compile and link to all the best content that has already been written on the topic.

Of course this latest spectacular confirmation will unfortunately not change the minds of those quixotic individuals who devote themselves to fighting the "wrongness" of all of Einstein's work (I once had the misfortune of encountering the maker of this abysmal movie; safe to say I have had more meaningful conversations with Jehovah's Witnesses).

But given the track record of science news journalism, what are the chances that this may be a fluke similar to the BICEP2 news that turned out to be far less solid than originally reported? Or another repeat of the faster-than-light neutrino measurements?

The beauty of a direct experimental measurement like the one performed by LIGO is that the uncertainty can be quantified statistically. The detection is a "5-sigma" event, meaning the chance that a fluctuation of the background noise would mimic such a signal is less than about one in a million (i.e. better than 99.9999% confidence). The graph at the bottom shows that the measured waveform matches the theoretically expected signal from a black hole merger so closely that the similarity is immediately compelling even for a non-scientist.
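For readers who want to check the statistics themselves, here is a quick back-of-the-envelope conversion of a 5-sigma significance into a probability (a sketch only; the actual LIGO significance analysis is far more involved than a simple Gaussian tail):

```python
from scipy.stats import norm

# One-sided tail probability of a 5-sigma Gaussian fluctuation:
# the chance that pure noise mimics a signal at least this strong.
p_fluke = norm.sf(5)          # survival function, ~2.9e-7
confidence = 1.0 - p_fluke    # ~0.9999997

print(f"p-value for a 5-sigma noise fluke: {p_fluke:.2e}")
print(f"corresponding confidence level:    {confidence:.7f}")
```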

But more importantly, unlike faster-than-light neutrinos, we have every reason to believe that gravitational waves exist. No new physics is required, and the phenomenon is strictly classical, in the sense that General Relativity produces a classical field equation which, unlike Quantum Mechanics, adheres to physical realism. That is why this discovery does nothing to advance the search for a unification of gravity with the other three forces. The importance of this discovery lies elsewhere, but is no less profound. Sabine Hossenfelder says it best:

Hundreds of millions of years ago, a primitive form of life crawled out of the water on planet Earth and opened their eyes to see, for the first time, the light of the stars. Detecting gravitational waves is a momentous event just like this – it’s the first time we can receive signals that were previously entirely hidden from us, revealing an entirely new layer of reality.

The importance of this really can't be overstated. The universe is a big place and we keep encountering mysterious observations. There is of course the enduring puzzle of dark matter; less well known may be the fast radio bursts, first observed in 2007, which are among the most energetic radio events known to modern astronomy. Until recently it was believed that some one-off cataclysmic events were the underlying cause, but these theories had to be thrown out when it was observed that the signals can repeat. (The Canadian researcher who published on this recently received the highest Canadian science award, and the CBC has a nice interview with her.)

We are a long way off from having good spatial resolution with the current LIGO setup. The next logical step is of course to drastically increase the scale of the device, and when it comes to laser interferometry this can be done on a much grander scale than with other experimental set-ups (e.g. accelerators). The eLISA space-based gravitational wave detector project is well underway. And I wouldn't yet count out advanced quantum interferometry as a means to drastically improve the achievable resolution, even if it couldn't beat LIGO to the punch.

After all, it was advanced interferometry that had been driving the hunt for gravitational waves for many decades. One of its pioneers, Heinz Billing, was determined to bring about and witness their discovery, reportedly stating that he refused to die before the discovery was made.  The universe was kind to him, so at age 101 he is still around and got his wish.

[Figure: LIGO measurement of gravitational waves – the signals received by the LIGO instruments at Hanford, Washington (left) and Livingston, Louisiana (right), compared to the signals expected from a black hole merger event.]
D-Wave – Fast Enough to Win my Bet?
http://wavewatching.net/2015/12/13/d-wave-fast-enough-to-win-my-bet/ – Mon, 14 Dec 2015

I really would like to get that raclette cheese.

Last summer I had to ship a crate of maple syrup to Matthias Troyer at the ETHZ in Switzerland. The conditions we had agreed on for our performance bet were such that, at that point, the D-Wave One could not show a clear performance advantage over a conventional, modern CPU running fine-tuned optimization code. The machine held its own, but there weren't any problem classes to point to that really demonstrated massive performance superiority.

[Figure: Impressive Google benchmark graph. Next on my Christmas wish list: a decisive widening of the gap between the green QMC curve and the blue D-Wave line as the problem size increases (as is the case when compared to the red Simulated Annealing curve).]

 

The big news to take away from the recent Google/D-Wave performance benchmark is that, on certain problem instances, the D-Wave machine clearly shines. A speed-up of 100 million times over a Quantum Monte Carlo simulation is nothing to sneeze at. This doesn't mean that I would now automatically win my bet with Matthias if we were to repeat it with the D-Wave Two, but it would certainly make things much more interesting.

One advantage of being hard-pressed to find time for blogging is that once I get around to commenting on recent developments, most other reactions are already in. Matthias provided this excellent write-up, and the former D-Wave critic-in-chief remains in retirement. Scott Aaronson's blog entry on the matter strikes a (comparatively) conciliatory tone. One of his comments explains one of the reasons for this change:

"[John Martinis] told me that some of the engineering D-Wave had done (e.g., just figuring out how to integrate hundreds of superconducting qubits while still having lines into them to control the couplings) would be useful to his group. That’s one of the main things that caused me to moderate a bit (while remaining as intolerant as ever of hype)."

Scott also gave a pretty balanced interview to the MIT News (although I have to subtract a star on style for working in a dig at Geordie Rose - clearly the two won't become best buds in this lifetime).

Hype is generally, and rightly, scorned in the scientific community.  And when it is pointed out (for instance when the black hole information loss problem had been "solved"), the scientists involved are usually put on the defensive.

[Image: Buddy the Elf – he believes anything Steve Jurvetson ever uttered, and then some.]

Of course, business follows very different rules, more along the lines of the Donald Trump rules of attention: any BS will do as long as it captures an audience. Customers are used to these kinds of commercial exaggerations, and so I am always a bit puzzled by the urge to debunk D-Wave "hype". To me it feels almost a bit patronizing. The average Joe is not like Buddy the Elf, the unlikely hero of my family's favorite Christmas movie. When Buddy comes to NYC and sees a diner advertising the world's best coffee, he takes this at face value and goes crazy over it. The average Joe, on the other hand, has been thoroughly desensitized to high-tech hype. He knows that neither Google Glass nor the Apple Watch will really change his life forever, nor will he believe Steve Jurvetson that the D-Wave machines will outperform the universe within a couple of years. Steve, on the other hand, does what every good VC businessman is supposed to do for a company he has invested in, i.e. create hype. The world has become a virtual bazaar, and your statements have to be outrageous and novel in order to be heard over the noise. What he wants to get across is that the D-Wave machines will grow in performance faster than conventional hardware. Condensing this into Rose's Law is the perfect pitch vehicle for that - hype with a clear purpose.

People like to pick an allegiance and cheer for their "side". It is the narrative that has been dominating the D-Wave story for many years, and it made for easy blogging, but I won't miss it. The hypers gonna hype, the haters gonna hate, but now the nerds should know to trust the published papers.

Max Planck famously quipped that science advances one funeral at a time, because even scientists have a hard time acting completely rationally and adjusting their stances when confronted with new data. This is the 21st century; here's hoping that the scientific community has lost this kind of rigidity, even while most of humanity remains as tribal as ever.

Riding the D-Wave
http://wavewatching.net/2015/09/07/riding-the-d-wave/ – Tue, 08 Sep 2015

Update: Thanks to everybody who keeps pointing me to relevant news (Ramsey, Rolf, Sol and everybody else my overtired brain may not recall at this time).

There is no doubt that D-Wave is on a roll:

And then there's the countdown to what is billed as a D-Wave related watershed announcement from Google coming Dec 8th.  Could this be an early Christmas present to D-Wave investors?

 

~~~~~~


Back in the day, before he re-resigned as D-Wave's chief critic, Scott Aaronson made a well-reasoned argument as to why he thought this academic, and at times vitriolic, scrutiny was warranted. He argued that a failure of D-Wave to deliver a quantum speed-up would set the field back, similar to the AI winter that was triggered by Marvin Minsky and Seymour Papert's Perceptrons book.

Fortunately, quantum annealers are not perceptrons. For the latter, it can be rigorously proven that a single-layer perceptron cannot even represent simple functions such as XOR, which makes it of very limited use. Ironically, at the time the book was published, multilayer perceptrons - a concept that is now fundamental to all deep learning algorithms - were already known, but in the ensuing backlash research funding for those also dried up completely. The term "perceptron" became toxic and all but disappeared for years.

Could D-Wave be derailed by a proof that shows that quantum annealing could, under no circumstances, deliver a quantum speed-up? To me this seems very unlikely, not only because I expect that no such proof exists, but also because, even if this was the case, there will still be a practical speed-up to be had. If D-Wave manages to double their integration density at the same rapid clip as in the past, then their machines will eventually outperform any classical computing technology in terms of annealing performance. This article (h/t Sol) expands on this point.

So far there is no sign that D-Wave will slow its manic pace. The company recently released its latest chip generation, featuring quantum annealing with an impressive 1000+ qubits (in practice the usable number will be smaller, as some qubits are consumed for problem encoding and software error correction). This was followed by a detailed test under the leadership of Catherine McGeoch, and it will be interesting to see what Daniel Lidar, and other researchers with access to D‑Wave machines, will find.

My expectation has been from the get-go that D-Wave will accelerate the development of this emerging industry, and attract more money to the field. It seems to me that this is now playing out.

Intel recently (and finally, as Robert Tucci points out) entered the fray with a sizeable investment of its own. And the 50 million dollars from the Chile-based Grupo Arcano - reported under the headline "Largest seed round ever? Close to $1B snagged by Israel's QuantX" (http://www.geektime.com/2015/04/01/largest-seed-round-ever-close-to-1b-snagged-by-israels-quantx/) - is an enormous amount for a QC software firm that, as far as I know, holds no patents.

Some astoundingly big bets are now being placed in this field.

 

 

 

 

 

Classic Quantum Confusion
http://wavewatching.net/2015/05/28/classic-quantum-confusion/ – Thu, 28 May 2015

[Image: facepalm statue in the Paris Tuileries]

By now I am pretty used to egregiously misleading summarization of physics research in popular science outlets, sometimes flamed by the researchers themselves. Self-aggrandizing, ignorant papers sneaked into supposedly peer-reviewed journals by non-physicists are also just par for the course.

But this is in a class of its own.  Given the headline and the introductory statement that "a fully classical system behaves like a true quantum computer", it essentially creates the impression that QC research must be pointless. Only much later does it sneak in the obvious: an analog emulation, just like one on a regular computer, can't possibly scale past about 40 qubits, due to the exponential growth in required computational resources.
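To put that roughly 40-qubit ceiling in perspective, here is a back-of-the-envelope estimate (my own illustrative numbers) of the memory a brute-force state-vector emulation needs, since an n-qubit state holds 2^n complex amplitudes:

```python
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory for a full state vector: 2**n complex amplitudes,
    16 bytes each for double-precision complex numbers."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:>14,.3f} GiB")

# 20 qubits:          0.016 GiB
# 30 qubits:         16.000 GiB
# 40 qubits:     16,384.000 GiB  (16 TiB)
# 50 qubits: 16,777,216.000 GiB  (16 PiB) - hopeless on classical hardware
```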

But that's not the most irritating aspect of this article.

Don't get me wrong, I am a big fan of classical analogs of quantum systems. I think they can be very educational, if you know what you are looking at (Spreeuw 1998).  The latter paper is actually quoted by the authors, and it is very precise in distinguishing between quantum entanglement and the classical analog. But that's not the distinction their otherwise fine paper draws (La Cour et al. 2015).  The authors write:

"What we can say is that, aside from the limits on scale, a classical emulation of a quantum computer is capable of exploiting the same quantum phenomena as that of a true quantum system for solving computational problems."

If it wasn't for the phys.org reporting, I would put this down as sloppy wording that slipped past peer review, but if the authors are correctly quoted, then they indeed labour under the assumption that they faithfully recreated quantum entanglement in their classical analog computer - mistaking the model for the real thing.

It makes for a funny juxtaposition on phys.org though, when filtering by 'quantum physics' news.

[Screenshot: phys.org news listing filtered by 'quantum physics']

The second article refers to a new realization of Wheeler's delayed choice experiment (where the non-local entanglement across space is essentially swapped for one across time).

If one takes Brian La Cour at his word, then according to his other paper he suggests that these kinds of phenomena should also have a classical analog.

So it's not just hand-waving when he makes this rather outlandish-sounding statement with regard to being able to achieve an analog of the violation of Bell's inequality:

"We believe that, by adding an emulation of quantum noise to the signal, our device would be capable of exhibiting this type of [Bell's inequality violating] entanglement as well, as described in another recent publication."

Of course talk is cheap, but if this research group could actually demonstrate this Bell's inequality loophole it certainly could change the conversation.

Will Super Cool SQUIDs Make for an Emerging Industry Standard?
http://wavewatching.net/2015/05/08/will-super-cool-squids-make-for-an-emerging-industry-standard/ – Fri, 08 May 2015

[Figure: This older logarithmic (!) D-Wave graphic gives an idea of how extreme the cooling requirement is for SQUID-based QC (it used to be part of a really cool SVG animation, but unfortunately D-Wave no longer hosts it).]

D‑Wave had to break new ground in many engineering disciplines.  One of them was the cooling and shielding technology required to operate their chip.

To this end they are now using ANSYS software, which of course makes for very good marketing for this company (h/t Sol Warda). So good, in fact, that I would hope D‑Wave negotiated a large discount for serving as an ANSYS reference customer.

Any SQUID based quantum computing chip will have similar cooling and shielding requirements, i.e. Google and IBM will have to go through a similar kind of rigorous engineering exercise to productize their approach to quantum computing, even though this approach may look quite different.

Until recently, it would have been easy to forget that IBM is another contender in the ring for SQUID-based quantum computing; the company's researchers have been working diligently outside the limelight - they last created headlines three years ago. And unlike other quantum computing news, which often only touts marginal improvements, their recent results deserve to be called a breakthrough, as they improved upon the kind of hardware error correction that Google is betting on.

[Image: IBM's name spelled out using 35 xenon atoms, arranged via a scanning tunneling microscope (a nano-scale visualization and manipulation device invented at IBM). The company has been conducting fundamental quantum technology research for a long time.]

Obviously, the better your error correction, the more likely you are to achieve a quantum speed-up with an annealing architecture like D‑Wave's, but IBM is not after yet another annealer. Most articles on the IBM program report that IBM is out to build a "real quantum computer", and the term clearly originates from within the company (e.g. this article attributes it to scientists at IBM Research in Yorktown Heights, NY). This leaves little doubt about their commitment to universal gate-based QC.

The difference in strategy is dramatic. D‑Wave decided to forgo surface code error correction on the chip in order to get a device to the market.  Google, on the other hand, decided to snap up the best academic surface code implementation money could buy, and also emphasized speed-to-market by first going for another quantum adiabatic design.

All the while, IBM researchers first diligently worked through the stability of SQUID-based qubits. Even now, having achieved the best available error correction, they clearly signal that they don't consider it good enough for scale-up. It may take yet another three years for them to find the optimal number and configuration of logical qubits that achieves the kind of fidelity they need to then tackle an actual chip.

It is a very methodical engineering approach. Once the smallest building block is perfected, they will have the confidence to go for the moonshot. It's also an approach that only a company with very deep pockets can afford, one with a culture that allows for the pursuit of a decades-long research program.

Despite the differences, in the end, all SQUID based chips will have to be operated very close to absolute zero.  IBM's error correction may eventually give it a leg-up over the competition, but I doubt that standard liquid helium fridge technology will suffice for a chip that implements dozens or hundreds of qubits.

By the time IBM enters the market there will be more early adopters of the D‑Wave and Google chips, and the co-opetition between these two companies may have given birth to an emerging industry standard for the fridge technology. In a sense, this may lower the barriers of entry for new quantum chips if the new entrant can leverage this existing infrastructure. It would probably be a first for IBM to cater to a chip interfacing standard that the company did not help to design.

So while there has been plenty of news to report in the quantum computing hardware space, it is curious, and a sign of the times, that a recent Washington Post article on the matter opted to lead with a quantum computing software company, QxBranch. (Robert R. Tucci channeled the journalists at the WP when he wrote last week that the IBM news bodes well for software start-ups in this space.)

While tech and business journalists may not (and may possibly never) understand what makes a quantum computer tick, they understand perfectly well that any computing device is just dead weight without software, and that the latter will make the value proposition necessary to create a market for these new machines.

 

 

 

We need Big Data where it will actually make a difference
http://wavewatching.net/2015/04/26/we-need-big-data-where-it-will-actually-make-a-difference/ – Sun, 26 Apr 2015

[Image: destroyed temples in Kathmandu]

Another earthquake took the lives of many thousands. As I am writing this blog post, scores of survivors will still be trapped underneath debris and rubble.

It will take weeks, if not months, before the damage done to Nepal will become fully apparent, in terms of life and limbs but also economically and spiritually.

The world's poorest regions are often hardest hit because resilient structures that can withstand quakes of this magnitude are expensive.

Governments look to science to provide better earthquake warnings, but the progress of geophysical modeling is hampered by the lack of good, high quality data.

In this context, pushing the limits of remote sensing with new technologies such as quantum gravimeters becomes a matter of life and death, and it should make apparent that striving for ever more precise quantum clocks is anything but a vanity chase. After all, we are just now closing in on the level of accuracy needed to perform relativistic geodesy.
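For a rough sense of the clock accuracy relativistic geodesy demands, here is a back-of-the-envelope estimate (my own numbers, not taken from any of the linked work): gravitational time dilation shifts a clock's rate by roughly g·Δh/c² per unit of height, so resolving centimetre-scale height differences requires fractional frequency accuracy at the 10^-18 level.

```python
# Fractional frequency shift of a clock raised by dh metres in Earth's
# gravity, from gravitational time dilation: df/f ~ g * dh / c**2
g = 9.81      # m/s^2, surface gravity
c = 2.998e8   # m/s, speed of light

for dh in (1.0, 0.01):  # one metre, one centimetre
    shift = g * dh / c ** 2
    print(f"height change {dh:>5.2f} m -> fractional clock shift ~ {shift:.1e}")

# 1.00 m -> ~1.1e-16
# 0.01 m -> ~1.1e-18  (roughly the accuracy frontier of today's best optical clocks)
```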

It goes without saying that the resource extraction industry will be among the first to profit from these new techniques.  While this industry has an image problem due to its less than stellar environmental track record, there's no denying that anything driving the rapid and ongoing productization of these technologies is a net positive, if it makes them affordable and widely accessible to the geophysicists who study the dynamics of active fault lines. Acquiring this kind of big data is our only chance of ever reaching a future in which our planet no longer shocks us with its deadly geological force.

How many social networks do you need?
http://wavewatching.net/2015/04/22/how-many-social-networks-do-you-need/ – Thu, 23 Apr 2015

The proliferation of social networks seems unstoppable now. Even the big ones can no longer be counted on one hand: Facebook, LinkedIn, Google+, Twitter, Instagram, Tumblr, Pinterest, Snapchat - I am so uncool I didn't even know about the latter until very recently. It seems there has to be a natural saturation point, with diminishing marginal returns from signing up for yet another one, but apparently we are still far from it.

Recently via LinkedIn I learned about a targeted social network that I happily signed up for, which is quite against my character (i.e. I still don't have a Facebook account).

[Image: iQEi logo. Free to join and no strings attached - this targeted social network is not motivated by a desire to monetize your social graph.]

The aptly named International Quantum Exchange for Innovation is a social network set up by DK Matai with the express purpose of bringing together people from all walks of life, anywhere on this globe, who are interested in the next wave of the coming Quantum Technology revolution. If you are as interested in this as I am, then joining this UN of Quantum Technology, as DK puts it, is a no-brainer.

The term 'revolution' is often carelessly thrown around, but in this case, when it comes to the new wave of quantum technologies, I think it is more than justified. After all, the first wave of QM-driven technologies powered the second leg of the Industrial Revolution. It started with a bang, in the worst possible manner, when the first nuclear bomb was detonated, but the new insights gained led to a plethora of new high-tech products.

Quantum physics was instrumental in everything from solar cells, to lasers, to medical imaging (e.g. MRI) and of course, first and foremost, the transistor. As computers became more powerful, Quantum Chemistry coalesced into an actual field, feeding on the ever increasing computational power. Yet Moore's law proved hardly adequate for its insatiable appetite for the compute cycles required by the underlying quantum numerics.

During Richard Feynman's (too short) life he was involved in both military and civilian applications of quantum mechanics, and his famous "there is plenty of room at the bottom" talk can be read as a programmatic outline of the first Quantum Technology revolution.  This QT 1.0 wave has almost run its course. We made our way to the bottom, but there we encountered entirely new possibilities by exploiting the essential, counter-intuitive non-localities of quantum mechanics.  This takes things to the next step, and again Information Technology is at the forefront. It is a testament to Feynman's brilliance that he anticipated QT 2.0 as well, when he suggested a quantum simulator for the first time, much along the lines of what D-Wave built.

It is apt and promising that the new wave of quantum technology does not start with a destructive big bang, but an intriguing and controversial black box.


 

Dumbing Down for Smartphones
http://wavewatching.net/2015/04/20/dumbing-down-for-smartphones/ – Mon, 20 Apr 2015

Google changed its site ranking: if a site is not mobile friendly, it is now heavily penalized. I was quite fond of my old design, but it failed the Google mobile-friendliness test miserably.  Hence a hasty redesign based on a newer WordPress theme was in order.

[Screenshot: the old site theme. Goodbye, my beloved theme - Google and that nasty smartphone killed you.]

 

 

Quantum Computing Road Map
http://wavewatching.net/2015/04/16/quantum-computing-road-map/ – Thu, 16 Apr 2015

No, we are not there yet, but we are working on it. Qubit spin states in diamond defects don't last forever, but they can last outstandingly long, even at room temperature (measured in microseconds, which is a long time when it comes to computing).

So this is yet another interesting system added to the list of candidates for potential QC hardware.

Nevertheless, when it comes to the realization of scalable quantum computers, qubit decoherence times may very well be eclipsed in importance by another time span: the 20 years for which patents are valid (in the US this can include software algorithms).

With D-Wave and Google leading the way, we may be getting there faster than most industry experts predicted. Certainly the odds are very high that it won't take another two decades for useable universal QC machines to be built.

But how do we get to the point of bootstrapping a new quantum technology industry? DK Matai addressed this in a recent blog post and identified five key questions, which I attempt to address below (I took the liberty of slightly abbreviating the questions; please check the link for the unabridged version).

The challenges DK laid out will require much more than a blog post (or a LinkedIn comment that I recycled here), especially since his view is wider than only Quantum Information science. That is why the following thoughts are by no means comprehensive answers, and very much incomplete, but they may provide a starting point.

1. How do we prioritise the commercialisation of critical Quantum Technology 2.0 components, networks and entire systems both nationally and internationally?

The prioritization should be based on disruptive potential: take quantum cryptography versus quantum computing, for example. Quantum encryption could stamp out fraud that exploits certain technical weaknesses, but it won't address the more dominant social-engineering deceptions. On the upside, it will also facilitate iron-clad cryptocurrencies. Yet, if Feynman's vision of the universal quantum simulator comes to fruition, we will be able to tackle collective quantum dynamics that are computationally intractable with conventional computers. This encompasses everything from simulating high-temperature superconductivity to complex (bio-)chemical dynamics. ETH's Matthias Troyer gave an excellent overview of these killer apps for quantum computing in his recent Google talk; I especially like his example of nitrogen fixation. Nature manages to accomplish this with minimal energy expenditure in some bacteria, but industrially we only have the century-old Haber-Bosch process, which in modern plants still results in 1/2 ton of CO2 for each ton of NH3. If we could simulate and understand the chemical pathway that these bacteria follow, we could eliminate one of the major industrial sources of carbon dioxide.

2. Which financial, technological, scientific, industrial and infrastructure partners are the ideal co-creators to invent, to innovate and to deploy new Quantum technologies on a medium to large scale around the world? 

This will vary drastically by technology. To pick a basic example, a quantum clock per se is just a better clock, but put it into a Galileo/GPS satellite and the drastic improvement in timekeeping will immediately translate to a higher location triangulation accuracy, as well as allow for a better mapping of the earth's gravitational field/mass distribution.

3. What is the process to prioritise investment, marketing and sales in Quantum Technologies to create the billion dollar “killer apps”?

As sketched out above, the real prize to me is universal quantum computation/simulation. Considerable efforts have to go into building such machines, but that doesn't mean you cannot already start developing software for them. Any coding for new quantum platforms, even ones that are already here (as in the case of the D-Wave 2), will involve emulators on classical hardware, because you want to debug and prove out your code before submitting it to the more expensive quantum resource. In my mind, building such an environment in a collaborative fashion to showcase and develop quantum algorithms should be the first step. To me this appears feasible on an accelerated timescale (months rather than years). I think such an effort is critical to offset the closed-source and tightly license-controlled approach that, for instance, Microsoft is following with its development of the LIQUi|> platform.
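As a toy illustration of what such a classical emulator does at its core, here is a minimal state-vector sketch (my own illustrative code, not tied to LIQUi|> or any particular platform) that prepares a Bell pair from |00>:

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to the `target` qubit of an n-qubit state vector."""
    factors = [np.eye(2)] * n_qubits
    factors[target] = gate
    full = factors[0]
    for f in factors[1:]:
        full = np.kron(full, f)   # build the full 2**n x 2**n operator
    return full @ state

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard
CNOT = np.array([[1, 0, 0, 0],                 # control = qubit 0,
                 [0, 1, 0, 0],                 # target  = qubit 1
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                             # start in |00>
state = apply_single_qubit_gate(state, H, 0, n_qubits=2)   # superposition
state = CNOT @ state                                       # entangle
print(np.round(state, 3))  # ~[0.707 0 0 0.707]: the Bell state (|00>+|11>)/sqrt(2)
```

The brute-force operator construction in this sketch is exactly what blows up beyond a few dozen qubits, which is why access to the real quantum resource remains the prize.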

4. How do the government agencies, funding Quantum Tech 2.0 Research and Development in the hundreds of millions each year, see further light so that funding can be directed to specific commercial goals with specific commercial end users in mind?

This to me seems to be the biggest challenge. The amount of research produced in this field is enormous, and much of it is computational theory. While the theory has its merits, I think governmental funding should emphasize programs that have a clearly defined agenda directed towards ambitious yet attainable goals - research that will result in actual hardware and/or commercially applicable software implementations (e.g. the UCSB Martinis agenda). Yet governments shouldn't be in the position of picking a winning design, as was inadvertently done for fusion research, where ITER's funding requirements are now crowding out all other approaches. The latter is a template for how not to go about it.

5. How to create an International Quantum Tech 2.0 Super Exchange that brings together all the global centres of excellence, as well as all the relevant financiers, customers and commercial partners to create Quantum “Killer Apps”?

On a grassroots level, I think open-source initiatives (e.g. a LIQUi|> alternative) could become catalysts that bring academic centers of excellence and commercial players into alignment. This at least is my impression based on conversations with several people involved in the commercial and academic realms. On the other hand, as with any open-source product, commercialization won't be easy, yet this may be less of a concern in this emerging industry, as the IP will be in the quantum algorithms, and they will most likely be executed with quantum resources tied to a SaaS offering.

 
