Category Archives: Quantum Cryptography

Quantum Computing Road Map

No, we are not there yet, but we are working on it. Qubit spin states in diamond defects don’t last forever, but they can last remarkably long, even at room temperature (measured in microseconds, which is a long time when it comes to computing).

So this is yet another interesting system added to the list of candidates for potential QC hardware.

Nevertheless, when it comes to the realization of scalable quantum computers, qubit decoherence times may very well be eclipsed by the importance of another time span: 20 years, the term for which patents remain valid (in the US this can include software algorithms).

With D-Wave and Google leading the way, we may be getting there faster than most industry experts predicted. Certainly the odds are very high that it won’t take another two decades for usable universal QC machines to be built.

But how do we get to the point of bootstrapping a new quantum technology industry? DK Matai addressed this in a recent blog post and identified five key questions, which I attempt to address below (I took the liberty of slightly abbreviating the questions; please check the link for the unabridged versions).

The challenges DK laid out will require much more than a blog post (or a LinkedIn comment that I recycled here), especially since his view extends beyond Quantum Information science alone. That is why the following thoughts are by no means comprehensive answers, and remain very much incomplete, but they may provide a starting point.

1. How do we prioritise the commercialisation of critical Quantum Technology 2.0 components, networks and entire systems both nationally and internationally?

The prioritization should be based on disruptive potential: take quantum cryptography versus quantum computing, for example. Quantum encryption could stamp out fraud that exploits certain technical weaknesses, but it won’t address the more prevalent social engineering deceptions. On the upside, it will also facilitate ironclad cryptocurrencies. Yet, if Feynman’s vision of the universal quantum simulator comes to fruition, we will be able to tackle collective quantum dynamics that are computationally intractable on conventional computers. This encompasses everything from simulating high-temperature superconductivity to complex (bio-)chemical dynamics. ETH’s Matthias Troyer gave an excellent overview of these killer apps for quantum computing in his recent Google talk; I especially like his example of nitrogen fixation. Nature manages to accomplish this with minimal energy expenditure in some bacteria, but industrially we only have the century-old Haber-Bosch process, which in modern plants still produces about half a ton of CO2 for each ton of NH3. If we could simulate and understand the chemical pathway that these bacteria follow, we could eliminate one of the major industrial sources of carbon dioxide.

2. Which financial, technological, scientific, industrial and infrastructure partners are the ideal co-creators to invent, to innovate and to deploy new Quantum technologies on a medium to large scale around the world? 

This will vary drastically by technology. To pick a basic example, a quantum clock per se is just a better clock, but put it into a Galileo/GPS satellite and the drastic improvement in timekeeping immediately translates into higher triangulation accuracy for positioning, as well as better mapping of the Earth’s gravitational field and mass distribution.
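For a rough sense of the leverage involved (a back-of-the-envelope sketch of my own, not a number from any of the linked sources): satellite positioning infers distance from signal travel time, so every bit of clock error converts into ranging error at the speed of light.

```python
C = 299_792_458.0  # speed of light in m/s

def ranging_error_m(clock_error_s: float) -> float:
    """Pseudorange error caused by a given clock error (distance = c * t)."""
    return C * clock_error_s

# One nanosecond of timing error already corresponds to ~30 cm of range error;
# a clock holding picosecond-level stability shrinks that to fractions of a mm.
print(ranging_error_m(1e-9))   # ~0.3 m
print(ranging_error_m(1e-12))  # ~0.0003 m
```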

3. What is the process to prioritise investment, marketing and sales in Quantum Technologies to create the billion dollar “killer apps”?

As sketched out above, the real prize to me is universal quantum computation/simulation. Considerable effort has to go into building such machines, but that doesn’t mean you cannot already start developing software for them. Any coding for new quantum platforms, even those that are already here (as in the case of the D-Wave Two), will involve emulators on classical hardware, because you want to debug and vet your code before submitting it to the more expensive quantum resource. In my mind, building such an environment in a collaborative fashion to showcase and develop quantum algorithms should be the first step. To me this appears feasible on an accelerated timescale (months rather than years). I think such an effort is critical to offset the closed-source and tightly license-controlled approach that, for instance, Microsoft is following with its development of the LIQUi|> platform.
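To make the emulator idea a bit more concrete, here is a minimal state-vector simulator sketch in Python. It is purely illustrative and does not reflect the API of LIQUi|>, D-Wave’s tooling, or any other existing platform; the point is simply that classical hardware lets you inspect and debug the full quantum state before paying for time on the real resource.

```python
import numpy as np

# A single-qubit gate as a 2x2 unitary matrix
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a one-qubit gate to qubit `target` of an n-qubit state vector."""
    op = np.array([[1.0 + 0j]])
    for q in range(n_qubits):
        op = np.kron(op, gate if q == target else np.eye(2))
    return op @ state

n = 2
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                   # start in |00>
state = apply_single_qubit_gate(state, H, 0, n)  # superposition on qubit 0

# CNOT (control = qubit 0, target = qubit 1) entangles the register
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ state

# Debugging on classical hardware: inspect the full amplitude vector,
# something a real quantum device never lets you do directly.
print(np.abs(state) ** 2)  # -> [0.5, 0, 0, 0.5], the Bell-state statistics
```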

4. How do the government agencies, funding Quantum Tech 2.0 Research and Development in the hundreds of millions each year, see further light so that funding can be directed to specific commercial goals with specific commercial end users in mind?

This to me seems to be the biggest challenge. The number of research papers produced in this field is enormous, and much of it is computational theory. While the theory has its merits, I think governmental funding should emphasize programs with a clearly defined agenda towards ambitious yet attainable goals, i.e. research that will result in actual hardware and/or commercially applicable software implementations (e.g. the UCSB Martinis agenda). Yet governments shouldn’t be in the position of picking a winning design, as was inadvertently done for fusion research, where ITER’s funding requirements are now crowding out all other approaches. The latter is a template for how not to go about it.

5. How to create an International Quantum Tech 2.0 Super Exchange that brings together all the global centres of excellence, as well as all the relevant financiers, customers and commercial partners to create Quantum “Killer Apps”?

On a grassroots level, I think open source initiatives (e.g. a LIQUi|> alternative) could become catalysts that bring academic centers of excellence and commercial players into alignment. This at least is my impression based on conversations with several people involved in the commercial and academic realms. On the other hand, as with any open source product, commercialization won’t be easy; yet this may be less of a concern in this emerging industry, as the IP will be in the quantum algorithms, and they will most likely be executed on quantum resources tied to a SaaS offering.

 

Coming Up Swinging

The current top political news of the day (Snowden leak) brings into sharp relief why encryption and the capabilities to break it receive so much attention.

It puts into context why a single algorithm (Shor’s) accounts for most of quantum computing’s notoriety and why quantum encryption receives so much funding.

All the more surprising then that Laszlo Kish’s work received comparatively little attention. After all, it poses a direct challenge to the field’s claim to offer the only communication channels whose tamper resistance is baked into the very protocol. Yet, with Charles Bennett et al. going for a knock-out, this has decidedly changed. Now this exciting ‘match’ goes to the next round with the Kish et al. follow-up paper, and it is quite the comeback. From the abstract:

Recently, Bennett and Riedel (BR) argued that thermodynamics is not essential in the Kirchhoff-law–Johnson-noise (KLJN) classical physical cryptographic exchange method in an effort to disprove the security of the KLJN scheme. They attempted to prove this by introducing a dissipation-free deterministic key exchange method with two batteries and two switches. In the present paper, we first show that BR’s scheme is unphysical and that some elements of its assumptions violate basic protocols of secure communication. Furthermore we crack the BR system with 100% success via passive attacks, in ten different ways, and demonstrate that the same cracking methods do not function for the KLJN scheme that employs Johnson noise to provide security underpinned by the Second Law of Thermodynamics. We also present a critical analysis of some other claims by BR; for example, we prove that their equations for describing zero security do not apply to the KLJN scheme. Finally we give mathematical security proofs for each of the attacks on the BR scheme and conclude that the information theoretic (unconditional) security of the KLJN method has not been successfully challenged.

The original post on this subject resulted in a high quality follow-up discussion on LinkedIn that I hope may get triggered again.  After all, science is more fun as a spectator sport with well-informed running commentary.

Septimana Mirabilis – Major Quantum Information Technology Breakthroughs

Update 4: The award for the funniest photo commentary on this imbroglio goes to Robert Tucci.

Update 3: Congratulations to D-Wave for their recent sale of the D-Wave Two machine to the non-profit Space Research Association – to be used collaboratively by Google and NASA. (h/t Geordie Rose)

Update 2: Scott Aaronson finally weighs in, and as Robert Tucci predicted in the comments, he resumed his sceptical stance.

Update: Link to Catherine McGeoch and Cong Wang’s paper.

D-Wave Cooper-pair states in real space. Now the company that derived its name from this wavefunction makes some major waves of its own.

What a week for Quantum Information Science. D-Wave made some major news when the first peer-reviewed paper to conclusively demonstrate that their machine can drastically outperform conventional hardware was announced. It’s hardly a contest: for the class of optimization problems that the D-Wave machines are designed for, the algorithms executed on the conventional chip didn’t even come close. The D-Wave machine solved some of the tested problems about 3,600 times faster than the best conventional algorithm. (I’ll leave it to gizmodo to not mince words).

Apparently, my back-of-the-envelope calculation from last year, based on the D-Wave One’s performance in a brute-force calculation of Ramsey numbers, wasn’t completely off. Back then I calculated that the 128-qubit chip performed at the level of about 300 Intel i7 Hex CPU cores (the current test ran on the next-generation 512-qubit chip). So I am now quite confident in my ongoing bet.

If conventional hardware requires thousands of conventional cores to beat the current D-Wave machine, then the company has certainly entered a stage where its offering becomes attractive to a wider market. Of course, other factors will weigh in when considering total cost of ownership. The biggest hurdle in this regard will be software, as to date any problem you want to tackle the D-Wave way requires dedicated coding for this machine. At first these skills will be rare and expensive to procure. On the other hand, there are other cost factors working in D-Wave’s favor: although I haven’t seen power consumption numbers, the adiabatic nature of the chip’s architecture suggests that it will require far less wattage than a massive server farm or conventional supercomputer. Ironically, while the latter operate at normal ambient temperature, they will always require far more cooling effort to keep them there than the D-Wave chips do in their deep-freeze vacuum.

That the current trajectory of our supercomputer power consumption is on an unsustainable path should be obvious by simply glancing at this chart.

Despite these efforts, there are hard efficiency limits for conventional CMOS transistors. (For the original pingdom.com article, click the image.)

D-Wave matures just at the right time to drive a paradigm change, and I hope they will pursue this opportunity aggressively.

But wait, there’s more. This week was remarkable in unveiling yet another major breakthrough for Quantum Information technology: at Los Alamos National Labs, an Internet-scalable quantum cryptographic network has been operating without a hitch for the last two years. Now there’s an example of research that will “directly benefit the American people” (something that should please Congressman Lamar Smith, the controversial current chairman of the House of Representatives Committee on Science).

Why it took two years for this news to be published is anybody’s guess. Did somebody just flip a switch and then forget about it? More likely, this research was considered classified for some time.

Certainly this also suggests a technology whose time has come. Governmental and enterprise networks have been compromised at increasing rates, even prompting inflammatory talk of ongoing cyber warfare. And while there have been commercial quantum encryption devices on the market for quite some time now, these have been limited to point-to-point connections. Having a protocol that allows the seamless integration of quantum cryptography into the existing network stack raises this to an entirely different level. It is of course no panacea against security breaches, and has been criticized as providing superfluous security illusions, since social engineering attacks clearly demonstrate that human users are the weakest link. Nevertheless, I maintain that it has the potential to relegate brute-force attacks to history’s dustbin.

The new quantum protocol uses a typical “hub-and-spoke” topology as illustrated in the following figure and explained in more detail in the original paper.

Network-Centric Quantum Communications with Application to Critical Infrastructure Protection – topology. The NQC topology maps well onto those widely encountered in optical fiber networks, and permits a hierarchical trust architecture in which a “hub” acts as the trusted authority (TA, “Trent”) to facilitate quantum authenticated key exchange.
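To see what the hub buys you operationally, here is a generic trusted-relay sketch (my own simplification for illustration, not the specific protocol from the paper): every spoke shares a quantum-generated key with the hub, and the hub can give any two spokes a pairwise key by publishing nothing more than the XOR of their hub keys over an authenticated classical channel.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Keys each spoke shares with the hub ("Trent"); in the real network these
# come from the quantum channel, random bytes stand in for them here.
k_alice_trent = secrets.token_bytes(32)
k_bob_trent = secrets.token_bytes(32)

# Trent publishes the XOR of the two hub keys; on its own it reveals neither.
relay_message = xor(k_alice_trent, k_bob_trent)

# Alice combines the public relay message with her own hub key and recovers
# Bob's hub key, which now serves as the pairwise key K_AB.
k_ab_at_alice = xor(relay_message, k_alice_trent)
k_ab_at_bob = k_bob_trent

assert k_ab_at_alice == k_ab_at_bob  # both ends hold the same pairwise key
```

The trade-off is inherent to the topology: Trent learns every pairwise key, which is exactly why the hierarchical trust architecture matters.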

Another key aspect is the quantum resource employed in the network:

The photonic phase-based qubits typically used in optical fiber QC require interferometric stability and inevitably necessitate bulky and expensive hardware. Instead, for NQC we use polarization qubits, allowing the QC transmitters – referred to as QKarDs – to be miniaturized and fabricated using integrated photonics methods [12]. This opens the door to a manufacturing process with its attendant economy of scale, and ultimately much lower-cost QC hardware.

It will be interesting to observe how quickly this technology will be commercialized, and whether US export restrictions on strong cryptography will hamper global adoption.

If a Fighter Writes a Paper to go for the Kill …

You don’t want to take on this man in the ring:

And you don’t want to take on his namesake in the scientific realm.

In my last post I wrote about the Kish cipher protocol, and was wondering about its potential to supplant Quantum Cryptography.

The very same day, as if custom ordered, this fighter’s namesake, none other than Charles Bennett himself, published this pre-print paper (h/t Alessandro F.).

It is not kind to the Kish cipher protocol, and that’s putting it mildly. To quote from the abstract (emphasis mine):

We point out that arguments for the security of Kish’s noise-based cryptographic protocol have relied on an unphysical no-wave limit, which if taken seriously would prevent any correlation from developing between the users. We introduce a noiseless version of the protocol, also having illusory security in the no-wave limit, to show that noise and thermodynamics play no essential role. Then we prove generally that classical electromagnetic protocols cannot establish a secret key between two parties separated by a spacetime region perfectly monitored by an eavesdropper. We note that the original protocol of Kish is vulnerable to passive time-correlation attacks even in the quasi-static limit.

Ouch.

The ref’s counting …

Quantum Cryptography Made Obsolete?

The background story.

Electrical engineering is often overshadowed by other STEM fields. Computer Science is cooler, and physics has the aura of the Faustian quest for the most fundamental truths science can uncover. Yet this discipline has produced a quite remarkable bit of research with profound implications for Quantum Information Science. It is not very well publicized. Maybe that is because it’s a bit embarrassing to the physicists and computer scientists who are heavily invested in Quantum Cryptography?

After all, the typical one-two-punch elevator pitch for QIS is entirely undermined by it. To recap, the argument goes like this:

  1. Universal Quantum Computing will destroy all effective cryptography as we know it.
  2. Fear not, for Quantum Cryptography will come to your rescue.

Significant funds went into the latter. And it’s not as if there hasn’t been significant progress, but what if all this effort proved futile because equally strong encryption could be had with far more robust methods? This is exactly what the Kish cipher protocol promises. It has been around for several years, and in a recent paper, Laszlo Bela Kish discusses several variations of his protocol, which he modestly calls the Kirchhoff-Law-Johnson-(like)-Noise (KLJN) secure key exchange – although otherwise it goes by his name in the literature. A 2012 paper that describes the principle behind it can be found here. The abstract of the latter makes no bones about the challenge to Quantum Information Science:

It has been shown recently that the use of two pairs of resistors with enhanced Johnson-noise and a Kirchhoff-loop—i.e., a Kirchhoff-Law-Johnson-Noise (KLJN) protocol—for secure key distribution leads to information theoretic security levels superior to those of a quantum key distribution, including a natural immunity against a man-in-the-middle attack. This issue is becoming particularly timely because of the recent full cracks of practical quantum communicators, as shown in numerous peer-reviewed publications.

There are some commonalities between quantum cryptography and this alternative, inherently secure, protocol. The obvious one is that they are both key exchange schemes; the more interesting one is that they both leverage fundamental physical properties of the systems they employ. In one case, it is the specific quantum correlations of entangled qubits; in the other, the correlations in classical thermodynamic noise (i.e. the former picks out the specific quantum entanglement correlations of the system’s density matrix, while the latter only requires the classical entries that remain after decoherence and tracing of the density matrix).
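To make that parenthetical remark concrete, here is the standard textbook picture (not taken from the cited papers): a maximally entangled Bell pair before and after full decoherence.

```latex
% Bell-state density matrix in the computational basis, and its decohered form
\rho = |\Phi^+\rangle\langle\Phi^+|
     = \frac{1}{2}\begin{pmatrix} 1 & 0 & 0 & 1 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 1 & 0 & 0 & 1 \end{pmatrix}
\quad\xrightarrow{\text{decoherence}}\quad
\rho_{\mathrm{cl}} = \frac{1}{2}\begin{pmatrix} 1 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{pmatrix}
```

Quantum key distribution lives off the off-diagonal terms that vanish on the right-hand side; the KLJN scheme only needs the classical correlation that survives on the diagonal, i.e. perfectly correlated 00/11 outcomes.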

Since this protocol works in the classical regime, it shouldn’t come as a surprise that it is much easier to implement than a scheme that has to create and preserve an entangled state. The following schematic illustrates the underlying principle:

Core of the KLJN secure key exchange system. Alice encodes her message by connecting these two resistors to the wire in the required sequence. Bob, on the other hand, connects his resistors to the wire at random.

The recipient (Bob) connects his resistors to the wire at random, in predefined synchronicity with the sender (Alice). The actual current and voltage on the wire are random, ideally pure Johnson noise. The resistors determine the statistical character of this voltage: Bob can determine which resistor Alice used because he knows which one he connected himself, but the fluctuation-dissipation theorem ensures that wire-tapping by an attacker (Eve) is futile. Only the rounds in which Alice and Bob happen to pick different resistors are kept as key bits; Eve can tell from the noise level that the choices were mixed, but not who picked which, so no key information can be extracted from the signal.
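Here is a toy Monte Carlo sketch of that logic (my own illustration with made-up component values, not code from Kish’s papers): the wire noise only reveals the parallel combination of the connected resistors, so in the mixed rounds Eve learns that the choices differed but not who chose what, while Alice and Bob can each fill in the missing half.

```python
import random
import statistics

R_L, R_H = 1_000.0, 10_000.0  # hypothetical low/high resistor values (ohms)
FOUR_KT_B = 1.6e-17           # 4*k_B*T*bandwidth, an arbitrary illustrative scale
SAMPLES = 5_000               # noise samples recorded per bit exchange

def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

def wire_noise(r_alice, r_bob):
    """Johnson noise on the wire; its variance is set by the parallel resistance."""
    sigma = (FOUR_KT_B * parallel(r_alice, r_bob)) ** 0.5
    return [random.gauss(0.0, sigma) for _ in range(SAMPLES)]

key_alice, key_bob = [], []
for _ in range(200):
    a = random.choice([R_L, R_H])  # Alice's secret resistor choice
    b = random.choice([R_L, R_H])  # Bob's secret resistor choice
    noise = wire_noise(a, b)

    # Eve estimates the parallel resistance from the publicly measurable noise.
    r_eve = statistics.pvariance(noise) / FOUR_KT_B

    if a != b:
        # Mixed round: Eve's estimate is ~parallel(R_L, R_H) no matter who holds
        # which resistor, so it carries no key information.
        assert abs(r_eve - parallel(R_L, R_H)) / parallel(R_L, R_H) < 0.2
        # Alice and Bob, each knowing their own choice, infer the other's.
        key_bob.append(1 if b == R_H else 0)
        key_alice.append(1 if a == R_L else 0)  # Bob holds the opposite resistor

print(key_alice == key_bob)  # True: the mixed rounds yield a shared key
```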

Given that the amount of effort and funding that goes into Quantum Cryptography is substantial (some even mock it as a distraction from the ultimate prize, which is quantum computing), it seems to me that the fact that classical thermodynamic resources allow for similar inherent security should give one pause. After all, this line of research may provide a much more robust approach to the next-generation, “Shor-safe”, post-quantum encryption infrastructure.