Category Archives: Quantum Computing

D-Wave Withdrawal Relief

For anybody needing an immediate dose of D-Wave news, Wired has this long, well-researched article (Robert R. Tucci summarized it in visual form on his blog). It strikes a pretty objective tone, yet I find the uncritical acceptance of Scott Aaronson’s definition of quantum productivity a bit odd. As a theorist, Scott is only interested in quantum speed-up. That kind of tunnel vision is not inappropriate for his line of work, just an occupational hazard that goes with the job, but it doesn’t make for a complete picture.

Other than that, the article only has some of the typical minor problems with QM.

At this point, you don’t really expect a journalist to get across how gate model quantum algorithms work, and the article actually does this better than most. But the following bit is rather revealing. The writer, Clive Thompson, describes visually inspecting the D-Wave chip:

Peering in closely, I can just make out the chips, each about 3 millimeters square. The niobium wire for each qubit is only 2 microns wide, but it’s 700 microns long. If you squint very closely you can spot one: a piece of the quantum world, visible to the naked eye.

SQUIDs used as magnetometers don’t have to be very small (photo approximately to scale, as indicated by the handwriting on this lab sample). This is because for that application you want to measure the magnetic flux encompassed by the loop.

An innocuous enough quote, and most physicists wouldn’t find anything wrong with it either, but therein lies the rub. SQUIDs can be fairly large (see photo to the right).

Any superconducting coil can harbour a coherent quantum state, and they can be huge.

The idea that quantum mechanics somehow only governs the microcosm has been with us from its inception, because that’s what was experimentally accessible at the time, i.e. atomic spectra. But it is a completely outdated notion.

This is something I only fully came to grasp after reading Carver Mead’s brilliant little book Collective Electrodynamics. In it, he makes a very compelling case that we are due for another paradigm change. To me, that means dusting off some of Schrödinger’s original wave mechanics ideas. If we were to describe a simple quantum algorithm using that picture, there would be a much better chance of giving non-physicists an idea of how these computation schemes work.
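To illustrate what the wave picture buys you, here is a minimal sketch (my own illustrative NumPy code, nothing from the article): a qubit is just a two-component wave, the simplest quantum operation, a Hadamard gate, mixes the components like a beam splitter, and the measurement statistics are read directly off the amplitudes.

```python
import numpy as np

# A qubit as a two-component wave: amplitudes for the |0> and |1> states.
psi = np.array([1.0, 0.0], dtype=complex)  # start in |0>

# The Hadamard gate mixes the two components, much like a beam splitter.
H = np.array([[1,  1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ psi  # now an equal superposition of |0> and |1>

# Born rule: measurement probabilities are the squared amplitudes of the wave.
probs = np.abs(psi) ** 2
print(probs)  # [0.5 0.5] - a fair quantum coin
```

The same bookkeeping scales to any gate model algorithm; the arrays just get bigger.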

 

The Science Newscycle

As life keeps me otherwise busy, I am again late in finishing my next blog post, but in the meantime this web comic nicely summarizes much of the news dynamics of science in general and quantum computing in particular (h/t my lovely wife Sara).

“The Science Newscycle” by Jorge Cham www.phdcomics.com

 

He Said She Said – How Blogs are Changing the Scientific Discourse

The debate about D-Wave‘s “quantumness” shows no signs of abating, hitting a new high note with the company being prominently featured on Time magazine’s recent cover, prompting a dissection of the article on Scott Aaronson’s blog. This was quickly followed by yet another scoop: a rebuttal by Umesh Vazirani to Geordie Rose, who had recently blogged about the Vazirani et al. paper that casts doubt on D-Wave’s claim to implement quantum annealing. In his take on the Time article, Scott bemoans the ‘he said she said’ template of journalism, which gives all sides equal weight, while acknowledging that Time’s author Lev Grossman quoted him correctly and obviously tried to paint an objective picture.

If I had to pick the biggest shortcoming of the Time article, my choice would have been different: I find that Grossman entirely misses Scott’s role in this story by describing him as “one of the closest observers of the controversy”.

Scott isn’t just an observer in this. For better or worse, he is central to this controversy. As far as I can tell, his reporting on D-Wave’s original demo is what started it to begin with. Unforgettable: his inspired comparison of the D-Wave chip to a roast beef sandwich, which he then famously retracted when he resigned as D-Wave’s chief critic. The latter is something he has done with some regularity, first when D-Wave started to publish results, then after visiting the company, and most recently after the Troyer et al. pre-print appeared on the arXiv (although the second time doesn’t seem to count, since it was just a reiteration of the first resignation).

And they say sandwiches and chips go together ...

Scott’s resignations never seem to last long. D-Wave has a knack for pushing his buttons. And the way he engages D-Wave and associated research is indicative of a broader trend in how blogs are changing the scientific discourse.

For instance, when Catherine McGeoch gave a talk about her benchmarking of the DW2, Scott did not challenge her directly on the spot but took to his blog (a decision he later regretted and apologized for). Anybody who has spent more than five minutes on a Web forum knows how immediate, yet text-only, communication removes inhibitions and leads to more forceful exchanges. In the scientific context, this has the interesting effect of colliding head-on with the loftier public perception of scientists.

It used to be that arguments were conducted only via scientific publications, in person such as at scientific seminars, or in the occasional letter exchange. It is interesting to contemplate how corrosive the arguments between Bohr and Einstein might have turned out if they had been conducted via blogs rather than in person.

But it’s not all bad. In the olden days, science could easily be mistaken for a bloodless intellectual game, but nobody could read through the hundreds of comments on Scott’s blog that day and come away with that impression. On the contrary, the inevitable conclusion is that scientific arguments are fought with no less passion than the most heated bar brawl.

During this epic blog ‘fight’, Scott summarized his preference for the medium thusly:

“… I think this episode perfectly illustrates both the disadvantages and the advantages of blogs compared to face-to-face conversation. Yes, on blogs, people misinterpret signals, act rude, and level accusations at each other that they never would face-to-face. But in the process, at least absolutely everything gets out into the open. Notice how I managed to learn orders of magnitude more from Prof. McGeoch from a few blog comments, than I did from having her in the same room …”

This is by far not the only controversy he has courted, nor is this dynamic unique to his blog. Peter Woit continues the heretical work he started with his ‘Not Even Wrong’ book, Robert R. Tucci fiercely defends his quantum algorithm work when he feels he is not credited, and Sabine Hossenfelder had to ban a highly qualified string theory troll due to his nastiness (she is also a mum of twins, so you know she has practice in being patient, and it’s not like she doesn’t have a good sense of humor). But my second-favorite science blog fight also occurred on Scott’s blog, when Joy Christian challenged him to a bet to promote his theory that supposedly invalidates Bell’s theorem and with it the essential non-locality of quantum mechanics.

It’s instructive to look at the Joy Christian affair and ask how a mainstream reporter could possibly have reported on it. Not knowing Clifford algebra, what could a reporter do but triangulate the expert opinions? There are some outspoken, smart critics who point to mistakes in Joy Christian’s reasoning, yet he claims that these critiques are based on a flawed understanding and have been repudiated. The reporter will also note that doubting Bell’s theorem is very much a minority position, yet a journalist unable to check the math himself can only fall back on the ‘he said she said’ template. After all, this is not a simple, straightforward fact like reporting whether UN inspectors found Saddam Hussein’s weapons of mass destruction (something that, surprisingly, most mainstream media outside the US accomplished just fine). One cannot expect a journalist to settle an open scientific question.

The nature of the D-Wave story is no different: how is Lev Grossman supposed to do anything but report the various stances on each side of the controversy? A commenter on Scott’s blog dismissively pointed out that Grossman doesn’t even have a science degree. As if this made any difference; it’s not as if everybody on each side of the story doesn’t boast such degrees (non-PhDs are in the minority at D-Wave).

The mainstream media report as they always have, but unsettled scientific questions are the exception to the rule, one of the few cases where ‘he said she said’ journalism is actually the best format. For everything else, we fortunately now have the blogs.

One Video to Rule Them All

Updated below

This is essentially an extended update to my last D-Wave post. Rather than stick it there, I think it is important enough to merit its own post. The reason: I wish I could make anybody who plans on writing anything about D-Wave first watch the video below, from the first Q+ Google+ hangout of this year.

It summarizes the results of the paper I blogged about in my last post on the matter, laying out what is objectively known about D-Wave’s machine based on the analyzed data, and sets out to answer three questions:

  1. Does the machine work?
  2. Is it quantum or classical?
  3. Is it faster than a classical computer?

The short version is:

  1. Yes.
  2. Based on their modeling, the D-Wave Two is indeed a true quantum annealer.
  3. While it can beat an off-the-shelf solver, it cannot (yet) outperform, on average, a highly targeted, hand-crafted classical algorithm (for a flavor of such a competitor, see the sketch below).
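Below is a minimal simulated annealing sketch for an Ising-type problem, the problem class the D-Wave machine natively targets. This is my own toy code under simplifying assumptions (a small dense random instance rather than the 512-qubit Chimera graph used in the benchmarks, and none of the optimizations of the actual hand-crafted solvers):

```python
import numpy as np

rng = np.random.default_rng(42)

def simulated_annealing(J, h, sweeps=2000, T_start=3.0, T_end=0.05):
    """Metropolis annealing for the Ising energy
    E(s) = sum_{i<j} J_ij s_i s_j + sum_i h_i s_i, with spins s_i in {-1,+1}.
    J must be symmetric with a zero diagonal."""
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for T in np.geomspace(T_start, T_end, sweeps):  # geometric cooling schedule
        for i in range(n):
            dE = -2 * s[i] * (J[i] @ s + h[i])  # energy change if spin i flips
            if dE < 0 or rng.random() < np.exp(-dE / T):
                s[i] = -s[i]
    return s

# A tiny random instance for demonstration purposes.
n = 16
J = rng.normal(size=(n, n))
J = np.triu(J, 1) + np.triu(J, 1).T
h = rng.normal(size=n)

s = simulated_annealing(J, h)
print("energy found:", 0.5 * s @ J @ s + h @ s)
```

The hand-crafted codes referenced in the benchmark are far more sophisticated than this, but the basic thermal hill-climbing idea is the same.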

Of course there is much more in the video, and I highly recommend watching the whole thing. It comes with a good introduction to the subject, but if you only want the part about the test, you may want to skip to the eleven-minute mark (this way you also cut out some of the cheap shots at completely clueless popular media reports, an attempt at infusing some humor into the subject that may or may not work for you).

 

With regard to point (2), the academic discussion is not settled. A paper with heavyweight names on it just came out (h/t Michael Bacon). It proposes that a similar annealing behavior could be accomplished with a classical set-up after all. To me this is truly academic in the best and worst sense, i.e. a considerable effort to get all the i’s dotted and the t’s crossed. It simply seems a bit far-fetched that the company would set out to build a chip with coupled qubits that behave like a quantum annealer, yet somehow end up with an oddly behaving classical annealer.

From my point of view, it is much more interesting to explore all the avenues that are open to D-Wave to improve their chip, such as this new paper on strategies for a quantum annealer to increase the success probability for hard optimization problems (h/t Geordie Rose).

Update 1

Geordie Rose weighs in on the paper that claims that the D-Wave machine can be explained classically.  He expected a Christmas present and felt he only got socks …

Update 2

Helmut Katzgraber et al. propose in this paper that the current benchmarks use a problem set that is ill-suited to detecting a possible quantum speed-up with D-Wave’s machine.

Scott Aaronson (again) resigns as chief D-Wave critic and endorses their experiments

An exercise in positive spin.

Update below.

The English language is astoundingly malleable. It feels almost as if it were tailor-made for marketing spin. I noticed this long ago (feels like a lifetime) when working in a position that required me to sell software. Positioning products was much easier when I spoke English. Mind you, I never told a blatant lie, but I certainly spun the facts to put our product in the best light, and if a customer committed, I’d do my darnedest to deliver the value that I promised. The kind of customers I dealt with were of course aware of this dance, and perfectly capable of performing their due diligence. From their perspective, in the end, it is always about buying into the vision, knowing full well that a cutting-edge technology, one that will give a real competitive benefit, will of course be too new to be without risk.

During the courting of customers, any salesperson worth their salt will do anything to make the product look as good as possible. One aspect of this is of course to stress the positive things that others are saying about your offerings.

To accomplish this, selective quoting can come in very handy. For instance, after reviewing the latest pre-print paper that looks at the performance of D-Wave’s 503-qubit chip, Scott Aaronson stepped down for the second time as chief D-Wave critic. In the blog post where he announced this, he also observed that on “the ~10% of instances on which the D-Wave machine does best, (…) the machine does do slightly better (…) than simulated annealing”.

This puts in words what the following picture shows in much more detail.

Instance-by-instance comparison of annealing times and wall-clock times. Shown is a scatter plot of the pure annealing time for the DW2 compared to a simulated classical annealer (SA), using an average over 16 gauges on the DW2. This is Figure 6 of the recent benchmark paper. Wall-clock times include the time for programming, cooling, annealing, readout and communication. Gauges refer to different encodings of a problem instance. (Only plots A and B are relevant to the settling of my bet.)

Now, if you don’t click through to Scott’s actual blog post, you may take away the impression that he actually changed his stance. But of course he hasn’t. You can look at the above picture and think the glass is ninety percent empty, or you can proclaim it is ten percent full.

The latter may sound hopelessly optimistic, but let’s contemplate what we are actually comparing. Current computer chips are the end product of half a century of highly focused R&D, with billions of dollars poured into developing them. Yet we know we are getting to the end of the line of Moore’s law. Leakage currents are already a real problem, and the writing is on the wall that we are getting ever closer to the point where the current technology will no longer allow for tighter chip structures.

The D-Wave chip, on the other hand, doesn’t use transistors. It is an entirely different approach to computing, as profoundly different as the analog computers of yore.

The integration density of a chip is usually classified by the length of the silicon channel between the source and drain terminals in its field effect transistors (e.g. 25nm). This measure obviously doesn’t apply to D-Wave, whose quantum chip isn’t even close to that kind of density. Yet with the ridiculously low number of about 500 qubits on D-Wave’s chip, which was developed on a shoestring budget when compared to the likes of Intel or IBM, the machine still manages to hold its own against a modern CPU.

Yes, this is not a universal gate-based quantum computer, and the NSA won’t warm up to it because it cannot implement Shor’s algorithm, nor is there a compelling theoretical reason that you can achieve a quantum speed-up with this architecture. What it is, though, is a completely new way to do practical computing, using circuit structures that leave plenty of room at the bottom. In a sense, it resets the clock to when Feynman delivered his famous and prophetic talk on the potential of miniaturization. This is why, from a practical standpoint, I fully expect to see a D-Wave chip eventually and unambiguously outperform a classical CPU.

On the other hand, if you look at this through the prism of complexity theory, none of this matters; only proof of actual quantum speed-up does.

Scott compares the quantum computing skirmishes he entertains with D-Wave to the American Civil War:

If the D-Wave debate were the American Civil War, then my role would be that of the frothy-mouthed abolitionist pamphleteer

Although clearly tongue-in-cheek, this comparison still doesn’t sit well with me. Fortunately, in this war, nobody will lose life or limb. The worst that could happen is a bruised ego. Yet if we have to stick to this metaphor, I don’t see this as Gettysburg 1863 but as the town of Sandwich in 1812.

Much more will be written on this paper. Once it has fully passed peer review and been published, I will also finally be able to reveal my betting partner. But in the meantime, there is a Google+ meeting scheduled that will allow for more discussion (h/t Mike).

Update

Without carefully reading the paper, a casual observer may come away with the impression that this test essentially just pitted hardware against hardware. Nothing could be further from the truth: considerable effort had to go into constructing impressive classical algorithms to beat the D-Wave machine on its own turf. This Google Quantum AI lab post elaborates on the point (h/t Robert R. Tucci).

Update 2

D-Wave’s Geordie Rose weighs in.

Quantum Computing NSA Round-Up


Ever since the Edward Snowden-provided news broke that the NSA spent in excess of $100M on quantum computing, I have been meaning to address this in a blog post. But Robert R. Tucci beat me to it, and he has some very interesting speculations to add.

He also picked up on this quantum computing article in the South China Morning Post, reporting on research efforts in mainland China. Unfortunately, but unsurprisingly, it is light on technical details. Apparently China follows a shotgun approach of funding all sorts of quantum computing research. The race truly seems to be on.

Not only is China investing in a High Magnetic Field Laboratory to rival the work conducted at the US-based NHMFL, but there are also Prof. Wang Haohua’s efforts based on superconducting circuitry.

Interestingly, the latter may very well follow a script that Geordie Rose speculated on when I asked him where he thinks competition in the hardware space may one day originate. The smart move for an enterprising Chinese researcher would be to take the government’s seed money and focus on retracing a technological path that has already proven to be commercially successful. This won’t get the government an implementation of Shor’s algorithm any faster, but adiabatic factorization may be a consolation prize. After all, that one was already made in China.

But do the NSA revelations really change anything? Hopefully they will add some fuel to the research efforts, but at this point that will be the only effect. The NSA has many conventional ways to listen in on mostly unsecured Internet traffic. On the other hand, RSA with a sufficiently long key length is still safe. For now, if customers were to switch to email hardened in this way, it would certainly make the snoops’ job significantly harder.

Blog Memory Hole Rescue – The Fun is Real

It seems that work and life are conspiring to leave me no time to finish the write-up on my General Fusion visit. I started it weeks ago, but I am still not ready to hit the publish button on this piece.


In the meantime, I highly recommend the following blog that I came across. It covers topics very similar to the ones here, and also shares a similar outlook. For instance, this article beautifully sums up why I never warmed up to Everett’s multiverse interpretation (although I have to admit that reading Julian Barbour’s The End of Time softened my stance a bit; more on this later).

The ‘Fun Is Real’ blog is a cornucopia of good physics writing and should provide many hours of thought-provoking reading material to bridge the dearth of my current posting schedule.

On a side note, given that it goes to the core of the topic of this blog, the following news should not go unmentioned: Australian researchers reportedly have created a cluster state of 10,000 entangled photonic qubits (h/t Raptis T.).

This is orders of magnitude more than has previously been reported. Now, if they were to manage to apply some quantum gates to them, we’d be getting somewhere.

The D-Wave Phenomenon

This is my first installment of the write-up on my recent visit to D-Wave in Burnaby, BC.

No matter where you stand on the merits of D-Wave’s technology, there is no doubt they have already made computing history. Transistors had been the sole basis of our rapidly improving information technology since the last vacuum tube computer was sold in the early sixties. That is, until D-Wave started to ship their first system. Having won business from the likes of NASA and Google, this company is now playing in a different league. D-Wave now gets to present at high-profile events such as the HPC IDP conference, and I strongly suspect that they have caught the eye of the bigger players in this market.

The entire multi-billion dollar IT industry is attached at the hip to the existing computing paradigm, and it abhors cannibalizing existing revenue streams. This is why I am quite certain that, as I write this, SWOT assessments and talking points on D-Wave are being typed up in some nondescript Fortune 500 office buildings (relying on corporate research papers like this one to give them substance). After all, ignoring D-Wave is no longer an option. Large companies like to milk cash cows as long as possible. An innovative powerhouse like IBM, for instance, often follows the pattern of investing in R&D up to productization, but it is prone to holding back even superior technology if it may jeopardize existing lines of business. Rather, it just waits until a competitor forces its hand, and then relies on its size and market depth, in combination with its quietly acquired IP, to squash or co-opt that competitor. IBM excels at this game and seldom loses it (it took somebody as nimble and ruthless as Bill Gates to beat them once).

This challenging competitive landscape weighed on my mind when I recently had an opportunity to sit down with D-Wave founder and CTO Geordie Rose, and so our talk first focused on D-Wave’s competitive position. I expected that patent protection and technological barriers to entry would dominate this part of our conversation, and was very surprised by Geordie’s stance, which certainly defied conventional thinking.
 

Geordie Rose, founder and CTO of D-Wave, in one of the Tardis-sized boxes that house his quantum chips. The interior is cooled close to absolute zero when in operation. If you subscribe to the multiverse interpretation of quantum mechanics, you may argue that it is then in fact bigger on the inside. After all, Hilbert space is a big place.

While he acknowledged the usefulness of the over 100 patents that D-Wave holds, he only considers them effectively enforceable in geographies like North America. Overall, he does not consider them an effective edge to keep out competition, but was rather sanguine that the fate of any computing hardware is to eventually become commoditized. He asserted that the academic community misjudged how hard it would be to produce a device like the D-Wave machine. Now that D-Wave has paved the way, he considers cloning and reverse engineering of this technology to be fairly straightforward. One possible scenario would be a government-funded QC effort in another geography to incubate this new kind of information processing. In that case, patent litigation would be expensive, and may ultimately prove futile. Yet he doesn’t expect these kinds of competitive efforts unless D-Wave’s technology has further matured and proven its viability in the marketplace.

I submitted that the academic push-back, which spreads some FUD with regard to D-Wave’s capabilities, may actually help in this regard. This prompted a short exchange on the disconnect with some of the academic QC community. D-Wave will continue to make its case with additional publications that demonstrate entanglement and the true quantum nature of its processor. But ultimately this is a side-show; the research focus is customer-driven, and to the extent that this means deep learning (e.g. for pattern recognition), the use case of the D-Wave chip is evolving. Rather than using it only as an optimization engine, Geordie explained how multiple solution runs can be used to sample the solution space of a learning problem and facilitate more robust learning (a toy sketch of that idea follows below).
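To make the sampling idea concrete, here is a toy sketch of my own (not D-Wave’s actual method, and with a crude random search standing in for the annealer): rather than keeping a single best answer, repeated runs yield an empirical distribution over low-energy configurations, a far more robust input for a learning algorithm than a point estimate.

```python
from collections import Counter
import numpy as np

rng = np.random.default_rng(7)

def energy(s, J, h):
    """Ising energy with symmetric zero-diagonal couplings J and biases h."""
    return 0.5 * s @ J @ s + h @ s

def sample_solution(J, h, tries=50):
    """Stand-in for one annealer run: the best of a few random configurations."""
    candidates = (rng.choice([-1, 1], size=len(h)) for _ in range(tries))
    return tuple(min(candidates, key=lambda s: energy(s, J, h)))

n = 8
J = rng.normal(size=(n, n))
J = np.triu(J, 1) + np.triu(J, 1).T
h = rng.normal(size=n)

# Many runs -> empirical distribution over near-optimal configurations,
# which a learning algorithm can consume instead of a single point estimate.
counts = Counter(sample_solution(J, h) for _ in range(200))
for config, freq in counts.most_common(3):
    print(f"p~{freq / 200:.2f}  E={energy(np.array(config), J, h):.2f}")
```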

It is the speed of customer-driven innovation that Geordie relies on to give D-Wave a sustainable edge, and ultimately he expects that software and services for his platform will prove to be the key to a sustainable business. The current preferred mode of customer engagement is what D-Wave calls a deep partnership, i.e. working in very close collaboration with the customer’s staff. Yet as the customer base grows, more management challenges appear, since clear lines have to be drawn to mark where the customer’s intellectual property ends and D-Wave’s begins. The company has to be able to re-sell solutions tied to its architecture.

D-Wave is experiencing the typical growing pains of a successful organization, along with some unique high-tech challenges in managing growth. How Geordie envisions tackling those will be the subject of the next installment.

Septimana Mirabilis – Major Quantum Information Technology Breakthroughs

Update 4: The award for the funniest photo commentary on this imbroglio goes to Robert Tucci.

Update 3: Congratulations to D-Wave for their recent sale of the D-Wave Two machine to the non-profit Space Research Association, to be used collaboratively by Google and NASA (h/t Geordie Rose).

Update 2: Scott Aaronson finally weighs in, and as Robert Tucci predicted in the comments, he resumed his sceptical stance.

Update: Link to Catherine McGeoch and Cong Wang’s paper.

D-Wave Cooper-pair states in real space. Now the company that derived its name from this wavefunction makes some waves of its own.

What a week for quantum information science. D-Wave made some major news when the first peer-reviewed paper to conclusively demonstrate that their machine can drastically outperform conventional hardware was announced. It’s hardly a contest. For the class of optimization problems that the D-Wave machines are designed for, the algorithms executed on the conventional chip didn’t even come close: the D-Wave machine solved some of the tested problems about 3,600 times faster than the best conventional algorithm. (I’ll leave it to gizmodo to not mince words.)

Apparently, my back-of-the-envelope calculation from last year, based on the D-Wave One’s performance in a brute force calculation of Ramsey numbers, wasn’t completely off. Back then I calculated that the 128-qubit chip performed at the level of about 300 Intel i7 hex-core CPU cores (the current test ran on the next-generation 512-qubit chip). So I am now quite confident in my ongoing bet.

If conventional hardware requires thousands of conventional cores to beat the current D-Wave machine, then the company has certainly entered a stage where its offering becomes attractive to a wider market. Of course, other factors will weigh in when considering total cost of ownership. The biggest hurdle in this regard will be software, as to date any problem you want to tackle the D-Wave way requires dedicated coding for this machine. At first these skills will be rare and expensive to procure. On the other hand, there are other cost factors working in D-Wave’s favor: although I haven’t seen power consumption numbers, the adiabatic nature of the chip’s architecture suggests that it will require far less wattage than a massive server farm or conventional supercomputer. Ironically, while the latter operate at normal ambient temperature, they will always require far more cooling effort to keep them at that temperature than the D-Wave chips in their deep-freeze vacuum.

That the current trajectory of supercomputer power consumption is unsustainable should be obvious from simply glancing at this chart.

Despite these efforts, there are hard efficiency limits for conventional CMOS transistors. (Click the image for the original pingdom.com article.)

D-Wave matures just at the right time to drive a paradigm change, and I hope they will pursue this opportunity aggressively.

But wait, there’s more. This week was remarkable in unveiling yet another major breakthrough for quantum information technology: at Los Alamos National Labs, an Internet-scalable quantum cryptographic network has been operating without a hitch for the last two years. Now there’s an example of research that will “directly benefit the American people” (something that should please Congressman Lamar Smith, the current controversial chairman of the House of Representatives Committee on Science).

Why it took two years for this news to be published is anybody’s guess. Did somebody just flip a switch and then forget about it? More likely, this research was considered classified for some time.

Certainly this also suggests a technology whose time has come. Governmental and enterprise networks have been compromised at increasing rates, even prompting inflammatory talk of ongoing cyber warfare. And while there have been commercial quantum encryption devices on the market for quite some time now, these have been limited to point-to-point connections. Having a protocol that allows the seamless integration of quantum cryptography into the existing network stack raises this to an entirely different level. This is of course no panacea against security breaches, and it has been criticized as providing superfluous security illusions, since social engineering attacks clearly demonstrate that human users are the weakest link. Nevertheless, I maintain that it has the potential to relegate brute-force attacks to history’s dustbin.

The new quantum protocol uses a typical “hub-and-spoke” topology, as illustrated in the following figure and explained in more detail in the original paper.

The NQC topology maps well onto those widely encountered in optical fiber networks, and permits a hierarchical trust architecture for a “hub” to act as the trusted authority (TA, “Trent”) to facilitate quantum authenticated key exchange.
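To illustrate the trusted-authority idea in the caption, here is my own sketch of the generic trusted-relay scheme (not the actual NQC protocol from the paper): each spoke shares a QKD-generated key with the hub, and Trent relays a session key by one-time-padding it with each spoke’s hub key, so two spokes end up with a shared secret without a direct quantum link between them.

```python
import secrets

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Keys each spoke established with the hub via QKD (simulated with random bytes).
k_alice_hub = secrets.token_bytes(32)  # known only to Alice and Trent
k_bob_hub   = secrets.token_bytes(32)  # known only to Bob and Trent

# Trent mints a session key and relays it, one-time-pad style, to each spoke.
k_session = secrets.token_bytes(32)
to_alice = xor(k_session, k_alice_hub)  # only Alice can strip her hub key
to_bob   = xor(k_session, k_bob_hub)    # only Bob can strip his

# Both spokes recover the same session key; the relay messages alone reveal nothing.
assert xor(to_alice, k_alice_hub) == xor(to_bob, k_bob_hub) == k_session
```

The price, as the name says, is that the hub must be trusted; the quantum part of the protocol secures each spoke-to-hub link against eavesdropping.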

Another key aspect is the quantum resource employed in the network:

The photonic phase-based qubits typically used in optical fiber QC require interferometric stability and inevitably necessitate bulky and expensive hardware. Instead, for NQC we use polarization qubits, allowing the QC transmitters – referred to as QKarDs – to be miniaturized and fabricated using integrated photonics methods [12]. This opens the door to a manufacturing process with its attendant economy of scale, and ultimately much lower-cost QC hardware.

It will be interesting to observe how quickly this technology will be commercialized, and whether the US export restrictions on strong cryptography will hamper its global adoption.

So You Want to Learn About Quantum Computing?

“Students will learn by inhabiting an alternate history where Alan Turing and Richard Feynman meet during World War II and must invent quantum computers to defeat Nazi Germany. As a final project, they will get to program a D-Wave One machine and interpret its results.”

If you are based in Seattle, you will want to keep an eye out for when Paul Pham next teaches the Quantum Computing for Beginners course that follows the exciting narrative outlined above.

For everybody else, there is EdX’s CS191x Quantum Mechanics and Quantum Computation course. I very much hope this course will be a regular offering. Although it lacks the unique dramatic arc of P. Pham’s storyline, this course is nevertheless thoroughly enjoyable.

When I signed up for this course, I didn’t know what to expect.  Mostly, I decided to check it out because I was curious to see how the subject would be taught, and because I wanted to experience how well a web-based platform could support academic teaching.

This course fell during an extremely busy time, not only because of a large professional workload, but also because the deteriorating health of my father required me to fly twice from Toronto to my parents in Germany. Despite this, the time required for the course proved to be entirely manageable. If you have an advanced degree in math, physics or engineering, and want to learn about quantum computing, you shouldn’t shy away from taking this course as long as you have an hour to spare each week. It helps that you can accelerate the video lectures to one and a half times normal speed (although this made Prof. Umesh Vazirani sound a bit like he had inhaled helium).

Prof. Vazirani is a very competent presenter, and you can tell that a lot of thought went into how to approach the subject, i.e. how to ease those who are new to it into the strangeness of quantum mechanics. I was suspicious of the claim made at the outset that the required mathematics would be introduced and developed as needed during the course, but it seems to me that this was executed quite well. (Having already been familiar with the required math, I don’t really know if it’ll work for somebody completely new to it, but it seems to me that the only real prerequisite is a familiarity with linear algebra.)

It is interesting to see discussions posted by individuals who took the course and were apparently subjected to QM for the first time.  One such thread started this way:

“I got 100. It was really a fun. Did I understand anything? I would say I understood nothing.”

To me this illuminates the fact that you simply cannot avoid discussing the interpretation of quantum mechanics. Obviously this subject is still very contentious, and Prof. Vazirani touched on it when discussing the Bell inequalities in a very concise and easy-to-understand manner. Yet, judging from the confusion of these ‘straight A’ students, I think there needs to be more of it. It is not enough to assert that Einstein probably would have reconsidered his stance if he had known about these results. Yes, he would have given up on a conventional local hidden-variable approach, but I am quite certain his preference would then have shifted to finding a topological non-local field theory.
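For reference, the Bell test covered in the course boils down to the CHSH inequality; here it is in standard notation (my summary, not the course’s exact formulation):

```latex
% For measurement settings a, a' (Alice) and b, b' (Bob), with correlators E:
S = E(a,b) - E(a,b') + E(a',b) + E(a',b')
% Any local hidden-variable theory satisfies
|S| \le 2
% while quantum mechanics on a maximally entangled pair reaches Tsirelson's bound
|S| = 2\sqrt{2}
```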

Of course, there is only so much that can be covered given the course’s duration. Some aspects were missing: quantum error correction, topological and adiabatic quantum computing, and especially quantum annealing. The latter was probably the most glaring omission, since it is the only technology in this space that is already commercially available.

Generally, I found that everything that was covered was covered very well. For instance, if you ever wondered how exactly Grover’s and Shor’s algorithms work, you will have learned this after taking the course (a small taste of the former follows below). I especially found the homework assignments to be wonderful brain teasers that helped me take my mind off more worrisome issues at hand. I think I will miss them. They were comprised of well-thought-out exercises, and as with any science course, it is really the exercises that help you understand and learn the material.
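As that small taste, here is a minimal state-vector sketch of Grover’s search (my own illustrative code, not course material): starting from a uniform superposition, roughly (π/4)√N rounds of an oracle phase flip plus inversion about the mean concentrate the amplitude on the marked item.

```python
import numpy as np

n_qubits = 4
N = 2 ** n_qubits
marked = 11  # the index only the oracle can recognize

psi = np.full(N, 1 / np.sqrt(N))  # uniform superposition over all N basis states

for _ in range(int(np.pi / 4 * np.sqrt(N))):  # ~(pi/4)*sqrt(N) = 3 iterations here
    psi[marked] *= -1            # oracle: flip the phase of the marked amplitude
    psi = 2 * psi.mean() - psi   # diffusion: inversion about the mean amplitude

probs = psi ** 2  # psi stays real in this construction
print(np.argmax(probs), round(probs[marked], 3))  # -> 11, probability near 0.96
```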

On the other hand, the assignments and exams also highlighted the strengths and weaknesses of the technology underlying the courseware. Generally, entering formulas worked fine, but sometimes the solver acted up and it wasn’t always entirely clear why (i.e. how many digits were required when giving a numerical answer, or certain algebraically equivalent terms were not recognized properly). While this presented the occasional obstacle, on the upside you get the immediate gratification of instant feedback and very nice progress tracking that allows you to see exactly how you are doing. The following is a screenshot of my final tally. The final fell during a week in which I was especially hard pressed for time, and so I slacked off, just guesstimating the last couple of answers (with mixed results). In comparison to a conventional class, knowing exactly when you have already achieved a passing score via the tracking graph makes this a risk- and stress-free strategy.

A common criticism of online learning in comparison to the established ways of doing things is the missing classroom experience and interaction with the professor and teaching staff. To counter this, discussion boards were linked to all assignments, and discussion of the taught material was encouraged. Unfortunately, since my time was at a premium, I couldn’t participate as much as I would have liked, but I was positively surprised by how responsively the teaching assistants answered questions that were put to them (even over the weekends).

This is all the more impressive given the number of students that were enrolled in this course:

The geographic reach was no less impressive:

Having been sceptical going into this, I have since become a convert. Just as Khan Academy is revolutionizing K-12 education, EdX and similar platforms like Coursera represent the future of academic teaching.