Category Archives: Quantum Mechanics

How many social networks do you need?

The proliferation of social networks seems unstoppable now. Even the big ones can no longer be counted on one hand: Facebook, LinkedIn, GooglePlus, Twitter, Instagram, Tumblr, Pinterest, Snapchat – I am so uncool I didn’t even know about the last one until very recently. It seems there has to be a natural saturation point, with diminishing marginal returns from signing up for yet another one, but apparently we are still far from it.

Recently via LinkedIn I learned about a targeted social network that I happily signed up for, which is quite against my character (i.e. I still don’t have a Facebook account).

Free to join and no strings attached. (This targeted social network is not motivated by a desire to monetize your social graph).

The aptly named International Quantum Exchange for Innovation is a social network set up by DK Matai with the express purpose of bringing together people from all walks of life, anywhere on the globe, who are interested in the next wave of the coming Quantum Technology revolution. If you are as interested in this as I am, then joining this UN of Quantum Technology, as DK puts it, is a no-brainer.

The term ‘revolution’ is often carelessly thrown around, but in this case, when it comes to the new wave of quantum technologies, I think it is more than justified. After all, the first wave of QM-driven technologies powered the second leg of the Industrial Revolution. It started with a bang, in the worst possible manner, when the first nuclear bomb ignited, but the new insights gained led to a plethora of new high tech products.

Quantum physics was instrumental in everything from solar cells, to lasers, to medical imaging (e.g. MRI) and, of course, first and foremost, the transistor. As computers became more powerful, Quantum Chemistry coalesced into an actual field, feeding on the ever increasing computational power. Yet even Moore’s law proved hardly adequate for the field’s insatiable appetite for the compute cycles required by the underlying quantum numerics.

During his (too short) life, Richard Feynman was involved in both the military and civilian applications of quantum mechanics, and his famous “There’s Plenty of Room at the Bottom” talk can be read as a programmatic outline of the first Quantum Technology revolution. This QT 1.0 wave has almost run its course. We made our way to the bottom, but there we encountered entirely new possibilities by exploiting the essential, counter-intuitive non-localities of quantum mechanics. This takes us to the next step, and again Information Technology is at the forefront. It is a testament to Feynman’s brilliance that he anticipated QT 2.0 as well, when he suggested a quantum simulator for the first time, much along the lines of what D-Wave built.

It is apt and promising that the new wave of quantum technology does not start with a destructive big bang, but an intriguing and controversial black box.


 

Quantum Computing Road Map

No, we are not there yet, but we are working on it. Qubit spin states in diamond defects don’t last forever, but they can last outstandingly long even at room temperature (measured in microseconds, which is a long time when it comes to computing).

So this is yet another interesting system added to the list of candidates for potential QC hardware.

Nevertheless, when it comes to the realization of scalable quantum computers, qubit decoherence times may very well be eclipsed in importance by another time span: 20 years, the term for which patents are valid (in the US this can include software algorithms).

With D-Wave and Google leading the way, we may be getting there faster than most industry experts predicted. Certainly the odds are very high that it won’t take another two decades for useable universal QC machines to be built.

But how do we get to the point of bootstrapping a new quantum technology industry? DK Matai addressed this in a recent blog post and identified five key questions, which I attempt to address below (I took the liberty of slightly abbreviating the questions; please check the link for the unabridged version).

The challenges DK laid out will require much more than a blog post (or a LinkedIn comment that I recycled here), especially since his view is wider than Quantum Information science alone. That is why the following thoughts are by no means comprehensive answers, but they may provide a starting point.

1. How do we prioritise the commercialisation of critical Quantum Technology 2.0 components, networks and entire systems both nationally and internationally?

The prioritization should be based on the disruptive potential: take quantum cryptography versus quantum computing, for example. Quantum encryption could stamp out fraud that exploits certain technical weaknesses, but it won’t address the more dominant social engineering deceptions. On the upside, it will also facilitate ironclad cryptocurrencies. Yet, if Feynman’s vision of the universal quantum simulator comes to fruition, we will be able to tackle collective quantum dynamics that are computationally intractable with conventional computers. This encompasses everything from simulating high temperature superconductivity to complex (bio-)chemical dynamics. ETH’s Matthias Troyer gave an excellent overview of these killer apps for quantum computing in his recent Google talk; I especially like his example of nitrogen fixation. Nature manages to accomplish this with minimal energy expenditure in some bacteria, but industrially we only have the century-old Haber-Bosch process, which in modern plants still results in 1/2 ton of CO2 for each ton of NH3. If we could simulate and understand the chemical pathway that these bacteria follow, we could eliminate one of the major industrial sources of carbon dioxide.

2. Which financial, technological, scientific, industrial and infrastructure partners are the ideal co-creators to invent, to innovate and to deploy new Quantum technologies on a medium to large scale around the world? 

This will vary drastically by technology. To pick a basic example, a quantum clock per se is just a better clock, but put it into a Galileo/GPS satellite and the drastic improvement in timekeeping will immediately translate to a higher location triangulation accuracy, as well as allow for a better mapping of the earth’s gravitational field/mass distribution.
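To put a rough number on the clock-to-position link, here is a back-of-the-envelope sketch in Python (the clock uncertainties are illustrative figures of my own, not the specs of any actual Galileo/GPS clock):

```python
# Illustrative sketch: a one-way satellite ranging error is roughly the clock
# error multiplied by the speed of light. The clock errors below are made-up
# illustrative values, not real satellite clock specifications.
C = 299_792_458.0  # speed of light, m/s

def ranging_error(clock_error_s: float) -> float:
    """One-way ranging error (metres) caused by a given clock uncertainty (seconds)."""
    return C * clock_error_s

for dt in (10e-9, 1e-9, 0.1e-9):
    print(f"clock error {dt * 1e9:5.1f} ns  ->  ranging error {ranging_error(dt):5.2f} m")
```

A clock that is ten times more stable shrinks the ranging error tenfold, which is why a better clock translates so directly into better positioning.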

3. What is the process to prioritise investment, marketing and sales in Quantum Technologies to create the billion dollar “killer apps”?

As sketched out above, the real prize to me is universal quantum computation/simulation. Considerable effort has to go into building such machines, but that doesn’t mean that you cannot already start to develop software for them. Any coding for new quantum platforms, even if they are already here (as in the case of the D-Wave 2), will involve emulators on classical hardware, because you want to debug and prove out your code before submitting it to the more expensive quantum resource. In my mind, building such an environment in a collaborative fashion to showcase and develop quantum algorithms should be the first step. To me this appears feasible within an accelerated timescale (months rather than years). I think such an effort is critical to offset the closed-source and tightly license-controlled approach that, for instance, Microsoft is following with its development of the LIQUi|> platform.
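As a minimal illustration of what such an emulator does at its core, here is a toy two-qubit state-vector simulation in Python (my own sketch, not the LIQUi|> API or any D-Wave tooling) that prepares and inspects a Bell state on ordinary classical hardware:

```python
import numpy as np

# Toy classical emulator: represent the two-qubit state as a length-4 complex
# vector and apply gates as matrices. Good enough to debug small circuits
# before spending time on real (and expensive) quantum hardware.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)       # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                     # control = first qubit

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                      # start in |00>
state = np.kron(H, I) @ state                       # Hadamard on the first qubit
state = CNOT @ state                                # entangle: (|00> + |11>)/sqrt(2)

print("amplitudes:   ", np.round(state, 3))
print("probabilities:", np.round(np.abs(state) ** 2, 3))
```

The catch, of course, is that the state vector doubles in size with every added qubit, so this brute-force approach runs out of classical memory long before it runs out of interesting problems – which is precisely why the emulator is for debugging and the quantum hardware is for the real workload.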

4. How do the government agencies, funding Quantum Tech 2.0 Research and Development in the hundreds of millions each year, see further light so that funding can be directed to specific commercial goals with specific commercial end users in mind?

This to me seems to be the biggest challenge. The number of research papers produced in this field is enormous, and much of it is computational theory. While the theory has its merits, I think governmental funding should emphasize programs that have a clearly defined agenda towards ambitious yet attainable goals, i.e. research that will result in actual hardware and/or commercially applicable software implementations (e.g. the UCSB Martinis agenda). Yet governments shouldn’t be in the position of picking a winning design, as was inadvertently done for fusion research, where ITER’s funding requirements are now crowding out all other approaches. The latter is a template for how not to go about it.

5. How to create an International Quantum Tech 2.0 Super Exchange that brings together all the global centres of excellence, as well as all the relevant financiers, customers and commercial partners to create Quantum “Killer Apps”?

On a grassroots level I think open source initiatives (e.g. a LIQUi|> alternative) could become catalysts that bring academic centres of excellence and commercial players into alignment. This at least is my impression based on conversations with several people involved in the commercial and academic realms. On the other hand, as with any open source product, commercialization won’t be easy, yet this may be less of a concern in this emerging industry, as the IP will be in the quantum algorithms, and they will most likely be executed with quantum resources tied to a SaaS offering.

 

The Unintentional Obfuscation of Physics

Sometimes it only takes one person’s untimely demise to change history. There’s an entire genre of literature that explores these possibilities, typically involving the biggest baddies of human history. The following video is an artful example that makes this point rather succinctly – while also leaving me profoundly uncomfortable (after all, it does involve the death of a child).

I am not aware of many examples of exploring alternative histories with regards to science, and by that I mean in more detail than what steampunk has to offer, although William Gibson and Bruce Sterling do a pretty good job of imagining a world in which Charles Babbage succeeded in introducing a mechanical computer, in their book “The Difference Engine”. The subject matter is certainly a worthwhile topic for another post, especially when contrasted with the challenge now of going beyond the Turing machine by bringing Quantum Computing to market. (h/t vznvzn)

William Kingdon Clifford (1845 – 1879). Had he lived longer physics would be taught differently.

The untimely death I am contemplating here is that of William Kingdon Clifford. If you are not immersed in physics and math, you have probably never heard his name, because we live in a world where he died young.

That meant it fell to Gibbs and Heaviside to clean up the Maxwell equations, which gave us the insufferable cross product that confused legions of students by requiring them to distinguish between polar and axial vectors. It also meant that complex function theory got stuck in two dimensions, and that group theory was developed without the obvious geometric connection. This in turn, once the group-theoretic approach started to take over, provoked older physicists, such as Schrödinger, to coin the term “Gruppenpest” (group pestilence). It also created a false symmetry between the electric and magnetic fields, motivating the quest for the ever elusive magnetic monopole. Last but not least, it led to the confused notion that spin is an intrinsically quantum mechanical property, something that is still taught in universities across the globe to this day.

It’s impossible to do Geometric Algebra (GA) justice in one short blog post, but David Hestenes managed to do so in a fairly concise and highly readable paper, the 2002 Oersted Medal Lecture.

It is hard to overstate the profound effect this paper had on me. The only thing it compares to is when I first learned of Euler’s formula many years ago in my first physics semester. And the similarities are striking, not least in the power of bringing together seemingly disparate areas of mathematics by putting them into a geometric context. In the case of Euler’s formula, the key is the imaginary unit, which was originally introduced to solve for negative square roots and thus allows for the fundamental theorem of algebra. In fact, it turns out that complex numbers can be neatly embedded into geometric algebra: they are isomorphic to the 2d GA case. Quaternions, likewise, are part of the 3d geometric algebra and have a similarly satisfying geometric interpretation.

All this is accomplished by introducing a higher-level concept of a vector. For instance, rather than using a cross product, an outer product is defined that creates a bivector, which can be thought of as a directed plane segment.
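To make this concrete, here is a toy sketch of the 2D geometric product in Python (my own illustration, not Hestenes’ notation or an existing GA library). It shows the unit bivector e1e2 squaring to -1, i.e. the isomorphism to the complex numbers mentioned above, and the outer product of two vectors yielding a bivector whose magnitude is the directed area they span:

```python
import numpy as np

# A multivector in 2D geometric algebra is stored as (scalar, e1, e2, e12),
# with e1^2 = e2^2 = +1 and e12 = e1 e2.
def gp(a, b):
    """Geometric product of two 2D multivectors."""
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 + a1*b1 + a2*b2 - a3*b3,   # scalar part
        a0*b1 + a1*b0 - a2*b3 + a3*b2,   # e1 part
        a0*b2 + a1*b3 + a2*b0 - a3*b1,   # e2 part
        a0*b3 + a1*b2 - a2*b1 + a3*b0,   # e12 (bivector) part
    ])

e1  = np.array([0, 1, 0, 0])
e2  = np.array([0, 0, 1, 0])
e12 = gp(e1, e2)

print("e1 e2     =", e12)                # the unit bivector
print("(e1 e2)^2 =", gp(e12, e12))       # squares to -1, just like the imaginary unit

# Outer product = antisymmetric part of the geometric product: a pure bivector.
u = 2*e1 + 1*e2
v = 1*e1 + 3*e2
print("u ^ v     =", (gp(u, v) - gp(v, u)) / 2)   # e12 component = 2*3 - 1*1 = 5
```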

Hestenes makes a convincing case that geometric algebra should be incorporated into every physics curriculum. He wrote some excellent textbooks on the subject, and thankfully, numerous other authors have picked up the mantle (outstanding is John W. Arthur’s take on electrodynamics and Chris Doran’s ambitious and extensive treatment).

The advantages of geometric algebra are so glaring and the concepts so natural that one has to wonder why it took a century to be rediscovered.  John Snygg puts it best in the preface to his textbook on differential geometry:

Although Clifford was recognized worldwide as one of England’s most distinguished mathematicians, he chose to have the first paper published in what must have been a very obscure journal at the time. Quite possibly it was a gesture of support for the efforts of James Joseph Sylvester to establish the first American graduate program in mathematics at Johns Hopkins University. As part of his endeavors, Sylvester founded the American Journal of Mathematics and Clifford’s first paper on what is now known as Clifford algebra appeared in the very first volume of that journal.

The second paper was published after his death in unfinished form as part of his collected papers. Both of these papers were ignored and soon forgotten. As late as 1923, math historian David Eugene Smith discussed Clifford’s achievements without mentioning “geometric algebra” (Smith, David Eugene 1923). In 1928, P.A.M. Dirac reinvented Clifford algebra to formulate his equation for the electron. This equation enabled him to predict the discovery of the positron in 1931. (…)

Had Clifford lived longer, “geometric algebra” would probably have become mainstream mathematics near the beginning of the twentieth century. In the decades following Clifford’s death, a battle broke out between those who wanted to use quaternions to do physics and geometry and those who wanted to use vectors. Quaternions were superior for dealing with rotations, but they are useless in dimensions higher than three or four without grafting on some extra structure.

Eventually vectors won out. Since the structure of both quaternions and vectors are contained in the formalism of Clifford algebra, the debate would have taken a different direction had Clifford lived longer. While alive, Clifford was an articulate spokesman and his writing for popular consumption still gets published from time to time. Had Clifford participated in the quaternion–vector debate, “geometric algebra” would have received more serious consideration.

D-Wave Withdrawal Relief

For anybody needing an immediate dose of D-Wave news, Wired has this long, well researched article (Robert R. Tucci summarized it in visual form on his blog). It strikes a pretty objective tone, yet I find the uncritical acceptance of Scott Aaronson’s definition of quantum productivity a bit odd.  As a theorist, Scott is only interested in quantum speed-up. That kind of tunnel vision is not inappropriate for his line of work, just an occupational hazard that goes with the job, but it doesn’t make for a complete picture.

Other than that, the article only has some typical minor problems with QM.

At this point, you don’t really expect a journalist to get across how gate model quantum algorithms work, and the article actually does this better than most. But the following bit is rather revealing. The writer, Clive Thompson, describes visually inspecting the D-Wave chip:

Peering in closely, I can just make out the chips, each about 3 millimeters square. The niobium wire for each qubit is only 2 microns wide, but it’s 700 microns long. If you squint very closely you can spot one: a piece of the quantum world, visible to the naked eye.

SQUIDs for magnetometers don’t have to be very small. (Photo approximately to scale – as indicated by the handwriting on this lab sample). This is because for this application you want to measure the magnetic flux encompassed by the loop.

Innocuous enough quote, and most physicists wouldn’t find anything wrong with it either, but therein lies the rub. SQUIDs can be fairly large (see photo to the right).

Any superconducting coil can harbour a coherent quantum state, and they can be huge.
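A rough calculation shows why a large pickup loop helps (the numbers are my own illustrative choices, not from the article): the flux threading the loop, counted in superconducting flux quanta, grows with the loop area.

```python
# Flux through a square pickup loop in Earth's magnetic field, expressed in
# superconducting flux quanta. Illustrative numbers only.
h = 6.62607015e-34        # Planck constant, J*s
e = 1.602176634e-19       # elementary charge, C
phi0 = h / (2 * e)        # flux quantum h/2e, ~2.07e-15 Wb

B_EARTH = 50e-6           # Earth's field, roughly 50 microtesla
for side_cm in (0.1, 1.0, 10.0):
    area = (side_cm * 1e-2) ** 2          # loop area, m^2
    flux = B_EARTH * area                 # flux through the loop, Wb
    print(f"{side_cm:5.1f} cm loop: {flux / phi0:.2e} flux quanta in Earth's field")
```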

The idea that quantum mechanics somehow only governs the microcosm has been with us from its inception, because that is what was experimentally accessible at the time, i.e. atomic spectra. But it is a completely outdated notion.

This is something I only fully came to grasp after reading Carver Mead’s brilliant little book Collective Electrodynamics. In it, he makes a very compelling case that we are due for another paradigm change. To me, the latter means dusting off some of Schrödinger’s original wave mechanics ideas. If we were to describe a simple quantum algorithm using that picture, there would be a much better chance of giving non-physicists an idea of how these computation schemes work.

 

Blog Memory Hole Rescue – The Fun is Real

It seems that work and life are conspiring to leave me no time to finish my write-up on my General Fusion visit. I started it weeks ago, but I am still not ready to hit the publish button on this piece.


In the meantime I highly recommend the following blog that I came across. It covers topics very similar to the ones here, and also shares a similar outlook. For instance, this article beautifully sums up why I never warmed up to Everett’s Multiverse interpretation (although I have to admit reading Julian Barbour’s The End of Time softened my stance a bit – more on this later).

The ‘Fun Is Real’ blog is a cornucopia of good physics writing and should provide many hours of thought-provoking reading material to bridge over the dearth of my current posting schedule.

On a side note, given that this goes to the core of the topic I write about on this blog, the following news should not go unmentioned:  Australian researchers reportedly have created a cluster state of 10,000 entangled photonic qubits (h/t Raptis T.).

This is orders of magnitude more than has been previously reported. Now if they were to manage to apply some quantum gates to them, we would be getting somewhere.

Blog Round-Up

Lots of travel last week delayed the second installment on my D-Wave visit write-up, but I came across some worthy re-blog material to bridge the gap.

I am usually very hard on poorly written popular science articles, which is all the more reason to point to some outstanding material in this area. I found that one writer, Brian Dodson, at the Gizmag site usually delivers excellent content. Due to his science background, he brings an unusual depth of understanding to his writing. His latest pieces are on General Relativity-compatible alternatives to dark energy and a theoretical quantum black hole study that puts the loop quantum gravity approach to some good use. The latter is a good example of why I am much more inclined towards Loop Quantum Gravity than towards the ephemeral String theory, as the former at least delivers some predictions.

Another constant topic of this blog is the unsatisfying situation with regards to the foundational interpretations of Quantum Mechanics.  Lack of progress in this area can in no small measure be attributed to the ‘Shut up and calculate’ doctrine, a famous  quip attributed to Feynman that has since been enshrined as an almost iron rule.

To get a taste of how pervasively this attitude permeates the physics community, this arXiv paper/rant is a must-read. From the abstract:

If you have a restless intellect, it is very likely that you have played at some point with the idea of investigating the meaning and conceptual foundations of quantum mechanics. It is also probable (albeit not certain) that your intentions have been stopped in their tracks by an encounter with some version of the “Shut up and calculate!” command. You may have heard that everything is already understood. That understanding is not your job. Or, if it is, it is either impossible or very difficult. Maybe somebody explained to you that physics is concerned with “hows” and not with “whys”; that whys are the business of “philosophy” -you know, that dirty word. That what you call “understanding” is just being Newtonian; which of course you cannot ask quantum mechanics to be. Perhaps they also complemented this useful advice with some norms: The important thing a theory must do is predict; a theory must only talk about measurable quantities. It may also be the case that you almost asked “OK, and why is that?”, but you finally bit your tongue. If you persisted in your intentions and the debate got a little heated up, it is even possible that it was suggested that you suffered of some type of moral or epistemic weakness that tends to disappear as you grow up. Maybe you received some job advice such as “Don’t work in that if you ever want to own a house”.

At least if this blog post is any indication, the times seem to be changing and becoming more permissive.

The Other Kind of Cold Fusion

Nature clearly favours hot fusion no matter how cold the light. The cold glow in this image stems from a Blue Giant that is believed to orbit a black hole in the Cygnus X-1 system.

If you lived through the eighties there are certain things you could not miss, and since this is a science blog I am of course not referring to fashion aberrations, like mullets and shoulder pads, but rather to what is widely regarded as one of the most notorious science scandals to date: Fleischmann and Pons Cold Fusion, the claim of tapping the ultimate energy source within a simple electrochemical cell.

This blog’s author’s photo proves that he lived through the eighties. Since this driver’s licence picture was taken the same year as the Fleischmann and Pons disaster, the half smile was all that I could muster.

For a short time it felt like humanity’s prayers to be delivered from fossil fuels had been answered (at least to those who believe in that sort of thing). Of course, paying the ever increasing price at the gas pump is a constant (painful) reminder that this euphoric moment at the end of the eighties was but a short lived aberration. But back then it felt so real. After all, there already existed a well-known process that allowed for nuclear fusion at room temperature, catalyzed by the enigmatic muons. One of the first scientific articles that I read in English was on that phenomenon, and it was published just a couple of years earlier. So initial speculation abounded that maybe muons from the cosmic ray background could somehow help trigger the reported reaction (although there was no explanation given as to how this low muon flux density could possibly accomplish this). While my fringe blog focuses on the intrepid researchers who, despite the enormous blowback, still work on Fleischmann-Pons-style research, this post is about the other kind, the oft forgotten muon-catalyzed fusion.

It is a beautiful nuclear reaction, highlighting two of the most basic peculiarities of quantum mechanics: quantum tunnelling and the Heisenberg uncertainty principle. Both are direct consequences of the manifest wave properties of matter at this scale. The former allows matter to seep into what should be impenetrable barriers, and the latter describes how a bound point particle is always “smeared out” over a volume – as if points were an abstraction that nature abhors. Last but not least, the reaction showcases the mysterious muon, a particle that seems to be identical to the electron in every way but mass and stability (about 200 times the mass and a pretty long mean lifetime of about 2 μs). Because it behaves just like a heavier twin of the electron, it can substitute for the latter in atoms and molecules.

The Heisenberg uncertainty principle states that the product of the momentum (mass times velocity) and position uncertainties has a lower bound. Usually the term uncertainty is interpreted purely probabilistically, in terms of the standard deviation about the expectation value. But this view, while formally entirely correct, obscures the very real physical implication of trying to squeeze a particle into a small space: the momentum uncertainty then becomes a tangible physical effect of quantum matter. The particle’s velocity distribution becomes ever broader, forcing the matter outwards and creating an orbital ‘cloud’ (e.g. the spherical hydrogen s-orbital). There is really no good analogy in our everyday experience; they all sound silly. My version is that of a slippery bar of soap in a spherical sink: the harder you try to grasp it, the more powerfully you send it flying. If you were to map all trajectories of the soap over time, you would find that on average it was anywhere in the sink, with the probability decreasing towards the rim (that is, unless you squeeze it so hard that it acquires enough speed to jump out of the sink – I guess that would be an analog to ionization). In the atomic and chemical realm, on the other hand, the very concept of a trajectory doesn’t hold up (unless you are dealing with Rydberg atoms). You may as well think of electron orbitals as charge distributions, as this is exactly how they behave in the chemical domain.
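To see how real this effect is, here is a back-of-the-envelope estimate (illustrative numbers of my own) of the kinetic energy the uncertainty principle alone forces on an electron confined to ever smaller regions:

```python
# Minimum momentum spread from dx * dp >= hbar/2, turned into a kinetic energy
# scale E ~ dp^2 / (2m). A pure order-of-magnitude estimate.
hbar = 1.054571817e-34    # J*s
m_e  = 9.1093837015e-31   # electron mass, kg
eV   = 1.602176634e-19    # joules per electronvolt

for dx_nm in (1.0, 0.1, 0.01):            # confinement length scales
    dx = dx_nm * 1e-9
    dp = hbar / (2 * dx)                  # minimum momentum uncertainty
    e_kin = dp**2 / (2 * m_e)             # corresponding kinetic energy scale
    print(f"confined to {dx_nm:5.2f} nm  ->  kinetic energy ~ {e_kin / eV:8.3f} eV")
```

Squeeze the electron into a tenth of the space and the energy scale goes up a hundredfold – the slippery soap pushes back harder the tighter you grip.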

Because the momentum rather than the velocity enters into the uncertainty relation, the orbitals for a heavier version of the electron are considerably smaller – about 200 times smaller for the muon, as this is the factor by which the particle’s velocity can be reduced while still yielding the same momentum. So muonic hydrogen is much smaller than the electronic version. That is already all that is needed to get fusion going, because if two heavy hydrogen nuclei are bound in a muonic μH2 molecule they are far too close for comfort. Usually the repulsive electrostatic Coulomb potential is enough to keep them apart, but the quantum tunnel effect allows them to penetrate the ‘forbidden’ region. And at this distance, the probability that both nuclei occupy the same space becomes large enough to yield measurable rates of nuclear fusion, i.e. μH2 → μHe.
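The mass scaling can be checked with the standard textbook Bohr radius formula (a minimal sketch; the factor of ~200 is softened slightly once the reduced mass is taken into account):

```python
# Bohr radius a = hbar / (mu * c * alpha), with mu the reduced mass of the
# orbiting particle and the proton. Standard textbook formula.
hbar  = 1.054571817e-34   # J*s
c     = 2.99792458e8      # speed of light, m/s
alpha = 1 / 137.035999    # fine-structure constant
m_e   = 9.1093837015e-31  # electron mass, kg
m_mu  = 206.768 * m_e     # muon mass
m_p   = 1836.153 * m_e    # proton mass

def bohr_radius(m_orbiting, m_nucleus):
    mu = m_orbiting * m_nucleus / (m_orbiting + m_nucleus)   # reduced mass
    return hbar / (mu * c * alpha)

a_e  = bohr_radius(m_e, m_p)
a_mu = bohr_radius(m_mu, m_p)
print(f"electronic hydrogen: {a_e * 1e12:6.2f} pm")
print(f"muonic hydrogen:     {a_mu * 1e12:6.2f} pm  (~{a_e / a_mu:.0f} times smaller)")
```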

The hydrogen used in the experimental realization is not the usual kind; as with other fusion schemes, the heavier hydrogen isotopes deuterium and tritium are required, and since there is only one muon in the mix the d-t hydrogen is ionized, so that the equation looks more like this: (d-μ-t)+ → n + α (with n indicating a fast neutron and α a Helium-4 nucleus).

The latter causes a lot of trouble, as the muon ‘sticks’ to this alpha particle with a 1% chance (making it a muonic helium ion). If this happens, the muon is no longer available to catalyze more fusion events. This, in combination with the limited lifetime of the muons and the ‘set-up’ time the muons need to bind to the hydrogen isotopes, is the limiting factor of this reaction.

Without a constant massive resupply of muons the reaction tapers off quickly. Despite decades of research this problem could never be surmounted. It takes pions to make muons, and the former are only produced in high energy particle collisions. This costs significantly more energy than the cold muon-catalyzed fusion can recoup.
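A crude energy balance makes the point (the fusion yield and the sticking probability are as quoted above; the accelerator cost of roughly 5 GeV per muon is my own order-of-magnitude assumption):

```python
# Best-case energy recovered per muon versus the energy spent to create it.
# Order-of-magnitude numbers only.
E_FUSION_MEV  = 17.6    # energy released per d-t fusion event
STICKING_PROB = 0.01    # ~1% chance per cycle that the muon sticks to the alpha
MUON_COST_GEV = 5.0     # assumed accelerator energy cost to produce one muon

max_cycles = 1 / STICKING_PROB                   # ~100 catalysis cycles before sticking
energy_out = max_cycles * E_FUSION_MEV / 1000    # GeV recovered per muon, upper bound
print(f"best case per muon: ~{energy_out:.1f} GeV out vs ~{MUON_COST_GEV:.1f} GeV in")
```

Even in this optimistic accounting the books don’t balance, which is why the energy cost of making muons has remained the showstopper.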

But there is one Australian company that claims it has found a new, less costly way to make pions. It is certainly a very interesting ‘cold fusion’ start-up and at first glance seems far more credible than the outfits that my fringe blog covers. On the other hand, this company treats its proprietary pion production process with a level of secrecy that is reminiscent of the worst players in the LENR world. I could not find any hint of how this process is supposed to work, nor why it can supposedly produce sufficient amounts of muons to make this commercially exploitable. (Pions could also be generated in two-photon processes, but this would require even more input energy.) So on second read, the claims of Australia’s Star Scientific don’t really sound any less fantastic than the boasting of any other cold fusion outfit.

Any comments that could illuminate this mystery are more than welcome. Preliminary google searches on this company are certainly not encouraging.

Time Crystal – A New Take on Perpetual Motion

Update: Here’s the link to Wilczek’s time crystal paper.

Not a time crystal but perpetually moving at room temperature. (Illustration of Nitrogen-inversion).

It is a given that at room temperature there is plenty of chaotic and truly perpetual motion to be had. And sometimes this motion takes on more organized forms, as is the case with nitrogen inversion.

Also it is well established that unexpected movements can occur close to absolute zero, when for instance superfluid liquids climb up the walls of their containment.

In general, unperturbed quantum systems evolve in a unitary manner (i.e. a kind of movement) and will do so perpetually, until measured.

In the case of super-sized Rydberg atoms you can also approach an almost classical orbit (and that should hold at very low temperatures as well). But to have sustained, detectable perpetual motion in the ground state of a system at absolute zero would be something qualitatively new.

That is what “Time Crystals” might be adding to the quantum cabinet of oddities. The idea that led to this theoretical prediction, formulated by Frank Wilczek, is indeed quite clever:

“I was thinking about the classification of crystals, and then it just occurred to me that it’s natural to think about space and time together, (…) So if you think about crystals in space, it’s very natural also to think about the classification of crystalline behavior in time.”

It’ll be up to some creative experimentalist to determine if the resulting theory holds water. If so, this may open up an interesting new avenue to tackle the frustrating problem of getting General Relativity (where space and time form a combined entity) and QM to play together.

So You Want to Learn About Quantum Computing?

“Students will learn by inhabiting an alternate history where Alan Turing and Richard Feynman meet during World War II and must invent quantum computers to defeat Nazi Germany. As a final project, they will get to program a D-Wave One machine and interpret its results.”

If you are based in Seattle then you want to keep an eye out for when Paul Pham next teaches the Quantum Computing for Beginners course that follows the exciting narrative outlined above.

For everybody else, there is EdX‘s CS191x Quantum Mechanics and Quantum Computation course. I very much hope this course will be a regular offering. Although it lacks the unique dramatic arc of P. Pham’s story line, this course is nevertheless thoroughly enjoyable.

When I signed up for this course, I didn’t know what to expect.  Mostly, I decided to check it out because I was curious to see how the subject would be taught, and because I wanted to experience how well a web-based platform could support academic teaching.

This course fell during an extremely busy time, not only because of a large professional workload, but also because the deteriorating health of my father required me to fly twice from Toronto to my parents in Germany. Despite this, the time required for this course proved to be entirely manageable. If you have an advanced degree in math, physics or engineering, and want to learn about Quantum Computing, you shouldn’t shy away from taking this course as long as you have an hour to spare each week. It helps that you can accelerate the video lectures to 1.5 times normal speed (although this made Prof. Umesh Vazirani sound a bit like he had inhaled helium).

Prof. Vazirani is a very competent presenter, and you can tell that a lot of thought went into how to approach the subject, i.e. how to ease into the strangeness of Quantum Mechanics for those who are new to it. I was suspicious of the claim made at the outset that the required mathematics would be introduced and developed as needed during the course, but it seems to me that this was executed quite well. (Having already been familiar with the required math, I don’t really know if it will work for somebody completely new to it, but it seems that the only real prerequisite was a familiarity with linear algebra.)

It is interesting to see discussions posted by individuals who took the course and were apparently subjected to QM for the first time.  One such thread started this way:

“I got 100. It was really a fun. Did I understand anything? I would say I understood nothing.”

To me this illuminates the fact that you simply cannot avoid the discussion of the interpretation of quantum mechanics. Obviously this subject is still very contentious, and Prof. Vazirani touched on it when discussing the Bell inequalities in a very concise and easy to understand manner. Yet, judging from the confusion of these ‘straight A’ students, I think there needs to be more of it. It is not enough to assert that Einstein probably would have reconsidered his stance if he had known about these results. Yes, he would have given up on a conventional local hidden variable approach, but I am quite certain his preference would then have shifted to finding a topological non-local field theory.

Of course, there is only so much that can be covered given the course’s duration. Other aspects were missing: Quantum Error Correction, Topological and Adiabatic Quantum Computing, and especially Quantum Annealing. The latter was probably the most glaring omission, since this is the only technology in this space that is already commercially available.

Generally, I found that everything that was covered was covered very well. For instance, if you ever wondered how exactly Grover’s and Shor’s algorithms work, you will have learned this after taking the course. I especially found the homework assignments wonderful brain teasers that helped me take my mind off more worrisome issues at hand. I think I will miss them. They consisted of well thought out exercises, and as with any science course, it is really the exercises that help you understand and learn the material.

On the other hand, the assignments and exams also highlighted the strengths and weaknesses of the technology underlying the courseware. Generally, entering formulas worked fine, but sometimes the solver was acting up and it wasn’t always entirely clear why (e.g. how many digits were required when giving a numerical answer, or certain algebraically equivalent terms were not recognized properly). While this presented the occasional obstacle, on the upside you get the immediate gratification of instant feedback and a very nice progress tracker that allows you to see exactly how you are doing. The following is a screenshot of my final tally. The final fell during a week in which I was especially hard pressed for time, so I slacked off, just guesstimating the last couple of answers (with mixed results). In comparison to a conventional class, knowing exactly when you have already achieved a passing score via the tracking graph makes this a risk- and stress-free strategy.

A common criticism of online learning in comparison to the established ways of doing things is the missing classroom experience and interaction with the professor and teaching staff. To counter this, discussion boards were linked to all assignments, and discussion of the taught material was encouraged. Unfortunately, since my time was at a premium, I couldn’t participate as much as I would have liked, but I was positively surprised by how responsively the teaching assistants answered the questions that were put to them (even over the weekends).

This is all the more impressive given the numbers of students that were enrolled in this course:

The geographic reach was no less impressive:

Having been sceptical going into this, I’ve since become a convert. Just as Khan Academy is revolutionizing K-12 education, EdX and similar platforms like Coursera represent the future of academic teaching.

 

Fun Stuff: When Shakespeare meets Schrödinger


In the LinkedIn discussion associated with my previous post, commenters had some fun with the Shakespeare-inspired headline. Clearly, if Shakespeare had known Quantum Mechanics and the superposition that holds Schrödinger’s cat in limbo, some of the classic pieces would have sounded slightly different. Dr. Simon J.D. Phoenix had this brilliant take on it:

“To be, or not to be, or maybe both

–that is the question:
Whether ’tis nobler in the mind to calculate
The slings and arrows of outrageous quanta
Or to take arms against a sea of interpretations
And by opposing end them.
To sleep, to wake —
No more, but both –and by a sleep to say we end
The headache, and the thousand natural shocks
That Bohr bequeathed. ‘Tis a consummation
Devoutly to be wished. To wake, to sleep–
To sleep–perchance to dream: ay, there’s the rub,
For in that sleep of Copenhagen what dreams may come
When we have shuffled all our mortal calculations,
Must give us pause. There’s the Aspect
That makes calamity of so entangled a life.
For who would bear the Bells and Wittens of time,
Th’ position’s wrong, the proud momentum’s contumely
The pangs of despised theory, the quantal law’s decay,
The insolence of academic office, and the spurns
That patient merit of th’ unworthy unlearned takes,
When he himself might his quietus make
With a bare bra-ket? Who would fardels bear,
To grunt and sweat under a weary state vector,
But that the dread of something not quite real,
The undiscovered counterfactual, from whose bourn
No traveller returns, puzzles the will,
And makes us rather bear those classical ills we have
Than fly to others that we know not of?
Thus common sense does make cowards of us all,
And thus the native hue of resolution
Is sicklied o’er with the pale cast of Heisenberg,
And enterprise of great position and momentum
With this regard their currents turn awry
And lose the name of action. — Soft you now,
The fair Dirac — noble and precise, in thy orisons
Be all my spins remembered.”