Friday, August 27, 2010

Know your science

Except for the TV show "The Big Bang Theory", popular culture gets science wrong. We all know that.

But there's a way it tends to get science wrong that upsets me more than most. That is when it misuses the tools of science by willfully ignoring what science actually means.

One common example is celebrity equations, wherein some mathematical-looking expression mixes two or more celebrities together, as in (I'm making this one up and I'm not a cultural critic, let alone a comic, so please bear with me): Lady Gaga = (2*Madonna + Carrot Top)/3. Mathematically savvy readers will recognize that I normalized that equation. If you don't know what that means, you shouldn't be writing celebrity equations, because mathematical equations mean something; they're not just symbols. Like musical comedy based on bad notes, bogus mathematical equations are not funny, just lazy.
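
To make the normalization point concrete, here's a small sketch (the "scores" are entirely invented, obviously): when the weights sum to 1, the expression is a weighted average, so the result lands on the same scale as its inputs instead of three times it.

```python
# Why the equation above is "normalized": the weights 2/3 and 1/3
# sum to 1, making the expression a weighted average. The result
# stays on the same scale as the inputs.

def weighted_average(values, weights):
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(v * w for v, w in zip(values, weights))

# Toy "celebrity scores" on a 0-10 scale (entirely made up):
madonna, carrot_top = 9.0, 3.0
lady_gaga = weighted_average([madonna, carrot_top], [2/3, 1/3])
print(lady_gaga)  # close to 7, still on the 0-10 scale
```

Drop the division by 3 and the right-hand side would be a number near 30 on a 0-to-10 scale, which is exactly the kind of nonsense the joke-writers don't notice.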

Some years ago I even wrote a letter to Entertainment Weekly when they had a long article full of egregious celebrity equations. To their credit, they published the letter and even mended their ways for a while. I quote the letter here:

According to EW math, the more buzz or intelligence you have, the less likely you are to be on the It List. That may be true, but I bet you didn't mean that. Your equation is art-directed nonsense. EW seems to think the joke is that the equations look cute: If Einstein is funny, his square root is hilarious...
In short, mathematics may look funny if you don't understand it but that doesn't make it funny if you misuse it in ignorance.

Another sort of abuse is comedy periodic tables: periodic tables of the vegetables, periodic tables of the desserts, periodic tables of the presidents, and on and on. There are zillions of them. I believe the vegetables one was the first widely distributed example.

What's wrong with them? Again, they miss the point about the one true periodic table, Mendeleev's periodic table of the elements. In fact, to put things with no structure into a periodic table not only misses the point of the periodic table, it misses the profound idea that some things have periods.

Mendeleev's table, by recognizing the periodic structure of the elements, predicted not only properties of the elements, but the very existence of undiscovered elements. It was a breakthrough.

The periodic table is not some artistic layout of letters, it's science at its very best, one of the great results of the 19th century and the birth of modern chemistry. It doesn't honor science to take, say, typefaces and put them in a funny-looking grid. That just mocks the idea that science can predict the way the world works.

Science is not arbitrary. Making arbitrary cultural artifacts by abusing scientific ideas is not just wrong, it's offensive. It cheapens science.

Another area of abuse is quantum mechanics, and a common victim is Heisenberg's uncertainty principle. Despite what some ill-informed academics would have you believe, Heisenberg's principle is not some general statement about weird shit happening in the world, it is a fantastically precise scientific statement about the limits of simultaneous measurement of two physical properties: position and momentum. It's not a metaphor!

What's really sad is that many of the commonest misuses of the terminology of quantum mechanics come from other areas of science and technology. For instance, there is a term in computer engineering called a Heisenbug, which refers to unpredictable faults, most often bugs that go away when you examine them. It's a cute name but it isn't even a correct reference. The quantum mechanical property of things changing when you observe them is not the Heisenberg uncertainty principle, it's the observer effect. These two ideas are often confused but they are not the same. They're not even closely related.
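
To be clear about which idea the name actually borrows, here's a deliberately contrived sketch (every name in it is invented) of the observer-effect flavor of bug: the act of inspecting the object mutates it, so the symptom vanishes the moment you look.

```python
# A contrived, deterministic "heisenbug": inspecting the object
# (calling repr, as a debugger or a print statement would) has a
# side effect that erases the fault you were trying to see. Real
# cases arise from side effects in logging, lazy caches, and the
# like -- observation changing the system, not uncertainty.

class Counter:
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

    def __repr__(self):
        # The buggy side effect lives in the inspection path:
        # "examining" the counter resets it.
        self.value = 0
        return f"Counter(value={self.value})"

c = Counter()
c.increment()
assert c.value == 1   # the fault: value is nonzero when it shouldn't be
repr(c)               # look at it, as a print or debugger would...
assert c.value == 0   # ...and the symptom is gone
```

Note that nothing here involves a trade-off between two measured quantities, which is what Heisenberg's principle is actually about.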

The observer effect in quantum mechanics describes how the act of measuring a quantum system forces the system to cough up a measurable quantity, which triggers a "wave function collapse". Heisenberg's uncertainty principle says that the minimum product of the errors in simultaneous measurement of a particle's position and momentum is Planck's constant divided by 4π, or as we write it in physics, ℏ/2. (By the way, that's an extremely small value.)
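
Just how small? A quick arithmetic check, using the SI-exact value of Planck's constant and nothing beyond the standard library, confirms that h/4π and ℏ/2 are the same number:

```python
# Sanity check on the bound quoted above: h/(4*pi) equals hbar/2,
# where hbar = h/(2*pi) is the reduced Planck constant.
import math

h = 6.62607015e-34        # Planck's constant in J*s (exact in SI)
hbar = h / (2 * math.pi)  # reduced Planck constant
bound = hbar / 2          # Heisenberg's lower bound on the product

assert math.isclose(bound, h / (4 * math.pi))
print(f"{bound:.3e} J*s")
```

The result is on the order of 10⁻³⁵ joule-seconds, which is why the limit never bothers you when you parallel park.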

Not only are these very different ideas, neither of them has anything to do with computer bugs. The term Heisenbug is trendy but bogus and ignores some strange and beautiful ideas. It's no better informed than the square root of Einstein or the periodic table of the typefaces.

If you're going to use the terms of science to inform your world, please make a point to understand the science too. Your world will be richer for it.