Thursday, April 20, 2006

RIP, Scott Crossfield

Some sad news: Scott Crossfield died earlier today when his Cessna went down in Georgia.

I must confess to conflicting emotions here. On the one hand, it's sad to see him go. He was one of the greats. First to Mach 2, first to fly the X-15, a true pioneer of aviation. And he stayed active well into his retirement. But ... you have to know that he'd have hated living long enough to have to hang up his scarf. To a man like that, flying is like breathing.

Vaya con Dios, Mr. Crossfield. It's all VFR from here on out, and not a medical examiner in sight. And if that isn't Heaven, well, what is?

Saturday, April 01, 2006

Seven Warning Signs

While reading Jerry Pournelle's site, I found a rather interesting article titled "The Seven Warning Signs of Bogus Science".

Mind you, I'm not really a scientist. I'm an engineer by education, and a code monkey by vocation. Ten years of grad school convinced me that I really only know three things for sure: Conservation of Mass, Conservation of Momentum, and Conservation of Energy. Everything else is speculation. But ... there's a heck of a lot of utterly bogus "science" going on out there. Someone has got to flange up some sort of BS detector. I tried, once, to do something like that. It ended up being one of those projects that never quite went anywhere.

What I was getting at was this: no one ever really teaches the scientific method anymore. Oh sure, they mention it in passing at some point in most science classes. But they never go much deeper than that. Science is almost always taught as a body of stuff to be memorized, not as a way of thinking.

I have some practical knowledge of why this is the case. It's freaking HARD to get the little buggers to do ANY independent thinking. And a proper treatment of the scientific method is a pretty rigorous exercise in independent thinking.

The problem is, people confuse the assertion of "new" facts with actual science. How are they to tell the genuine article from total baloney?

The Seven Warning Signs look to be a great place to start.

I'd add two broad principles to the list, though. They're important enough to deserve that treatment, and they're key to the scientific method: Falsification and Repeatability.

Falsification: We don't know the strength of a steel bar because we've got some magical method for knowing the strength of a crystal lattice. We know the strength of a steel bar because we've bent the damn thing until it broke. Bent it U-shaped, stretched it until it snapped in two, twisted it until it gave way. Do that a few thousand times, and you will know, without having to ask, whether a given stainless steel alloy will do the job you're asking of it. It's the same with a scientific theory. The accumulated tests that agree with it don't count. It's the tests built specifically to prove the theory false that make or break it. If you can't design such an experiment, there's something wrong with the theory. And usually, what's wrong with such a theory is that it can't predict anything worth knowing.

Repeatability: You have to be able to describe the experiment completely enough that another experimenter can do what you did, and see the same things you did. If you can't, you probably don't understand what's going on as well as you think you do. If you won't, you've probably got something to hide. For us third-party observers, repeatability is a key thing to watch for. If other researchers are having a hard time reproducing "remarkable" new results ... there might not be anything to report, after all. It's when other people in the field report that they can repeat the experiment successfully that you know something interesting might be going on.

Armed with this information and a healthy dose of skepticism, you'll be better equipped to separate the real deal from the baloney. Now, if we could just get this out to the general public ... But that may be too much to ask.