Cargo
Can't remember how I found my way to this Scientific American article on self-deceit, a subject I find tirelessly fascinating. I read it and couldn't help thinking how novels tell the same story better. I recommend Michael Kohlhaas. And Anna Karenina. And John McGahern's autobiography, already mentioned here. It does seem to me that the pace of discovery in psychology is awfully slow compared to the pace of discovery in literature. I mean, psychology keeps discovering things that literature has known since the Greek tragedies.
I mean, I'm sure this book under review is very nice and useful, but there is an air of being the last to arrive on the scene.
By coincidence I've been writing about it (again, yes!). I'm just going to post it because otherwise I will keep fiddling with it.
I've been thinking about it because of the endless fascination aforementioned, and also because I recently re-read this famous talk by Richard Feynman, the “Cargo Cult Science” speech. I read it years ago and it seems even better now than it did then.
I think the educational and psychological studies I mentioned are examples of what I would like to call cargo cult science. In the South Seas there is a cargo cult of people. During the war they saw airplanes land with lots of good materials, and they want the same thing to happen now. So they've arranged to make things like runways, to put fires along the sides of the runways, to make a wooden hut for a man to sit in, with two wooden pieces on his head like headphones and bars of bamboo sticking out like antennas -- he's the controller -- and they wait for the airplanes to land. They're doing everything right. The form is perfect. It looks exactly the way it looked before. But it doesn't work. No airplanes land. So I call these things cargo cult science, because they follow all the apparent precepts and forms of scientific investigation, but they're missing something essential, because the planes don't land.
Now it behooves me, of course, to tell you what they're missing. But it would be just about as difficult to explain to the South Sea islanders how they have to arrange things so that they get some wealth in their system. It is not something simple like telling them how to improve the shapes of the earphones. But there is one feature I notice that is generally missing in cargo cult science. That is the idea that we all hope you have learned in studying science in school -- we never say explicitly what this is, but just hope that you catch on by all the examples of scientific investigation. It is interesting, therefore, to bring it out now and speak of it explicitly. It's a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty -- a kind of leaning over backwards. For example, if you're doing an experiment, you should report everything that you think might make it invalid -- not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you've eliminated by some other experiment, and how they worked -- to make sure the other fellow can tell they have been eliminated.
Details that could throw doubt on your interpretation must be given, if you know them. You must do the best you can -- if you know anything at all wrong, or possibly wrong -- to explain it. If you make a theory, for example, and advertise it, or put it out, then you must also put down all the facts that disagree with it, as well as those that agree with it. There is also a more subtle problem. When you have put a lot of ideas together to make an elaborate theory, you want to make sure, when explaining what it fits, that those things it fits are not just the things that gave you the idea for the theory; but that the finished theory makes something else come out right, in addition.
In summary, the idea is to give all of the information to help others to judge the value of your contribution; not just the information that leads to judgment in one particular direction or another. [emphasis added – kp]
It’s worth paying close attention to how Feynman uses language here. He hesitates for a minute, looking for the words to describe this quality that these young scientists must have. You can see him trying out different ways of saying it; he tries “a kind of scientific integrity,” he tries “utter honesty,” he tries “leaning over backwards,” and at last puts it as a simple and practical imperative: “You should report everything that you think might make it invalid.” Then the words come easily, fluent, specific. He wasn’t terribly interested in things like this in the abstract, and what he has to do here is sum up a pervasive daily practice in small things, in an infinite variety of situations, in just a few words.
In those few words you get a rare sighting of the roots of science in Enlightenment humanism. Scientific integrity, or intellectual honesty, is not a scientific quantity; it’s a humanistic value. He’s speaking the language of Montaigne, of Locke, of Hume, of Voltaire. Not even in its succession of “revolutions” – Kepler, Newton, Darwin, Einstein – has science strayed from those roots. The science changed, but the underlying humanistic ethic did not. And here’s Feynman, who was not terribly interested in humanism or the philosophy of science, finding it indispensable; it’s what distinguishes science from scientism.
University administrations have all sorts of policies prohibiting what they call “intellectual dishonesty,” by which they mean cheating on exams and plagiarism. But that sort of thing is not what Feynman is talking about here; he’s really talking about self-deceit, the practice of hiding the truth from oneself.
When Feynman goes on to discuss the importance of scientific integrity, he frames that discussion in terms of consequences. He talks about the trust that has to exist between the scientist and the nonscientist.
For Feynman, the technocrat’s superior knowledge entailed a responsibility to speak the truth. Feynman helped to build the atom bomb, so he had as much of the technocrat’s power as anyone. Why shouldn’t you help governments to deceive themselves? You can. Why shouldn’t you mislead the public? You can. Why shouldn’t you deceive yourself or your colleagues? You can. The consequences are harmful: to your ability to learn, to the public’s understanding of your subject, and to the trust that members of a society need to be able to have in one another in this shared undertaking of learning. This is an ethics of consequences, and it is surprisingly consistent and unambiguous. It doesn’t ask, “What would you do?” It tells you what you must do, and points to consequences. Not “You will be punished” consequences, but “It will not work and the system will break down” consequences. Everything else was academic. Feynman, an academic superduperstar, was uninterested in academic questions.
Feynman loved his subject. There’s a moment in What Do You Care What Other People Think? when he describes himself, at last, standing on the very edge of the known universe, the outer boundary of the knowledge of physics. His response was to write a poem. He was moved. And one of the consequences of intellectual dishonesty is the loss of that pleasure, that delight and wonder, because you cut yourself off from it when you choose to live inside your own self-serving fictions.
He gave his “cargo cult science” speech in 1974. In 1986 he joined the commission investigating the Challenger explosion. Sure, it helped that Feynman was a mechanical genius, but it also helped that he had the very intellectual integrity he had been urging on those students years before. Understanding came out of tragedy, because Feynman was a person who could be trusted.
When, at the conclusion of the Caltech speech, he says it’s dangerous “to teach students only how to get certain results, rather than how to do an experiment with scientific integrity,” the word “dangerous” has some content now. Without integrity, the whole thing collapses, and the consequences extend far into the world.
Another example is the ESP experiments of Mr. Rhine, and other people. As various people have made criticisms -- and they themselves have made criticisms of their own experiments -- they improve the techniques so that the effects are smaller, and smaller, and smaller until they gradually disappear. All the parapsychologists are looking for some experiment that can be repeated -- that you can do again and get the same effect -- statistically, even. They run a million rats -- no, it's people this time -- they do a lot of things and get a certain statistical effect. Next time they try it they don't get it any more. And now you find a man saying that it is an irrelevant demand to expect a repeatable experiment. This is science?
This man also speaks about a new institution, in a talk in which he was resigning as Director of the Institute of Parapsychology. And, in telling people what to do next, he says that one of the things they have to do is be sure to only train students who have shown their ability to get PSI results to an acceptable extent -- not to waste their time on those ambitious and interested students who get only chance results. It is very dangerous to have such a policy in teaching -- to teach students only how to get certain results, rather than how to do an experiment with scientific integrity. [emphasis added – kp]
What is the danger? Short-term, spacecraft fall out of the sky; bridges collapse; levees don’t get fortified; medications cause horrible undisclosed side effects. And that’s quite bad enough, certainly. But there is another, long-term danger that is much broader in its consequences for everyone: the loss of the tradition, the habitual practice, of intellectual honesty, and the resulting distortion of what it means to know something. The long-term danger is the entrenchment of self-deceit and the corruption of judgment, attended by reduced expectations of results and reduced expectations of ethical standards. Which ensures that more catastrophic mistakes will be made. Feynman warned us that we couldn’t take this tradition for granted. It wasn’t enough to agree with it or to think it was a fine thing; you had to practice it. The only meaningful agreement with a principle is to practice it. Your mere cost-free approval, approval in theory, means nothing at all.
I think the reason Feynman bothers with the ESP experiments is to illustrate how commonplace this type of self-deceit is, and how easily it achieves respectability. Intellectual honesty requires truthfulness in the use of language. There is a form of intellectual dishonesty that exploits the ambiguity of words to put itself in the right. So a certain fixity of agreement as to what words mean, and how they mean, is needed. You can’t resolve an issue by suddenly claiming that a word means something different from what it has been generally understood to mean within the discussion. That is, unless you’ve really discovered something new, in which case you’d still have to explain it and get the new meaning accepted. The language in which science is conducted, consequently, is scrupulously referential. The ESP Institute experiments “succeeded” as far as they did because Mr. Rhine discarded more and more data that didn’t fit; he was, in effect, changing the meaning of words after the fact. There is no surer sign of failure; the person who uses this sad trick has first deceived himself.