gall and gumption

Thursday, January 11, 2007

Perils of Peer Review

Here’s one scientist suggesting that peer review might not be all it's cracked up to be.

Historically, it seems that gentlemen-scientists with an elevated social position and independent means, fared the best in science. In the 16th century, at the dawn of the modern era, one such individual, the son of a middle-class family, holder of the respectable office of Canon of the Frombork Cathedral, published a book on the revolution of celestial bodies that revolutionized understanding of our place in the universe. The man was Nicholas Copernicus. His accomplishment was simple: he moved the Earth from the center of the world onto an orbit around the Sun. Being supported by the endowments attached to his post and having received a thorough education at Polish and Italian universities, he had the time and knowledge to conduct astronomical observations and to write a lengthy manuscript in relative peace. He realized the revolutionary nature of his work and dreaded the opinion of many learned colleagues as he clearly spoke against the then accepted views. And yet the book was published. It would not be possible today to publish a book full of minor errors and simplifications and arguing something so ludicrous as a complete reversal of a consensus of well-established authorities. Would it also be possible for a modern scientist to obtain a grant for a proposal based on the supposition that the major publications on the subject are wrong and that he will prove it by conducting observations from the roof of his residence? Absolutely not. Peer review protects us against such lunacies.[emphasis mine -- ed]


Hmm. Well. With enough money, a good education and time, people can think and produce interesting, original work. I notice that. I'm not sure how our culture worked around to the idea that that life path ought to be for an educated and privileged few while other people can just pick things up off the floor at Wal-Mart all day and hope thereby to eventually buy a Playstation. But there we are.

Pardon that little socialistic digression. I have a lot of thoughts about this passage and the little socialistic digression is one of them.

I got this email from Norman, a regular reader, with that quote in it. It's something he's been thinking about.

When Norman first wrote me about this subject, one thought I had, and wrote back to him, was how Marvin Mudrick encouraged the free use of generalizations. One of the weird things that happens in my life is the way people will be shocked or made nervous by something I say that's really just an observation thrown out to be kicked around and banged into shape via conversation. Vigorous conversation.

Well, first of all I come from the Caribbean, Jamaica especially, but I would say that most of the English-speaking Caribbean conducts its life in a constant state of imminent shoutingness. Except possibly Antigua, where they have perfected the art of tautology. They sound like Jamaicans, but, as my mother says, "What's missing is the feeling that anything could happen at any minute." In Jamaica, especially, this feeling of imminence can be quite breathtaking: maybe because what could happen could involve the firing of guns. Caribbean people get into shouting arguments, they call each other (sometimes quite filthy) names, in some parts of the Caribbean a person who gets angry will just stand on the street corner and bitch for like 45 minutes. It is a very rhetorical culture, where the words you say are not you, exactly, and you aren’t using them to create yourself, you’re using them in swordplay.

They aren't the sincere effusions of your trembling soul, you know, unless you are trying to get laid. Talk is recognized to be play, in which insight can sometimes come breaking through. One of the things I love about the Caribbean is how people will kick a subject around.

When I met Marvin Mudrick, about 8 months after I moved to California to go to college, I recognized in him a similar use of talk, but more focused. He used talk to get somewhere. That is, if he threw out a generalization about, say, the 18th century, you could check it against your observations in the 18th-century literature that you were reading at the time. A generalization was a place from which you launched out into a subject, with the assumption that your observations -- or the observations of other readers in conversation with you -- would correct the generalization. That was what it was out there for, to be corrected, to be filled with content, or to be discarded as not sufficiently descriptive or for any number of reasons that resulted from investigating the questions it raised.

He would tell us to make generalizations because they helped us to think; they weren’t statements of dogma, they were really just guesses, conjectures, rough statements. The thing was not to be too attached to any of them, no matter what their source, until we really knew that we knew the subject enough to make an informed judgment.

Popper, cited in the article linked to above, said that it was by making and discarding generalizations (conjectures) that scientific knowledge actually advanced. See Conjectures and Refutations: The Growth of Scientific Knowledge.

Einstein's theory of relativity is something to understand because, for one thing, it's really hard to see how it could have come out of any inductive process of gathering evidence and only inferring the theory from accumulated observations. It's this tremendous work of imagination, the way Così fan tutte is a tremendous work of imagination. When Einstein first suggested that light waves were bent by gravity there was no way to test whether this was true -- the technology did not yet exist. If it was false it wasn't going to be half false (or half true): it would be totally demolished. (One of the best accounts of all this is Richard Feynman's The Character of Physical Law.)

There is no knowing without vigorous imagining. Anything else is just rote memorization, it's not integrated with the rest of your experience, it's bloodless, it can't create anything. And if you want people to imagine vigorously they have to be able to imagine freely. Which means having the freedom to speak unimpaired by the thousand gazillion little status-related anxieties and obsessions that rule conversation in this yer land of the free. It is very simple; to the extent that your personal status defines what you can say in a particular field, that field is corrupted – now we are humans in an imperfect world so status anxiety will sneak up on us individually, and we hope to become wiser with time and honest effort – but when it is totally integrated into an institution’s thinking, radioactive poo has leached into the system.

It takes vigorous imagination to take hold of the facts. We screw this up as humans all the time. And so we have a kind of rudimentary provision against totally screwing up hard-wired into us. Suppose I'm looking for my keys, trying to leave the house. I look in my jacket. Nope, not there. I've established that, at least, I've eliminated one bad theory. So I go look on the bathroom counter, maybe it's there. If I kept looking in the same jacket pocket because I had seen them there last week -- how stupid would that be? Well, we would never have evolved. We'd all still be in caveman days, stretched out dead in various stages of decomposition next to the bush with the poisonous berries.

Another interesting thing about being human is that we seem to have some choice about whether we will use this handy provision or not. I’m guessing that the choice is adaptive; the most adaptive thing about us is our adaptability, after all.

So we find that we have another out, which is making our theories and conjectures what Popper calls immune to criticism. Well, it looks like an out. You can't do this in anything where the outcomes matter to anybody. You can't do it in medicine for very long, you can't build bridges. I said that you can't do it in medicine, but in fact medicine used to do it. The Duc de Saint-Simon records how one after another of the heirs of Louis XIV died at the hands of doctors, who treated them for every ailment by bleeding them with dirty lancets, and then, wow, they died. Who would have guessed? They killed off almost all of them, and it was one duchess who hid Louis XV, the grandson, from the doctors or they would have done away with him too. But while this was going on nobody questioned it. The doctors would have been deeply offended if anyone had. They had this theory that the blood was impure, and what could this woman know about it? Every observation of a patient's symptoms would refer back to this theory of the blood, and an explanation could always somehow be found that would be further "proof" of the truth of this business about the impurity of the blood. That the doctors themselves were introducing fatal impurities into the blood was a mere empirically based conjecture, not like those profound and subtle theoretical things that lay people cannot possibly understand, and it was a sort of impertinence to suggest otherwise. Which reinforces the theory in spite of the evidence. So the theory was always right; there could be no evidence outside of the theory. Except, Popper says (and this is my example, not his), the infallibility of this "theory" is the surest proof that it is not scientific.

Somehow, though, the critical attitude can get turned, not inward to look at one’s own practices and assumptions, but outward, to ward off interlopers. (Not hard to see why, it’s so comfortable, especially when you are a mediocrity.) That’s the situation Henneberg is describing. Now, you do have to ward off interlopers, but there is simply no safer way to do it than on the merits of the content. If you do it on any other basis it is an invitation to the totally venal careerists, to the out-and-out loonies, the gormless toadies, to come in and loot the place. Because if your content is not positively asserted, you are simply sitting on an asset, and someone sooner or later will demand to know why you get to have this and not give them a piece. The whole endeavor becomes a battle over spoils and privileges and soon there is no other subject, and the discourse gets more and more degraded.

If you want to see what that state of affairs looks like, look in English departments, because for one thing almost nobody cares what academic literary theorists and critics do, so they’ve been able to immunize their activities against all criticism by a number of strategies, none of them new or original. In science at least the content is still there. Sooner or later any conversation about issues in literary study becomes a conversation about status. If you criticize a piece of New Historicist criticism, for instance, you can only criticize it as being good or bad of its kind. If you say that you think New Historicism isn't very interesting or useful, well, you're one of the Unwashed -- unless you are a member of one of the other constituencies that have all agreed to sort of mutually and uneasily tolerate one another. And if you are a total non-subscriber to any of the parties to this truce, if you say that there is something about all of these theories that is intellectually unsatisfying, well, you're a freak. Not a serious person. Because you don't have an "approach." And if you haven't adopted one, you aren't behaving. Well, you can see how self-reinforcing that is, it's a closed loop. And it is based on nothing. Absolutely nothing. The work of literary scholars and theorists is no more scientific than The Dunciad. "Well, it's not claiming to be scientific," someone protests. Well, then, like almost everything else that is written, it's poetry. (It’s poetry in the sense meant by Sir Philip Sidney, that is, it is “fictive,” yes, even when it’s nonfiction; “fictive” doesn’t necessarily mean “untrue” but that's a discussion for another day.) Which is fine, but then why does it have to be such bad poetry?

1 Comment:

At 9:03 AM, Anonymous Norman said...

In 1847 Ignaz Semmelweis found that requiring everyone at the Vienna Maternity Hospital to wash their hands in disinfectant before examinations drastically reduced the death rate from childbirth fever. With a few exceptions, the medical community of the time rejected his findings. Over the next 20 or 30 years, until antiseptic methods were introduced generally, thousands of avoidable deaths occurred. Medicine had improved since the days of the Duc de Saint-Simon, but one thing was the same: the confidence of the experts in the completeness of their knowledge.

