Note: this didn't start out this way, but as things developed, it turned out to be the first in a trilogy of posts on science, climate, energy, economics, technology, and the future. They will all be a bit meatier than the typical Blogmocracy posts, in the mold of Coldwarrior's three-part series on the Balkans.
Wired magazine has a lot of junky stuff in it, and every once in a while, something brilliant. This is one such piece. The title is misleading and somewhat banal; it's about a lot more than just neuroscience. It's about philosophy of science, and even political philosophy. Let me describe how this all applies to the Climategate fiasco. The article starts out with a case study in a failure that led to a later success. Science, technology, and more broadly the history of mankind are full of such stories. But this was a lead-in to an actual study of science labs: which ones are productive, which ones aren't, and why. Here we get to the crux:
Dunbar came away from his in vivo studies with an unsettling insight: Science is a deeply frustrating pursuit. Although the researchers were mostly using established techniques, more than 50 percent of their data was unexpected. (In some labs, the figure exceeded 75 percent.) "The scientists had these elaborate theories about what was supposed to happen," Dunbar says. "But the results kept contradicting their theories. It wasn't uncommon for someone to spend a month on a project and then just discard all their data because the data didn't make sense." Perhaps they hoped to see a specific protein but it wasn't there. Or maybe their DNA sample showed the presence of an aberrant gene. The details always changed, but the story remained the same: The scientists were looking for X, but they found Y.
Well, gosh. That weren't s'posed to happen.
Dunbar was fascinated by these statistics. The scientific process, after all, is supposed to be an orderly pursuit of the truth, full of elegant hypotheses and control variables. (Twentieth-century science philosopher Thomas Kuhn, for instance, defined normal science as the kind of research in which "everything but the most esoteric detail of the result is known in advance.") However, when experiments were observed up close, and Dunbar interviewed the scientists about even the most trifling details, this idealized version of the lab fell apart, replaced by an endless supply of disappointing surprises. There were models that didn't work and data that couldn't be replicated and simple studies riddled with anomalies. "These weren't sloppy people," Dunbar says. "They were working in some of the finest labs in the world. But experiments rarely tell us what we think they're going to tell us. That's the dirty secret of science."
How did the researchers cope with all this unexpected data? How did they deal with so much failure? Dunbar realized that the vast majority of people in the lab followed the same basic strategy. First, they would blame the method. The surprising finding was classified as a mere mistake; perhaps a machine malfunctioned or an enzyme had gone stale. "The scientists were trying to explain away what they didn't understand," Dunbar says. "It's as if they didn't want to believe it."
When you think about this, "hide the decline" seems to fit right in with that not wanting to believe what the data are telling them. What was interesting about the most famous of the CRUtape Letters is that they took this denial (yes, that's a correct use of that word) of the data to the next level, and started working together to sweep the facts under the carpet.
So what is the take-away message so far? 1) Life is full of surprises, and 2) if you go into your experiments looking for some particular result, there's a very good chance that you're going to end up fighting your own experiment. As Feynman said, the easiest person to fool is yourself. But why do people react this way?
The experiment would then be carefully repeated. Sometimes, the weird blip would disappear, in which case the problem was solved. But the weirdness usually remained, an anomaly that wouldn't go away.
This is when things get interesting. According to Dunbar, even after scientists had generated their error multiple times (it was a consistent inconsistency) they might fail to follow it up. "Given the amount of unexpected data in science, it's just not feasible to pursue everything," Dunbar says. "People have to pick and choose what's interesting and what's not, but they often choose badly." And so the result was tossed aside, filed in a quickly forgotten notebook. The scientists had discovered a new fact, but they called it a failure.
The reason we're so resistant to anomalous information (the real reason researchers automatically assume that every unexpected result is a stupid mistake) is rooted in the way the human brain works.
Hmmm. So scientists aren't androids. Who knew?
Over the past few decades, psychologists have dismantled the myth of objectivity. The fact is, we carefully edit our reality, searching for evidence that confirms what we already believe. Although we pretend we're empiricists, our views dictated by nothing but the facts, we're actually blinkered, especially when it comes to information that contradicts our theories. The problem with science, then, isn't that most experiments fail; it's that most failures are ignored.
So much for the myth of the scientist just looking for the facts, and letting the chips fall where they may.
Furthermore, when Dunbar monitored the subjects in an fMRI machine, he found that showing non-physics majors the correct video triggered a particular pattern of brain activation: There was a squirt of blood to the anterior cingulate cortex, a collar of tissue located in the center of the brain. The ACC is typically associated with the perception of errors and contradictions (neuroscientists often refer to it as part of the "Oh sh*t!" circuit), so it makes sense that it would be turned on when we watch a video of something that seems wrong.
This explains a lot, and not just about science and scientists. What he's just shown is that we all develop models of reality in our heads, and then resist evidence that contradicts those models.
The lesson is that not all data is created equal in our mind's eye: When it comes to interpreting our experiments, we see what we want to see and disregard the rest. The physics students, for instance, didn't watch the video and wonder whether Galileo might be wrong. Instead, they put their trust in theory, tuning out whatever it couldn't explain. Belief, in other words, is a kind of blindness.
The lesson here seems clear, but it isn't. The physics students, in this case, were right. But being right doesn't mean that you're not engaging in selective cognition. While their understanding of Galilean gravity got them past the wrong intuition that the untrained people had, this same understanding caused a lot of people to reject Einsteinian gravity. So knowledge is a double-edged sword. It can reinforce your correctness, but it can also reinforce your wrongness. Remember this famous Reagan quote:
It isn't that Liberals are ignorant. It's just that they know so much that isn't so.
Seems like Ron understood something intuitively about this problem.
Now the author goes off on an interesting side trip:
In 1918, sociologist Thorstein Veblen was commissioned by a popular magazine devoted to American Jewry to write an essay on how Jewish intellectual productivity would be changed if Jews were given a homeland. At the time, Zionism was becoming a potent political movement, and the magazine editor assumed that Veblen would make the obvious argument: A Jewish state would lead to an intellectual boom, as Jews would no longer be held back by institutional anti-Semitism. But Veblen, always the provocateur, turned the premise on its head. He argued instead that the scientific achievements of Jews (at the time, Albert Einstein was about to win the Nobel Prize and Sigmund Freud was a best-selling author) were due largely to their marginal status. In other words, persecution wasn't holding the Jewish community back; it was pushing it forward.
The reason, according to Veblen, was that Jews were perpetual outsiders, which filled them with a "skeptical animus." Because they had no vested interest in "the alien lines of gentile inquiry," they were able to question everything, even the most cherished of assumptions. Just look at Einstein, who did much of his most radical work as a lowly patent clerk in Bern, Switzerland. According to Veblen's logic, if Einstein had gotten tenure at an elite German university, he would have become just another physics professor with a vested interest in the space-time status quo. He would never have noticed the anomalies that led him to develop the theory of relativity.
Predictably, Veblen's essay was controversial, and not just because he was a Lutheran from Wisconsin. The magazine editor evidently was not pleased; Veblen could be seen as an apologist for anti-Semitism. But his larger point is crucial: There are advantages to thinking on the margin. When we look at a problem from the outside, we're more likely to notice what doesn't work. Instead of suppressing the unexpected, shunting it aside with our "Oh sh*t!" circuit and Delete key, we can take the mistake seriously. A new theory emerges from the ashes of our surprise.
Based on the resounding success of Israeli science and technology, this seems to be disproven. But wait. There WE go, jumping to a specious conclusion. Since there are approximately the same number of Jews in the US and in Israel, a fair comparison would be the scientific achievements of American vs. Israeli Jews. I don't have that information at my fingertips, but I think that considerably more American than Israeli Jews have received Nobel Prizes over the past 60 years, which would confirm the hypothesis. Anyway, this is a whole interesting discussion unto itself, but for now, we'll have to say the jury is out. But on to the money quote vis-a-vis climate science:
Modern science is populated by expert insiders, schooled in narrow disciplines. Researchers have all studied the same thick textbooks, which make the world of fact seem settled. This led Kuhn, the philosopher of science, to argue that the only scientists capable of acknowledging the anomalies (and thus shifting paradigms and starting revolutions) are either very young or very new to the field. In other words, they are classic outsiders, naive and untenured. They aren't inhibited from noticing the failures that point toward new possibilities.
Is that what is "settled," Mr. Gore? The naive and untenured should be prevented from publishing, right, Mr. Mann?
But not every lab meeting was equally effective. Dunbar tells the story of two labs that both ran into the same experimental problem: The proteins they were trying to measure were sticking to a filter, making it impossible to analyze the data. "One of the labs was full of people from different backgrounds," Dunbar says. "They had biochemists and molecular biologists and geneticists and students in medical school." The other lab, in contrast, was made up of E. coli experts. "They knew more about E. coli than anyone else, but that was what they knew," he says. Dunbar watched how each of these labs dealt with their protein problem. The E. coli group took a brute-force approach, spending several weeks methodically testing various fixes. "It was extremely inefficient," Dunbar says. "They eventually solved it, but they wasted a lot of valuable time."
The diverse lab, in contrast, mulled the problem at a group meeting. None of the scientists were protein experts, so they began a wide-ranging discussion of possible solutions. At first, the conversation seemed rather useless. But then, as the chemists traded ideas with the biologists and the biologists bounced ideas off the med students, potential answers began to emerge. "After another 10 minutes of talking, the protein problem was solved," Dunbar says. "They made it look easy."
Which brings up another major climate issue: multidisciplinarity. When McIntyre and McKitrick trashed Mann's hockey stick, it was a case of a couple of statisticians elbowing their way into the climatological lair. Mann et al. basically told them that they knew everything they needed to know about statistics, and wouldn't be needing any assistance. But which approach discovers the truth faster? Hands down, the multidisciplinary team.
I think you can see how the climate train wreck is a perfect illustration of all of this, but in a more general sense, it explains cognitive dissonance, and why there's so much "la la la, I can't hear you" when Teh Won is criticized. It's because a similar setup is operating: they all live in their hermetic cloisters, and hear nothing but echoes of how wonderful Dear Leader is, and when counterevidence surfaces, the "Oh sh*t!" circuit wipes it out just like antivirus software.
I think it's obvious how this sets up self-reinforcing social networks.
Read the following and let me know how you think it will end for Obama and the Dems. This is some scary stuff.
http://www.americanthinker.com/2010/01/2010_will_be_worse.html