Motivated Reasoning and the psychology of conspiracy theorists
Results of a survey by Australian psychologists...
------------------------
http://www.newyorker.com/online/blogs/elements/2013/04/conspiracy-theory-climate-change-science-psychology.html
What a Conspiracy Theorist Believes
Posted by Gary Marcus
<snip>
...The more people believed in free-market ideology, the less they believed in climate science; the more they accepted science in general, the more they accepted the conclusions of climate science; and the more likely they were to be conspiracy theorists, the less likely they were to believe in climate science.
These results fit in with a longer literature on what has come to be known as motivated reasoning. Other things being equal, people tend to believe what they want to believe, and to disbelieve new information that might challenge them. The classic study for this came in the nineteen-sixties, shortly after the first Surgeon General's report on smoking and lung cancer, which suggested that smoking appeared to cause lung cancer. A careful survey revealed that (surprise!) smokers were less persuaded than nonsmokers were. Nonsmokers believed what the Surgeon General had to say. Smokers heaped on the counterarguments: many smokers live a long time (true, but this ignores the statistical evidence), lots of things are hazardous (a red herring), smoking is better than being a nervous wreck, and so forth, piling red herrings on top of unsupported assumptions. Other research has shown a polarization effect: bring a bunch of climate change doubters into a room together, and they will leave the room even more skeptical than before, more confident and more extreme in their views.
There may be some evolutionary advantage to having minds that reason in this way, bobbing and weaving and often avoiding the truth, but elsewhere, in my book Kluge: The Haphazard Evolution of the Human Mind, I have speculated that it is more bug than feature: a neural glitch of how our memories are retrieved (mainly by finding matches to retrieval queries, which leads to confirmation bias, rather than through more systematic searches that might reveal disconfirming evidence that could potentially challenge one's beliefs). A parallel phenomenon can contaminate our ability to listen to others; we tend to dismiss that which challenges our beliefs, while accepting confirming evidence. Cass Sunstein, of Nudge fame, has an interesting new technical paper on this.
Given that we live in a country in which the theory of evolution, one of the most powerful theories in all of science, is routinely dismissed, and one in which climate-change experts have struggled for years to persuade the public that there is a clear and present danger despite reams of data supporting them, serious investigations into the logic of crowds in real-world situations may represent an important step forward in understanding how to reason with less-than-reasonable masses.