WARNING: The following text is written by a layman and as such is likely to contain errors. Said errors will hopefully be corrected later on.
Observation: Members of the general public are often wrong about scientific issues (e.g. global warming, unwarranted trust in alternative medicine, fears regarding vaccination, creationism, and so on). Worse, when you try to correct people's mistaken beliefs, they often end up believing them even more.
An example from personal testimony: I discussed the irrational beliefs of my aunt and her family (aunt, uncle, male cousin 1, male cousin 2, female cousin) with my male cousin 1 (from now on referred to simply as "cousin"). One of their beliefs is that Erich von Däniken is (at least partially) right about UFOs and extraterrestrials. My cousin staunchly believes not only that aliens are possible (as do I) or that they may have landed in the past (as I do not), but that they are on Earth as we speak and that the government is actively trying to hush up their existence. Part of his "evidence" comes from von Däniken.
So one dreary afternoon, I tried to convince him that von Däniken was not only wrong but a liar who even admitted to fabricating evidence. I showed him the fabrications, I showed him articles from notable scientists who disagreed with von Däniken and exposed him, and I even showed him quotes of von Däniken admitting to the fabrications.
You have one try to get this right: Did he change his mind, yes or no?
No, of course not. Evidence doesn't get in the way of one's beliefs; we know that from creationists. In fact, he defended von Däniken all the more after I showed him the quotes.
Problem: Time to bring in a quote that's often misattributed to Stephen Hawking but actually comes from Daniel J. Boorstin: "The history of Western science confirms the aphorism that the great menace to progress is not ignorance but the illusion of knowledge."
What we have is a classic example of a whole host of cognitive biases preventing an individual from reaching logical and scientifically sound conclusions.
Aim: Based on the above, I'd like to do the following: using research from psychology, but also blog posts, come up with a number of suggestions that can help people argue more effectively and counteract these cognitive biases.
This is a work in progress, any corrections and suggestions are of course welcome.
As has already been pointed out on multiple occasions, there is a truly awesome series on YouTube called The Psychology of Belief. It is suggested you acquaint yourself with it before reading on.
The following is largely based on Lewandowsky et al. (2012). The gist of the whole thing can be found either here or, obviously, below.
By the way, Lewandowsky has some AMAZING stories online for public viewing. LINK
In the study, Lewandowsky et al. take us back through well-known truth-denials such as the controversy over Obama's citizenship and the MMR vaccine scare. A very interesting one is the Listerine false-claims retraction campaign. Basically, Listerine (a mouthwash) was claimed to cure colds, which it obviously couldn't. The FTC therefore ordered the company to stop spreading BS and retract the claim in a corrective ad campaign worth $10.2 million. They did just that, yet a few years later, 57% of consumers still cited curing colds as a key factor in buying Listerine (compare: only 15% of consumers of a competing product).
Basically, getting rid of misinformation is really, really difficult.
An important notice:
Reliance on misinformation differs from ignorance, which we define as the absence of relevant knowledge. Ignorance, too, can have obvious detrimental effects on decision making, but, perhaps surprisingly, those effects may be less severe than those arising from reliance on misinformation. Ignorance may be a lesser evil because in the self-acknowledged absence of knowledge, people often turn to simple heuristics when making decisions. Those heuristics, in turn, can work surprisingly well, at least under favorable conditions. For example, mere familiarity with an object often permits people to make accurate guesses about it (Goldstein & Gigerenzer, 2002; Newell & Fernandez, 2006). Moreover, people typically have relatively low levels of confidence in decisions made solely on the basis of such heuristics (De Neys, Cromheeke, & Osman, 2011; Glöckner & Bröder, 2011). In other words, ignorance rarely leads to strong support for a cause, in contrast to false beliefs based on misinformation, which are often held strongly and with (perhaps infectious) conviction. For example, those who most vigorously reject the scientific evidence for climate change are also those who believe they are best informed about the subject (Leiserowitz, Maibach, Roser-Renouf, & Hmielowski, 2011).
I haven't put that research in the references below, but if you want, I can tell you exactly which papers they cite.
Now I think that paragraph on its own, although many of us may already have suspected or known as much, is very powerful. It tells us that creationists are probably also operating under the Dunning-Kruger effect and reject science all the more vigorously because of it. It would be easier to teach someone from scratch than to have the person drop his/her beliefs and THEN learn. I think it was AronRa who once said something along the lines of "I wouldn't be able to teach you, because I'd first have to drive all the nonsense out of you."
Another interesting, though not necessarily novel, tidbit is that emotionally evocative news or rumours are more likely to spread than neutral ones. This is also what "The Herb Garden Germination" (The Big Bang Theory) is about. What's more interesting, though, is that people will actually extract "knowledge" from sources they know to be fiction, and that even warnings aren't effective.
Again, a quote:
Marsh and Fazio (2006) reported that prior warnings were ineffective in reducing the acquisition of misinformation from fiction, and that acquisition was only reduced (not eliminated) under conditions of active on-line monitoring, when participants were instructed to actively monitor the contents of what they were reading and to press a key every time they encountered a piece of misinformation (see also Eslick, Fazio, & Marsh, 2011). Few people would be so alert and mindful when reading fiction for enjoyment.
So next time you read Harry Potter or watch Star Trek, keep a buzzer at the ready!
And if you want to laugh at Fox News watchers, do so now:
Steven Kull and his colleagues (e.g., Kull et al., 2003) have repeatedly shown that the level of belief in misinformation among segments of the public varies dramatically according to preferred news outlets, running along a continuum from Fox News (whose viewers are the most misinformed on most issues) to National Public Radio (whose listeners are the least misinformed overall).
There are, of course, a myriad of things a deceitful person can do to hide the bollocks they're peddling: print it in high colour contrast, present it in rhyme, deliver it in a familiar accent, and print it in an easy-to-read font. (Comic Sans?!? Maybe that's why it's so popular!) Interestingly, the creationist tactic of "I'm a Doctor, Ph.D." seems to be sound: expert testimony is more persuasive, and well-known names (Exxon, BP, etc.) also lend a story credibility, even when the company has a vested interest (e.g. an Exxon study claiming climate change is wrong).
What was new to me was that even a single person repeating a rumour multiple times can increase the chance of others believing it.
"In a very real sense, a single repetitive voice can sound like a chorus."
An interesting add-on to that:
The extent of pluralistic ignorance (or of the false-consensus effect) can be quite striking: In Australia, people with particularly negative attitudes toward Aboriginal Australians or asylum seekers have been found to overestimate public support for their attitudes by 67% and 80%, respectively (Pedersen, Griffiths, & Watt, 2008). Specifically, although only 1.8% of people in a sample of Australians were found to hold strongly negative attitudes toward Aboriginals, those few individuals thought that 69% of all Australians (and 79% of their friends) shared their fringe beliefs. This represents an extreme case of the false-consensus effect.
Very striking. (Note the arithmetic: the "67%" is presumably the gap in percentage points, i.e. 69% perceived support minus 1.8% actual support.) This might explain why creationists often suggest that "many scientists" are abandoning evolution.
Right, so on to "correcting misinformation". Multiple studies used a neutral event (a random building burns down, with black smoke attributed to improperly stored oil paints and gas cylinders), followed by a retraction in one group (there were no oil paints or gas cylinders) and no retraction in the control group. Depending on the study, the retraction at best halved people's reliance on the misinformation, and at worst it had no effect at all.
I think that claim is properly outrageous, but Lewandowsky et al. cite sixteen (16) different studies that found exactly that, so it evidently is true. Here's another quote, just to show how crazy our minds are:
More recent studies (Seifert, 2002) have examined whether clarifying the correction (minimizing misunderstanding) might reduce the continued influence effect. In these studies, the correction was thus strengthened to include the phrase "paint and gas were never on the premises." Results showed that this enhanced negation of the presence of flammable materials backfired, making people even more likely to rely on the misinformation in their responses.
On to the mental models. The first can be summed up as "people would rather have some explanation, no matter how flawed, than none at all". That would also explain why people stick to misinformation even after it has been corrected: a bare retraction leaves a gap in their model of the event, and a complete-but-wrong story apparently beats an incomplete one. A second model could be described as "the misinformation (Info) and its correction (Info + NOT tag) are both stored, and at retrieval you may lose the NOT tag". This is supported by findings that positive corrections (Jim is tidy) work better than negative corrections (Jim is not messy). A third model suggests that people may not like being told what to think by an authority figure, which I think conflicts with what was said above about the "Doctor this" and "Ph.D. that".
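To make that second model a bit more concrete, here is a minimal toy simulation in Python. It is entirely my own illustration with made-up decay numbers, not anything from Lewandowsky et al.: the claim and its NOT tag are stored together, but the tag is assumed to fade faster, so later retrievals sometimes surface the claim without its negation.

```python
import random

TAG_DECAY_PER_DAY = 0.15   # assumed: how fast the NOT tag fades (made-up number)

def negation_recalled(days_elapsed: float) -> bool:
    """True if the NOT tag is retrieved along with the claim."""
    tag_strength = max(0.0, 1.0 - TAG_DECAY_PER_DAY * days_elapsed)
    return random.random() < tag_strength

# Simulate many retrievals of "there were oil paints ... NOT" over time.
TRIALS = 10_000
for days in (0, 2, 5):
    hits = sum(negation_recalled(days) for _ in range(TRIALS))
    print(f"day {days}: NOT tag recalled on {hits / TRIALS:.0%} of retrievals")
```

Note how a positive correction (Jim is tidy) sidesteps this failure mode entirely: there is no NOT tag to lose in the first place.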
Right, now on to what may actually help people be corrected:
To date, only three factors have been identified that can increase the effectiveness of retractions: (a) warnings at the time of the initial exposure to misinformation, (b) repetition of the retraction, and (c) corrections that tell an alternative story that fills the coherence gap otherwise left by the retraction.
Note that this is all about retractions, something you'll have a hard time finding in creationist circles. Even so, it might explain why many creationists still cling to arguments that even AiG deems wrong.
I'd say fat chance you'll see creationists preceding their information with warning labels à la "attention, this might turn out to be wrong". Interestingly, both (b) and (c) are already practised on the evolution side of the "argument", but then again, creationists routinely get... "vaccinated" against that, so it shouldn't be a great surprise. Also, it should be noted that (b) may prove ineffective due to the "methinks the lady doth protest too much" effect, where the intended correction backfires and strengthens the misinformation instead.
BTW, another short quote I loved: "Warnings may induce a temporary state of skepticism, which may maximize people's ability to discriminate between true and false information. Later in this article, we return to the issue of skepticism and show how it can facilitate the detection of misinformation."
Another problem with explaining evolution and, especially, correcting creationists is that people prefer easy-to-digest information to complex information. I'll get to that later on.
Another thing we already suspected, but that we can now count as "true", is the following: (emphasis mine)
This interaction between belief and credibility judgments can lead to an epistemic circularity, whereby no opposing information is ever judged sufficiently credible to overturn dearly held prior knowledge. For example, Munro (2010) has shown that exposure to belief-threatening scientific evidence can lead people to discount the scientific method itself: People would rather believe that an issue cannot be resolved scientifically, thus discounting the evidence, than accept scientific evidence in opposition to their beliefs. Indeed, even high levels of education do not protect against the worldview-based rejection of information; for example, Hamilton (2011) showed that a higher level of education made Democrats more likely to view global warming as a threat, whereas the reverse was true for Republicans.
That's exactly what many people have been accusing creationists of for a very long time now.
Here's a graphical summary of the findings:
IMPORTANT NOTICE
Stephan Lewandowsky and John Cook have condensed the whole thing into their Debunking Handbook. <-- Free PDF version!
In any case, here finally is the list of things one can and should do:
1) Identify the gaps left by debunking misinformation and fill them with new, correct information. This can be especially difficult with creationists, because tearing into their myths leaves a vast hole not only in their factual knowledge but also in their emotional "knowledge" (e.g. "So I can behave like a monkey" and "I can now go and kill people").
2) Use repeated retractions, but be aware of the "overkill" or "protest too much" effect.
3) Emphasize the FACT you wish to highlight; avoid mentioning the MYTH.
4) Warn people that you are about to mention a myth.
5) Use simple and brief arguments. Use simple language and sentence structure. Use clear graphs where appropriate. Use few arguments.
6) As mentioned in 1), identify your audience's worldview. People not fixed in their views will be more receptive.
7) If your argument is worldview-threatening, try focusing on opportunities and potential benefits rather than risks and threats.
8) Make positive corrections (Jim is tidy) instead of negative ones (Jim is not messy). I have to admit, I can't think of a way to apply this to the evolution/creation debate.
9) In line with suggestion 5): after giving at most three arguments for why the myth may be (or is) false, give the opponent (or person to be corrected) some sources and let him/her work out a fourth counter-argument to his/her own position him/herself.
10) Although I'd refrain from this for personal/moral reasons: attack the rationale behind the misinformers' argument.
11) Use metaphors and examples that relate directly to the everyday life of the creationist. Creationists use aeroplanes and cars as examples; you can use family trees as metaphors for cladistics. (I will use one in my third post to show exactly what I mean.)
A very brief mention of an article that might be relevant: Michael Shermer (2012) has recently written an article for Scientific American in which he suggests that people who believe in one conspiracy theory are more prone to believing in many others. So if you find yourself believing in more than three (what others call) conspiracy theories: start being skeptical immediately, you might just believe bollocks.
Further reading, recommended: (These articles sounded good from the description in Lewandowsky et al.)
Byrne, S., & Hart, P. S. (2009). The boomerang effect: A synthesis of findings and a preliminary theoretical framework. In C. S. Beck (Ed.), Communication yearbook (Vol. 33, pp. 3-37). Hoboken, NJ: Routledge.
Oreskes, N., & Conway, E. M. (2010). Merchants of doubt. London, England: Bloomsbury.
Paluck, E. L. (2009). Reducing intergroup prejudice and conflict using the media: A field experiment in Rwanda. Journal of Personality and Social Psychology, 96, 574-587.
Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755-769.
Thaler, R. H., & Sunstein, C. R. (2008). Nudge: Improving decisions about health, wealth, and happiness. New Haven, CT: Yale University Press.
References:
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. DOI: 10.1177/1529100612451018
Shermer, M. (2012). Conspiracy contradictions: Why people who believe in one conspiracy are prone to believe others. Scientific American, September 2012, p. 77.
EDIT: Edited 24.09.2012, 17:43 to include suggestion 11. Everything else remains untouched.