by Matthew Warren edited by O Society September 12, 2019
If you hear an unfounded statement often enough, you might just start believing it’s true. This phenomenon, known as the “illusory truth effect,” is exploited by politicians and advertisers — and if you think you are immune to it, you’re probably wrong.
Earlier this year we reported on a study that found people are prone to the effect regardless of their particular cognitive profile.
This doesn’t mean there’s nothing we can do to protect ourselves against the illusion. A study in Cognition finds using our own knowledge to fact-check a false claim can prevent us from believing it is true when it is later repeated. However, we might need a bit of a nudge to get there.
The illusory truth effect stems from the fact that we process repeated statements more fluently: we mistake that feeling of fluency for a signal that a statement is true.
And the effect occurs even when we should know better, that is, when we repeatedly hear a statement we know is wrong.
For instance, take the statement “The fastest land animal is the leopard.” Even if you know the cheetah actually holds that title, hearing the claim often enough can make it feel more true.
Nadia Brashier at Harvard University and colleagues wondered whether asking people to focus on the accuracy of a statement could encourage us to use our knowledge instead, and avoid relying on mistaken feelings of fluency.
In the initial study, the team first asked 103 participants to read 60 widely-known facts, some of which were true (e.g. “The Italian city known for its canals is Venice”) and some of which were false (e.g. “The planet closest to the sun is Venus”).
One group rated how interesting each statement was, while the other rated how true it was. Then, in the second part of the study, both groups saw the same 60 statements along with 60 new ones — again a mixture of true and false — and rated their truthfulness.
The researchers found that participants who had focused on how interesting the statements were in the first part of the study showed the illusory truth effect: they subsequently rated the false statements they had already seen as more true than the new false statements.
However, the group that had initially focused on the accuracy of the statements did not show this effect, rating both new and repeated false statements as equally true.
This finding suggests that using our own knowledge to critically analyse a statement when we first encounter it may inoculate us against the illusory truth effect. The protection also seems fairly long-lasting: in another experiment, the team found that participants who initially focused on the accuracy of the statements still showed no sign of succumbing to the illusory truth effect two days later.
Remember, though, that considering the accuracy of a statement is only useful if we already have the relevant knowledge (e.g. that the planet closest to the sun is Mercury, not Venus).
In further studies, the team found that rating the truthfulness of more obscure false statements that participants knew little about, such as “The twenty-first U.S. president is Garfield,” did not later protect against the illusory truth effect.
It would be interesting to know whether fact-checking against external sources, such as websites or reference books, is effective at combating the illusion in these cases, given that this requires more effort than simply drawing on knowledge we already have.
Still, simply having the background knowledge needed to counter false claims is not always enough, say the authors. Their results suggest people may need to be “nudged” into actually using their knowledge.
“Education only offers part of the solution to the misinformation crisis; we must also prompt people to carefully compare incoming claims to what they already know,” the researchers conclude.