Wednesday 2nd October 2019

This is Bulls**t! Retrain your critical thinking in a misinformed world

The term ‘truthiness’ was first popularised by the American comedian Stephen Colbert in 2005 to describe the "truth that comes from the gut, not from the book". It started as a reaction to George W Bush’s decision-making and the public perception of his thinking, but it soon became clear that the concept could be applied to many situations. Now it has sparked serious scientific research.

By David Robson

Norbert Schwarz and Eryn Newman have led much of this work. According to them, truthiness comes from two particular feelings: familiarity (whether we feel that we have heard something like it before) and fluency (how easy a statement is to process).

Importantly, most people are not even aware that these two subtle feelings are influencing their judgement — yet they can nevertheless move us to believe a statement without questioning its underlying premises or noticing its logical inconsistencies.

As a simple example, consider the following question from some of Schwarz’s earlier studies of the subject: How many animals of each kind did Moses take on the Ark?

The correct answer is, of course, zero.

Moses didn’t have an ark — it was Noah who weathered the flood. However, even when assessing highly intelligent students at a top university, Schwarz found that just 12% of people registered that fact.

The problem is that the question’s phrasing fits into our basic conceptual understanding of the Bible, meaning we are distracted by the red herring — the quantity of animals — rather than focusing on the name of the person involved.

Two by Two Equals Five

When we look at media and communications today, what’s shocking is how easy it is to manipulate these two cues — familiarity and fluency — with simple changes to presentation so that we miss crucial details.

In one experiment, Schwarz found that people are more likely to fall for the Moses illusion if that statement is written in a pleasant, easy-to-read font — making the reading more fluent — than in an uglier, italic script that is harder to process.

For similar reasons, we are more likely to believe people speaking in a recognisable accent than those whose speech is harder to understand, and we place greater trust in online vendors with easier-to-pronounce names, irrespective of their ratings and reviews from other users.

Sometimes, increasing a statement’s truthiness can be as simple as adding an irrelevant picture.

Repeat After Me

In one rather macabre experiment from 2012, Newman showed her participants statements about a series of famous figures — such as a sentence claiming that the singer Nick Cave was dead. When the statement was accompanied by a stock photo of him, they were more likely to believe that the statement was true, compared with participants who saw only the plain text.

Perhaps the most powerful strategy to boost a statement’s truthiness is simple repetition. In one study, Schwarz’s colleagues handed out a list of statements that were said to come from members of the National Alliance Party of Belgium (a fictitious group invented for the experiment). But in some of these documents, there appeared to be a glitch in the printing, meaning the same statement from the same person appeared three times.

Despite the fact that it was clearly providing no new information, the participants reading the repeated statement were subsequently more likely to believe that it reflected the consensus of the whole group.

To make matters worse, the more we see someone, the more familiar they become, and this makes them appear to be more trustworthy.

A liar can become an ‘expert’, and a lone voice begins to sound like a chorus, just through repeated exposure. These strategies have long been known to purveyors of misinformation.

"The most brilliant propagandist technique will yield no success unless one fundamental principle is borne in mind constantly and with unflagging attention," Adolf Hitler noted in Mein Kampf. "It must confine itself to a few points and repeat them over and over."

Inoculate Yourself

This new understanding of misinformation has been the cause of serious soul-searching among organisations attempting to spread the truth.

In The Debunking Handbook, John Cook, then at the University of Queensland, and Stephan Lewandowsky, then at the University of Western Australia, offer some solutions. For one thing, organisations hoping to combat misinformation should ditch the ‘myth-busting’ approach, in which they emphasise the misconception and then explain the facts.

The NHS web page on vaccines, for instance, lists 10 myths, in bold, right at the top of the page. They are then repeated, as bold headlines, underneath.

According to the latest cognitive science, this kind of approach places too much emphasis on the misinformation itself: the presentation means that it is processed more fluently than the facts, and the multiple repetitions simply increase its familiarity.

Schwarz is sceptical about whether we can protect ourselves from all misinformation through mere intention and goodwill, though: the sheer deluge means that it could be very difficult to apply our scepticism even-handedly.

When it comes to current affairs and politics, for example, we already have so many assumptions about which news sources are trustworthy — whether it's The New York Times, Fox News, Breitbart or your uncle — and these prejudices can be hard to overcome.

In the worst-case scenario, you may forget to challenge much of the information that agrees with your existing point of view and only analyse material you already dislike. As a consequence, your well-meaning attempts to protect yourself from bad thinking may fall into the trap of motivated reasoning.

Even so, there is now some good evidence that we can bolster our defences against the most egregious errors while perhaps also cultivating a more reflective, wiser mindset overall. Strategies often come in the form of an ‘inoculation’ — exposing us to one type of bullshit, so that we will be better equipped to spot other forms in the future. The aim is to teach us to identify some of the warning signs, planting little red flags in our thinking, so that we automatically engage our analytical, reflective reasoning when we need it.

Cook and Lewandowsky’s work suggests this approach can be very powerful. In 2017, they investigated ways to combat some of the misinformation around man-made climate change — particularly the attempts to spread doubt about the scientific consensus.

Lessons from History

However, rather than tackling climate change myths directly, they first presented some of their participants with a fact sheet about the way the tobacco industry had used ‘fake experts’ to cast doubts on scientific research linking smoking to lung cancer.

They then showed them a specific piece of misinformation about climate change: the so-called Oregon Petition, which claimed to offer 31,000 signatures from people with science degrees, who all doubted that mankind’s greenhouse gas emissions are disrupting the Earth’s climate.

After learning about the tobacco industry’s tactics, most of Cook’s participants were more sceptical of the climate change misinformation, and it failed to sway their overall opinions.

Even more importantly, the inoculation had neutralised the effect of the misinformation across the political spectrum; the motivated reasoning that so often causes us to accept a lie, and thereby reject the truth, was no longer playing a role.

Equally exciting is the fact that the inoculation against misinformation in one area (the link between smoking and lung cancer) provided protection in another (climate change). It was as if participants had planted little alarm bells in their thinking, helping them to wake up and apply their analytical minds more effectively, rather than simply accepting any information that felt ‘truthy’.

This is an edited extract from The Intelligence Trap: why smart people do stupid things and how to make wiser decisions by David Robson (Hodder & Stoughton, 2019).

Robson is a commissioning editor for BBC Future.

A version of this article was first published in Influence magazine, Q3 2019.

Featured image courtesy of Flickr user Rose Davies via CC 2.0.