Monday 17th October 2022

How to deal with mis/disinformation attacks: Understanding the information disorder

In 2020, when the COVID-19 pandemic struck, the terms ‘misinformation’ and ‘disinformation’ entered the public discourse and proliferated on social media platforms. From misinterpreted information to conspiracy theories, information pollution in the online and offline ecosystem has created unprecedented challenges for crisis communications and wider communications teams.

Misinformation is false information that is spread without any intent to cause harm. Disinformation, by contrast, is false, inaccurate or twisted information deliberately spread by an individual, group, organisation or authority with a clear intent to harm its target. In practice it is not that simple, and we need to consider the health of the entire online information ecosystem: false information often sits right beside facts, creating a confused and polluted information environment.

These tongue twisters were initially considered a problem for people living in the global north, and the conversations mostly centred on COVID-19, medicines, treatment and, later, the side effects of vaccines. We soon realised that this infodemic knows no borders. For instance, a fake post, tweet and WhatsApp message showing a Chinese woman eating bats was shared widely across the world in multiple languages. Similarly, a conspiracy theory about COVID-19 vaccines went viral in the United States; a couple of months later, the same theory was spreading like wildfire in West Africa.

It may look like a modern phenomenon, but mis-/disinformation has its roots in the past. Time and again, these techniques have been used in various forms such as propaganda, fake news, conspiracy theories, and even well-orchestrated attacks – both online and offline – to target and influence populations for different objectives.

For humanitarian organisations working in critical contexts, mis-/disinformation attacks in any form can translate into real-life harm: during my own experience of working in the sector, I have witnessed many such attacks against the staff and the communities we work with.

Here are the steps I generally follow when addressing a mis-/disinformation attack. For ease of reference, I will refer to mis-/disinformation, hate speech, trolling and the like collectively as ‘information disorder’.

Create a good understanding of mis-/disinformation

It is important for decision-makers and other key stakeholders, not just the crisis communications team, to understand the potential risks and harms of mis-/disinformation; otherwise cases may not be reported in a timely manner or prioritised effectively. A shared understanding ensures that cases reach the crisis communications team in good time and that management is on board with the suggested response. So cut the jargon and make the issue relatable to your colleagues.

Assessment of the case

If a case of mis-/disinformation is reported to you, do an initial assessment to determine whether the information disorder is worth your attention. Assess whether the attack targets the organisation, its staff or associated stakeholders; we cannot address every piece of mis-/disinformation on the internet. Take a multi-disciplinary approach, looking at the legal, security, operations, communications and community engagement dimensions.

Monitoring the conversations

Social listening – online and offline – helps you collect the data and conversations circulating in the information ecosystem. Understand the flow, the sources, and the platforms where the information disorder is present. You can use tools like Talkwalker, CrowdTangle, TweetDeck, Google Alerts, Spike, and manual Boolean searches to create dashboards and alerts for the team. This ensures that the team does not lose track of the conversations or of case escalation.
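
For instance, a basic Boolean query for a hypothetical organisation – the name here is purely illustrative and the exact syntax varies from tool to tool – might look like: ("Example Aid Organisation" OR "EAO") AND (scam OR fraud OR "fake vaccines" OR "stealing aid"). Starting from a simple query like this, you can refine the keywords as you learn which narratives are circulating.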

Digital verification

In a polluted information environment, it is difficult to differentiate between facts and lies. Digital verification can surface important details, which can be used to understand and then tackle mis-/disinformation. Anyone with a good understanding of digital communication can learn practical verification skills and cross-check images, videos and text using the tools listed in Bellingcat’s Digital Verification Toolkit.

Classify the type of information

Information disorder takes many forms and unfortunately is not as simple as just misinformation and disinformation. First Draft (now the Information Futures Lab) has done a lot of research on this topic and has suggested seven types of mis-/disinformation – Satire/Parody, False Connection, Misleading Content, False Context, Imposter Content, Manipulated Content, and Fabricated Content. The classification may differ a little for your organisation based on your context and experience.
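
If your team logs cases in a simple internal tool, it can help to keep this classification as a fixed list rather than free text. A minimal sketch in Python, using the seven categories named above (adapt the labels to your own context):

```python
from enum import Enum

class InformationDisorderType(Enum):
    """The seven categories of mis-/disinformation described by First Draft."""
    SATIRE_OR_PARODY = "Satire/Parody"
    FALSE_CONNECTION = "False Connection"
    MISLEADING_CONTENT = "Misleading Content"
    FALSE_CONTEXT = "False Context"
    IMPOSTER_CONTENT = "Imposter Content"
    MANIPULATED_CONTENT = "Manipulated Content"
    FABRICATED_CONTENT = "Fabricated Content"
```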

Create defined workflows

Given the highly volatile nature of mis-/disinformation, such attacks require a swift response, or they can turn into a disaster. In a crisis, chaos, confusion and delayed approvals can become the biggest barriers to an effective response. Decide who will be part of the response team and put workflows in place to ensure timely support and approvals.

Evaluate the threat at hand

Every case of mis-/disinformation is unique and requires balanced human judgement. However, keeping the basics of response techniques in mind, the level of risk – low, medium, high or urgent – can be ascertained by evaluating the case against five criteria:

Gravity/Intent to cause harm – What is the direct risk of real harm?

Scale/Virality – How viral is the mis-/disinformation?

Novelty – Is it a new attack or have we seen something like this already?

Target – Who is the target? Organisation, staff, or someone else?

Source – Who is spreading this content? Is the person influential?

You can add other verification techniques to better ascertain the risk involved.
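
As an illustration only – the five criteria come from the list above, but the numeric scores and thresholds below are assumptions you would calibrate to your own organisation – a simple scoring sketch in Python could look like this:

```python
# Illustrative only: scores (0-3 per criterion) and thresholds are assumptions
# to be calibrated to your own organisation and context.

CRITERIA = ["gravity", "scale", "novelty", "target", "source"]

def risk_level(scores: dict[str, int]) -> str:
    """Map per-criterion scores (0 = negligible, 3 = severe) to a risk level."""
    missing = [c for c in CRITERIA if c not in scores]
    if missing:
        raise ValueError(f"Missing scores for: {missing}")
    total = sum(scores[c] for c in CRITERIA)  # maximum possible is 15
    if total >= 12:
        return "urgent"
    if total >= 8:
        return "high"
    if total >= 4:
        return "medium"
    return "low"

# Example: a viral, harmful post from an influential account targeting staff
print(risk_level({"gravity": 3, "scale": 3, "novelty": 1, "target": 2, "source": 3}))  # urgent
```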

Respond to the attack

Now that you have reached this step, you will have an informed classification and threat level evaluation. For low-risk attacks, you can opt for content moderation, monitoring, or basic reporting (in the case of online attacks) of the profile spreading the information disorder. When faced with high-risk mis-/disinformation, you can take the legal route, send escalation requests to the platform, or even propose a debunking campaign to counter it. Broadly, the options range from the lightest touch to the heaviest:

1 - Ignore

2 - Hide/delete comment

3 - Monitor the conversations

4 - Report the content/profile

5 - Verify claims

6 - Seek legal support

7 - Escalate it to the platform

8 - Run a debunking campaign

9 - Hold a press briefing with the journalists

This is not a set-in-stone checklist: different situations may require steps that are not on this list.
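
To make the link between threat evaluation and response explicit, here is a minimal playbook sketch. The low- and high-risk groupings follow the paragraph above; the medium and urgent groupings are illustrative assumptions to adapt, and the numbers refer to the response list in this article:

```python
# Illustrative mapping from risk level to starting response options;
# adapt to your own organisation. Medium/urgent groupings are assumptions.
RESPONSE_PLAYBOOK = {
    "low":    ["1 - Ignore", "2 - Hide/delete comment", "3 - Monitor the conversations"],
    "medium": ["3 - Monitor the conversations", "4 - Report the content/profile", "5 - Verify claims"],
    "high":   ["5 - Verify claims", "6 - Seek legal support", "7 - Escalate it to the platform"],
    "urgent": ["7 - Escalate it to the platform", "8 - Run a debunking campaign",
               "9 - Hold a press briefing with the journalists"],
}

def suggested_responses(level: str) -> list[str]:
    """Return starting options for a risk level (a prompt, not a rigid rule)."""
    return RESPONSE_PLAYBOOK.get(level, ["3 - Monitor the conversations"])
```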

Debrief and document

Now the case has been tackled and you are ready to take a well-deserved break. But the work is not over yet. A well-documented debriefing of the lessons and challenges can help you build a good database of cases and will save a lot of time when a similar attack next reaches your desk. It is always helpful to create a logbook and ‘SitReps’ (Situation Reports – documents that give a quick understanding of the issue and its current status), as they can be good reference points for future investigations.
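
What goes into the logbook will depend on your existing systems; purely as an illustration, a single entry might capture the fields already used in the steps above:

```python
# Illustrative fields for one logbook entry; adapt to your own systems.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class LogbookEntry:
    """One documented case of information disorder."""
    reported_on: date
    summary: str                      # what was claimed, where, and by whom
    classification: str               # e.g. "Imposter Content"
    risk_level: str                   # low / medium / high / urgent
    responses_taken: list[str] = field(default_factory=list)
    status: str = "open"              # open / monitoring / closed
    sitrep_links: list[str] = field(default_factory=list)
```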

Pre-bunking – the underdog

The effectiveness of this technique is often underestimated. However, including pre-bunking in your regular communications strategy can, in many cases, stop mis-/disinformation from brewing in a particular context. With this technique, you identify, assess and fill the relevant and important information voids or gaps that could easily be used against the organisation. Listen to the questions your followers and audience are asking, and fill those gaps with correct and relevant details.

Capacity building

A bigger organisation can afford a proper crisis communications team, but this might not be the case for a small NGO. In the humanitarian sector, you will often find one person handling crisis management for multiple projects, often with limited resources. It is therefore essential to train the people on the front lines of mis-/disinformation and equip them with the knowledge to tackle low- and medium-risk cases themselves. In addition, produce easy-to-follow guidelines and resources that work in tandem with existing crisis management systems.

Applied regularly, these steps can become part of your routine crisis management workflows and will also prepare you for contingencies. However, there is no foolproof solution. Information disorder is here to stay and its impact is becoming ever more visible to the world, so it is high time we dedicated resources to tackling it.

Divya Pushkarna is Disinformation Adviser at Doctors Without Borders (MSF)

‘How to deal with mis/disinformation attacks: Understanding the information disorder’ was originally posted by the CIPR Crisis Communications Network

Read the original post.

Image by PerlaStudio on iStock