Tuesday 19th February 2019

Disinformation and faking the news

By Daniel Gerrella

On Monday, the Digital, Culture, Media and Sport Committee (DCMS) published Disinformation and 'Fake News': Final Report.

It calls for:

  • The introduction of a compulsory code of ethics for tech companies
  • A requirement for social media companies to remove harmful content, including proven sources of disinformation
  • The creation of an independent regulator to monitor tech companies, with legal powers to act against those breaching the ethical code
  • Reform of electoral communications laws to combat overseas interference in UK elections

WHAT IS DISINFORMATION?

DCMS defines disinformation as the 'deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain'.

The report points to the ubiquity of social media and its amplification effect as reasons that disinformation is increasing, with news often shared and reposted without fact-checking.

Audiences appear to share these concerns: social media is now the least trusted source of news (trusted by 34%, compared with 60% for traditional media). However, the report is clear that intervention is still needed.

SOCIAL MEDIA COMPANIES BLAMED

Damian Collins MP, Chair of the DCMS Committee, said that companies like Facebook had failed to protect users from harmful content and to respect their data privacy rights. This included sharing users' data without permission and taking inconsistent approaches to sharing data with different app developers.

The report blamed tech companies for hiding behind claims that they were merely platforms with no responsibility for regulating content on their sites. It asked the government to consider a new category for them, which would also see them assume legal liability for that content.

HOW WILL THE CODE OF ETHICS WORK?

The code of ethics will define harmful content, and the regulator will monitor tech companies to ensure that they adhere to it. The regulator will be able to launch legal action if harmful or illegal content is displayed on a site, and companies found guilty of hosting such content will face large fines.

As well as recommending that audiences be made aware the code exists, the report suggests wider work to increase digital literacy. This would be funded by a levy on large tech companies and designed to help audiences make informed decisions about how their data is used and to identify false content.

WHAT ABOUT THE POLITICAL SIDE?

The report also raised concerns about micro-targeted campaigns that were not marked as paid advertising, arguing that these should be clearly identified, with the source and the advertiser referenced. This was partly because evidence was found that adverts aimed at influencing UK elections were being paid for from overseas, with the aim of creating disharmony and destabilising democracy.
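
To make the recommendation concrete, here is a minimal Python sketch of the kind of disclosure record a micro-targeted political advert might carry. The field names and label format are invented for illustration; the report calls for the source and advertiser to be referenced but does not prescribe a schema.

    # A hypothetical disclosure record for a micro-targeted political advert.
    # Field names are invented; the report does not prescribe a schema.
    from dataclasses import dataclass

    @dataclass
    class AdDisclosure:
        advertiser: str        # who placed the advert
        funding_source: str    # who paid for it, and from where
        is_paid_political: bool
        target_criteria: str   # the audience segment used for micro-targeting

    def render_label(ad: AdDisclosure) -> str:
        """Build the user-facing label the report says should be clearly shown."""
        if not ad.is_paid_political:
            return ""
        return (f"Paid political advertising by {ad.advertiser}, "
                f"funded by {ad.funding_source}, targeted at: {ad.target_criteria}")

    ad = AdDisclosure("Example Campaign Ltd", "Example Donor (non-UK)", True,
                      "voters aged 40-60 in marginal constituencies")
    print(render_label(ad))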

A further concern was the targeting of adverts based on inferred data: assumptions a platform derives from users' preferences and interests rather than anything they have stated directly. The report argued that, from a privacy perspective, this inferred data should be protected as strongly as personal information.
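
The distinction matters because inferred traits are never supplied by the user. The following Python sketch, with entirely invented signals and traits, illustrates how a trait a user never disclosed can still be derived from behaviour and then targeted by an advertiser.

    # A hypothetical illustration of 'inferred data': traits derived from
    # behaviour rather than anything the user stated. All rules are invented.
    from typing import Dict, List

    INFERENCE_RULES: Dict[str, str] = {
        "follows_political_pages": "politically_engaged",
        "reads_finance_articles": "high_income_likely",
        "watches_parenting_videos": "parent_likely",
    }

    def infer_traits(activity: List[str]) -> List[str]:
        """Derive unstated traits from observed behaviour signals."""
        return [trait for signal, trait in INFERENCE_RULES.items()
                if signal in activity]

    def eligible_for_ad(traits: List[str], ad_target: str) -> bool:
        """Advertisers target the inferred trait, not a disclosed fact."""
        return ad_target in traits

    activity = ["reads_finance_articles", "watches_parenting_videos"]
    traits = infer_traits(activity)
    print(traits)                                    # inferred, never stated
    print(eligible_for_ad(traits, "parent_likely"))  # True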

ALGORITHMS DO NOT HELP

The publication of the report coincides with a separate study that blames YouTube for the rise in 'flat-earthers'. Respondents claimed that they had always believed the Earth was round until they viewed alternative 'evidence' on YouTube.

This was attributed partly to the algorithms behind YouTube's search and recommendations. These algorithms are designed to personalise the experience for users based on their activity and history. The ultimate aim is to get them to spend more time on the platform, driving advertising views.
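
To show the feedback loop in miniature, here is a deliberately simplified Python sketch of a history-based recommender. It is not YouTube's actual system, which is a proprietary machine-learning model, but it demonstrates how ranking by similarity to past viewing pulls a user toward more of whatever they last watched.

    # A toy recommender: rank unwatched videos by how often their topic
    # already appears in the user's history. Catalogue items are invented.
    from collections import Counter
    from typing import List, Tuple

    CATALOGUE: List[Tuple[str, str]] = [
        ("Space documentary", "science"),
        ("Rocket launch replay", "science"),
        ("Flat-earth 'evidence' pt 1", "conspiracy"),
        ("Flat-earth 'evidence' pt 2", "conspiracy"),
        ("Why NASA lies", "conspiracy"),
    ]

    def recommend(history: List[str], k: int = 1) -> List[str]:
        """More watches of a topic -> more recommendations of that topic."""
        topic_weight = Counter(topic for title, topic in CATALOGUE
                               if title in history)
        candidates = [(t, topic) for t, topic in CATALOGUE if t not in history]
        ranked = sorted(candidates, key=lambda v: topic_weight[v[1]],
                        reverse=True)
        return [title for title, _ in ranked[:k]]

    history = ["Flat-earth 'evidence' pt 1"]   # a single conspiracy view...
    for _ in range(2):
        nxt = recommend(history)[0]
        print("Recommended:", nxt)
        history.append(nxt)                    # ...and each click reinforces it

Each viewed video increases the weight of its topic, so one conspiracy view makes the next conspiracy video the top recommendation, which is exactly the 'rabbit hole' effect described below.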

The research found that once one conspiracy video had been viewed, users would be led 'down the rabbit hole' as similar content was suggested (this thread by Guillaume Chaslot explains how it works in more detail, and the steps YouTube is taking to tackle the problem).

WHAT HAPPENS NEXT?

Clearly, the issues are wide-ranging, and it is good to see such a comprehensive report from DCMS (the full report runs to over 100 pages).

The next stage is for the government to respond, which should happen later this year. This will be in the form of the Online Harms White Paper, which will be authored by DCMS and the Home Office.

