Friday 20th June 2025

Public affairs in the digital age

What public affairs professionals need to know about the power of social media, data and AI in the modern world.

“We shape our tools and, thereafter, our tools shape us.” — Marshall McLuhan, media theorist and author of Understanding Media

The modern global landscape is marked by increasingly complex challenges. From the influence of social media in the 2016 UK European Union membership referendum to the power of anonymity in online extremism, digital communications and technology have an increasing reach and impact, significantly influencing global politics, security and stability.

Communications’ influence in public affairs is far from new. Governments in the first and second world wars regularly used media to influence conflict and political outcomes, but with the rise of social media, messaging has a greater reach and impact. Media used to be centralised and heavily state-controlled – such as with the Ministry of Information (MOI), which regulated wartime communication and produced propaganda – but social media has shifted this power, allowing users to bypass traditional gatekeepers, offering real-time reactions and stories directly from the source. To a certain extent, anyone’s voice can be heard, and the speed at which events are reported can challenge state attempts to control the narrative. So, what could possibly go wrong?

The thing people often forget about social media is that it isn’t shared, it’s owned. Far from being a cooperative, tech companies (Meta, X, TikTok and others) control these sites, deciding what content gets seen, suppressed or banned. These are the new gatekeepers, and they are funded by our data.

  • “The world’s most valuable resource is no longer oil, but data” — The Economist

There is often a lack of transparency around how these companies collect and use data. A 2019 Pew Research Center survey found that 74% of U.S. Facebook users were unaware that the platform tracks their online behaviour for ad targeting purposes. This lack of public awareness about data collection and use is not just convenient but often highly profitable for tech companies. Meanwhile, it can compromise users’ privacy, manipulate their behaviour through ‘microtargeting’ – such as personalised political ads, which some believe undermine the democratic process – and in some cases enable state surveillance or repression.

McLuhan’s Understanding Media argues that the medium often shapes societal behaviour more than the content it carries, because it shapes our consumption habits. Social media is intentionally designed to be addictive. Unsurprisingly, younger generations have disengaged from traditional, more rigorously verified news sources, relying instead on social media for information, despite fearing misinformation and the manipulation of their data. This lack of engagement with verified news sources could have serious consequences for the spread of misinformation, including rising public distrust and polarisation. With around 70% of the world’s population using social media, the implications for public affairs are global.

  • “A public that does not know whom to trust cannot be reliably informed.”— Shoshana Zuboff, The Age of Surveillance Capitalism

The infamous social media algorithm is also problematic, as it can influence users’ emotions and beliefs. A 2019 study by Fazio, Rand and Pennycook found that repeated exposure to content – likely driven by recommendation algorithms, which push material similar to what a user has previously engaged with – creates an “illusory truth effect”, whereby statements are judged as more truthful when repeated, regardless of their plausibility. Repeated exposure can therefore shape public perception and influence political outcomes.

Furthermore, emotional and moral content captures user attention better, and will therefore be boosted even further by the algorithm. This type of messaging, particularly portraying anger and disgust, has been shown to be more effective at impacting users’ information processing and influencing their beliefs, attitudes and political behaviour. These negative emotions tend to be used more by populist parties, giving them the edge online.

In psychology, Professor Paul Gilbert’s The Compassionate Mind describes how these negative emotions can trigger passions and motives of our “old brain” (impulsive/emotional mind), which hijacks our “new brain” (rational mind). “When it does that, we simply find ways to satisfy those desires or find reasons for feeling what we feel, supporting our prejudices.” This lack of awareness can have dire implications, as people are often blind to the impact of their actions or the hidden motivations behind them. Although “through no fault of our own,” Gilbert urges us to “open our eyes to the ease with which we can become deluded and not see the realities we are creating around us.”

Humankind can be creative, innovative and inspiring, but our brains have certain ‘quirks’ which can leave us vulnerable. While many of us believe we know how we work and why we think what we do, we are largely unaware of this cognitive dilemma ― research by organisational psychologist Tasha Eurich suggests that while 95% of people consider themselves self-aware, only 10 to 15% actually are (if you don’t believe it, try taking one of the Harvard Implicit Association Tests that highlight your unconscious bias… you might be surprised). It appears that despite humanity’s increasingly impressive advancements in technology and artificial intelligence, we still lack the same awareness of ourselves, falling victim to our most basic flaws and instincts, but now at a much greater, faster and more dangerous scale.

  • “We are dangerous when we are not conscious of our responsibility for how we behave, think, and feel.” ― Marshall B. Rosenberg, Nonviolent Communication: A Language of Life

Artificial intelligence might be one of the world's hottest contemporary topics, but emotional intelligence ― the ability to understand, manage and respond to our emotions ― is the key to combatting our “old brain” impulses, easily provoked in our modern and hyperconnected world. By understanding our cognitive limitations and developing skills such as self-awareness, self-regulation and empathy, we can mitigate the impact of our "brain misfires" and train our minds to have better resilience, manage emotions more effectively, and address our unconscious thinking traps. This is essential if we want to build a world that works for us instead of inflaming our outdated desires and prejudices.

McLuhan describes technology as an extension of humankind, which can amplify our strengths and weaknesses. The speed of these changes is incredible, and Facebook’s former motto “move fast and break things” has never felt so apt. We must consider that, in this new world, a comprehensive, ethical and emotionally informed understanding of our evolving digital and technological landscape is an essential partner to progress, particularly in the fields of communications, human rights and public affairs. Understanding ourselves is integral to moving forward with awareness and compassion, and to combatting the blind spots of human nature, so often augmented by technology. The future may be artificial intelligence, but our emotional intelligence will be the real decider of how tomorrow unfolds.


Amelia Stokes is a chartered public relations practitioner. Influence previously offered her reflections on the PR industry as part of our CIPR member spotlight series.