‘To fight disinformation, we need to understand it’

Could there have been a more topical time to focus on social media disinformation? The day after Donald Trump was sworn in (again) as US President, a live online event on the issue took place, co-organised by West Country Voices.

We saw how Elon Musk’s X became a MAGA mouthpiece during Trump’s 2024 election campaign, disseminating damaging untruths about the Democrats to millions of users. And on Monday 20 January, there they were, Trump and his far-right-hand man, celebrating at the Capitol, four years after that very building was stormed by an insurrectionist mob of Trump supporters who believed false claims that the 2020 election had been ‘stolen’.

Two days before Trump’s inauguration Musk announced on X: ‘From MAGA to MEGA: Make Europe Great Again!’ We’ve already seen him not only megaphoning support for Reform UK, but also backing far-right parties in Europe. In the UK, he has put considerable effort into attempts to damage (and distract) the Labour government with false accusations of inaction over child sex grooming gangs. Right now, we need a resistance movement against a ramped-up campaign of far-right disinformation in UK politics.

Enter our guest for the online event, Renée DiResta. She’s a US-based expert on the way online propaganda and social media mechanics are used by politicians, conspiracy theorists and online rabble-rousers to manipulate public opinion. Her book Invisible Rulers: The People Who Turn Lies Into Reality delves deeply into the subject: how rumours and fake news narratives spread like wildfire online; how algorithms fan the flames; why attempts to debunk fake news online often backfire after being branded ‘censorship’.

DiResta was invited by West Country Voices, The Movement Forward and Open Britain to appear via live video link from the US to answer questions and share her advice on countering disinformation. The session, ‘Importing Disinformation: How can the UK defend trust, truth and democracy?’ on Tuesday 21 January attracted an online audience of nearly 800 people, and the recording is now available to view on YouTube.

In a session lasting just one hour, DiResta could only skim the surface of an incredibly complex and multi-stranded issue. But she certainly made it clear what we’re up against. Also, rather than concluding that ‘the bad guys are winning’, she shared her hopes for an evolution in social media that will make users less vulnerable to disinformation.

Describing her first foray into the distorted world of online opinion-forming, DiResta took us back to 2015 when she led a social media campaign to counter anti-vax propaganda in California. Large numbers of parents in the US state were opting out of having their children vaccinated against common illnesses because of false claims that the jabs were dangerous.

Investigating the source of these beliefs, she found that Facebook’s recommendation systems were pushing conspiracy theories on to parents whose online activity indicated natural concerns about their children’s health. Some were being targeted with terrifying claims that vaccines caused conditions such as autism and even sudden infant death syndrome (SIDS).

The state health authorities didn’t launch an effective counter-attack because they assumed the public respected their professional credentials, and that their official advice carried more weight than the online scare stories. Some dismissed the anti-vax movement as ‘just some people online’. They didn’t understand that the anti-vaxxers were using emotive personal stories – the kind that people tell their friends and neighbours – to spread distrust of vaccines among an already fearful audience. The health officials were communicating general facts through official channels, and that didn’t hit home with worried parents.

Meanwhile, the pro-vax movement’s attempts to tackle and debunk the scare stories attracted online abuse, personal harassment and accusations of attacks on free speech.

DiResta said:

“I was seeing in 2015 the future of what every divisive political campaign was going to be like, going forward.”

Divisive being the operative word – because controversy and conflict are social media ‘gold’ for political influencers who can’t attract large audiences via TV and radio.

She explained:

“Tools for message dissemination are tools of power, and politicians recognise that. Groups that have traditionally enjoyed the reach of broadcast media tend to think of attention as something that is ‘given to you’ – the media covers you, and that’s how facts get out into the world. Other groups that don’t have that same traditional access to broadcast media really invest in building their own infrastructures of amplification and their own means of capturing attention.”

Fast-forward to the last few years, and we’ve seen the exponential growth of social media content generated by individuals and groups seeking a mandate for racist and inhumane policies. Disinformation about immigrants, asylum seekers, unions, benefit claimants and LGBTQ+ activists is used to stoke up self-righteous hatred among social media users who are primed to perceive these people as a threat to their own values, lifestyle and personal identity.

So how do we fight back? That was the question put to DiResta by the CEO of Open Britain, Mark Kieran; his organisation calls for a proportional voting system, the elimination of ‘dark money’ from UK politics and a pushback against the damaging effects of disinformation.

DiResta said:

“Be out there, be visible, be vocal and be counter-speaking as much as possible.”

While acknowledging that algorithms are controlled by the tech platforms and that influencers have their own agendas, she said we should still use the tools and resources at our disposal.

“It is important to be putting out content and to have your message on as many platforms as possible, even if you’re not doing that directly yourself – creating content and enabling people to move it across the ecosystem.

“We’ve seen many, many activist groups grow on social media by leveraging calls to amplify content.

“Things don’t magically go viral – people take actions, and in aggregate and collectively, that’s what creates the signal that the algorithm pushes out to still more people.”

She also encouraged campaigners to appear in online spaces and take part in podcasts where their message will reach a large audience, even if they don’t agree with all the views that tend to be expressed by the host or the more ‘usual’ guests (GB News might be a good example in the UK). While some platforms are rife with vitriolic abuse and should be avoided, others may offer the potential for common ground, along with new networking opportunities. At the very least, the message may land with some who might not be so wedded to their opposition to progressive ideas and factual information.

Should we also be clamouring for action on the all-powerful algorithms that decide what content appears in people’s feeds and what networks are built around them? Algorithms that play into the hands of the extremists by promoting the posts that attract the strongest reactions, and allow bot accounts to create the illusion of widespread ‘real life’ support and engagement? Should governments impose regulations on the tech platforms – or should tech platforms be required to tweak the algo settings themselves?

DiResta said this is a tricky issue. Do we really want governments to set the rules on social media? Do we really trust the tech corporations to have our best interests at heart? In either case, she said, ‘it’s the user who gets screwed’ when the wrong decisions are made.

“Give more power to the user,” is her suggestion.

Throughout the session, DiResta emphasised this essential problem: the mechanics of social media are not visible to individual users or online communities. She talked about ‘bespoke realities’ and ‘information environments’ created by the algorithms’ interpretation of users’ personal data and online behaviour. We’ve already seen how people’s emotions, values and beliefs are treated as fair game by manipulators who claim to be ‘defending’ their interests.

Elon Musk’s takeover of X has led to millions of people migrating to other platforms where they have more control over what they see; Bluesky, which enables users to curate their feeds and decide whether to view posts chosen for them by an algorithm, said it had reached 25.9 million users by the end of 2024. There’s also the Fediverse – a global social network of interconnected servers that allows people to communicate across different platforms – Mastodon is one of the best-known examples.

DiResta has welcomed these decentralised alternatives to the established social media platforms. Her parting shot at the end of the live Q&A session was a call to action for anyone wanting to control their exposure to online disinformation.

“It’s about recognising that we do have the capacity – we have agency here,” she said. “There are a lot of places where the dynamics of the incentives of influencers and algorithm amplification of private companies don’t serve us, but I really do feel that I want people to go and explore some of the new and interesting ways that Bluesky and the Fediverse are trying to envision a better world.

“There’s a bunch of ways to get involved and think about how to shape new spaces as well as proactively engage in the ones you’re already in.”

And when it comes to debunking disinformation, let’s not forget the impact of real-time person-to-person conversations outside the social media bubble. We might be talking to a friend, chatting with a neighbour, or meeting members of the public at a campaigning event; these are all spaces where we can challenge online propaganda. In real life our message may not go viral, but we all have a voice.

Find us on Bluesky
Find our YouTube channel