
They were arrested for posting on social media during the riots – will that change anything?

BBC montage: an iPhone displaying code next to the BBC logo

For Tyler Kay and Jordan Parlour, justice for what they posted on social media was swift and severe.

Kay, 26, and Parlour, 28, were sentenced to 38 and 20 months in prison respectively for stoking racial hatred online during the summer unrest.

The charges that followed the unrest felt like a significant moment: people were facing real-world consequences for what they said and did online.

There was widespread agreement that false claims and online hate contributed to violence and racism on British streets in August. Prime Minister Keir Starmer then said that social media “bears the responsibility” for tackling misinformation.

More than 30 people were arrested over social media posts. According to my findings, at least 17 of them have been charged.

Police will have concluded that some of the people being investigated did not meet the criminal threshold. And in many cases, the legal system may be the wrong way to deal with social media posts.

But some posts that didn't cross the line into criminality may still have had real-world consequences. Yet there has been no day of reckoning for those who made them.

And apparently none for the social media giants either, whose algorithms stand accused, time and again, of prioritizing engagement over safety and amplifying content regardless of the reaction it may provoke.

Businessman and investor Elon Musk (Getty Images)

X owner Elon Musk criticized the British authorities' response to the unrest

At the time of the riots, I had wondered whether this could be the moment that finally changed the online landscape.

Now, I'm not so sure.

To understand the role of the social media giants in all this, it is helpful to first look at the cases of a father in Pakistan and a businesswoman from Chester.

A false name for the 17-year-old suspect was published by a news website called Channel3Now. This false name was then frequently quoted by others.

Another poster who shared the fake name on X was Bernadette Spofforth, a 55-year-old from Chester with more than 50,000 followers. She had previously shared posts raising questions about lockdown and net zero climate action.

Channel3Now and Ms Spofforth's posts also incorrectly suggested that the 17-year-old was an asylum seeker who had come to the UK by boat.

All of this, along with untrue claims from other sources that the attacker was a Muslim, was widely blamed for contributing to the unrest – some of which targeted mosques and asylum seekers.

I discovered that Channel3Now was linked to a man named Farhan Asif in Pakistan, as well as a hockey player in Nova Scotia and someone who claimed his name was Kevin. The site appeared to be a commercial operation designed to increase views and sell ads.

At that time, a person who claimed to be from Channel3Now management told me that publishing the false name was “a mistake and not intentional” and denied that the name came from him.

And Ms. Spofforth told me she immediately deleted her untrue post about the suspect when she realized it was false. She also strongly denied making up the name.

So what happened next?

Farhan Asif and Bernadette Spofforth were both arrested for these posts not long after I spoke to them.

However, the charges were dropped. Pakistani authorities said they could find no evidence that Mr. Asif was the originator of the false name. Cheshire Police also decided not to charge Ms Spofforth due to “insufficient evidence”.

Mr Asif seems to have disappeared. The Channel3Now website and several associated social media pages have been removed.

However, Bernadette Spofforth is now posting regularly on X again. This week alone, her posts were viewed more than a million times.

She says she has become an advocate for free speech since her arrest. She said: “As has now been shown, the idea that a single tweet could be the catalyst for the unrest that followed the atrocities in Southport is simply not true.”

Focusing on these individual cases can provide valuable insight into who is sharing this type of content and why.

But to get to the heart of the problem, you have to take a step back.

While people are responsible for their own posts, I've found time and time again that the problem ultimately comes down to how the social media sites themselves work.

Decisions made under the tenure of Elon Musk, the owner of X, are also part of the story. These include the ability to purchase blue checkmarks, which give posts greater prominence, and an approach to moderation that puts freedom of expression above all else.

The UK's head of counter-terrorism policing, Assistant Commissioner Matt Jukes, told me that X was "a huge driver" of posts that contributed to the summer's unrest.

Matt Jukes, Assistant Commissioner for Specialist Operations (Getty Images)

Matt Jukes has accused X of playing a major role in fomenting the unrest

A team he oversaw called the Internet Referral Unit noticed “the disproportionate impact of certain platforms,” he said.

He says there were about 1,200 referrals – posts reported to police by members of the public – in connection with the riots alone. For him, that is "just the tip of the iceberg". The unit received 13 times more referrals relating to X than to TikTok.

Taking action on content that is illegal or breaks terror laws is, in some ways, the easy part. Harder to combat are the posts that fall into what Mr Jukes calls the "lawful but awful" category.

The unit flags such material to the sites where it is posted if it believes it violates their terms and conditions.

But Mr Jukes found Telegram difficult to deal with because it was home to several large groups organizing riots and sharing hate and disinformation.

According to Mr. Jukes, Telegram has an “iron determination not to engage with the authorities.”

Elon Musk has accused law enforcement in the UK of trying to police opinions on issues such as immigration, and there have been allegations of disproportionate action against individual posters.

Mr. Jukes responds: “I would tell Elon Musk if he were here, we wouldn't arrest people because they have an opinion on immigration. [Police] went and arrested people for threatening or inciting others to burn down mosques or hotels.”

But while those who took part in the unrest and posted hateful content online have faced consequences "at the sharp end", Mr Jukes said that "the people who make billions" from providing the opportunity to post harmful content on social media "have not really paid any price".

He wants the Online Safety Act, which comes into force in early 2025, to be strengthened so that it can better deal with content that is "lawful but awful".

Telegram told the BBC that there was "no place for calls to violence" on its platform and that moderators removed UK channels calling for unrest when they were discovered during the disorder.

“While Telegram moderators remove millions of harmful pieces of content every day, user numbers approaching one billion are causing certain growth issues in content moderation, which we are currently addressing,” a spokesperson said.

I also contacted X, which did not respond to the points raised by the BBC.

X's publicly available policies continue to state that its priority is to protect and defend users' voices.

Almost every investigation I do now focuses on the design of social media sites and how algorithms distribute content that triggers a response, usually regardless of the impact it may have.

During the unrest, algorithms spread disinformation and hate to millions of people, drawing in new recruits and encouraging people to share controversial content for views and likes.

Why doesn't that change? Well, according to my findings, companies would have to be forced to change their business models. And for politicians and regulators, that could prove to be a very big challenge indeed.
