LONDON — Within hours of a stabbing attack in northwest England that killed three young girls and wounded several more children, a false name of a supposed suspect was circulating on social media. Hours after that, violent protesters were clashing with police outside a nearby mosque.

Police say the name was fake, as were rumors that the 17-year-old suspect was an asylum-seeker who had recently arrived in Britain. Detectives say the suspect charged Thursday with murder and attempted murder was born in the U.K., and British media including the BBC have reported that his parents are from Rwanda.

That information did little to slow the lightning spread of the false name or to stop right-wing influencers from pinning the blame on immigrants and Muslims.

“There’s a parallel universe where what was claimed by these rumors were the actual facts of the case,” said Sunder Katwala, director of British Future, a think tank that looks at issues including integration and national identity. “And that will be a difficult thing to manage.”

Local lawmaker Patrick Hurley said the result was “hundreds of people descending on the town, descending on Southport from outside of the area, intent on causing trouble — either because they believe what they’ve written, or because they are bad faith actors who wrote it in the first place, in the hope of causing community division.”

One of the first outlets to report the false name, "Ali Al-Shakati," was Channel 3 Now, an account on the X social media platform that purports to be a news channel. A Facebook page of the same name says it is managed by people in Pakistan and the U.S. A related website on Wednesday showed a mix of possibly AI-generated news and entertainment stories, as well as an apology for "the misleading information" in its article on the Southport stabbings.

By the time the apology was posted, the incorrect identification had been repeated widely on social media.

“Some of the key actors are probably just generating traffic, possibly for monetization,” said Katwala. The misinformation was then spread further by “people committed to the U.K. domestic far right,” he said.

Governments around the world, including Britain’s, are struggling with how to curb toxic material online. U.K. Home Secretary Yvette Cooper said Tuesday that social media companies “need to take some responsibility” for the content on their sites.

Katwala said that social platforms such as Facebook and X worked to “de-amplify” false information in real time after mass shootings at two mosques in Christchurch, New Zealand, in 2019.

Since Elon Musk, a self-styled free-speech champion, bought X, the company has gutted teams that once fought misinformation on the platform and restored the accounts of banned conspiracy theorists and extremists.

Rumors have swirled amid the relative silence from police about the attack. Merseyside Police issued a statement saying the reported name for the suspect was incorrect, but they have provided little information about him other than his age and his birthplace of Cardiff, Wales.

Under U.K. law, suspects are not publicly named until they have been charged and those under 18 are usually not named at all. That has been seized on by some activists to suggest the police are withholding information about the attacker.

Tommy Robinson, founder of the far-right English Defence League, accused police of "gaslighting" the public. Nigel Farage, a veteran anti-immigration politician who was elected to Parliament in this month's general election, posted a video on X speculating "whether the truth is being withheld from us" about the attack.

Brendan Cox, whose lawmaker wife Jo Cox was murdered by a far-right attacker in 2016, said Farage’s comments showed he was “nothing better than a Tommy Robinson in a suit.”

“It is beyond the pale to use a moment like this to spread your narrative and to spread your hatred, and we saw the results on Southport’s streets last night,” Cox told the BBC.