Rights organization finds X amplified anti-migrant violence in UK after 2024 Southport stabbings

Amnesty International reported Wednesday that social media platform X, formerly Twitter, played a central role in fueling anti-Muslim and anti-migrant violence in the UK following the 2024 Southport stabbings.

The rights organization cited design flaws in the platform’s algorithm that made it prone to amplifying harmful content. Sacha Deshmukh, Amnesty International UK chief executive, said, “The platform’s algorithm not only failed to ‘break the circuit’ and stop the spread of dangerous falsehoods; they are highly likely to have amplified them.”

The group called on UK regulators to strengthen the Online Safety Act, investigate the algorithmic spread of hate on social media, and establish remedies for communities affected by targeted disinformation. It urged lawmakers to recognize that self-regulation by platforms like X is no longer sufficient to protect public safety or fundamental rights.

The report comes one year after 17-year-old Axel Rudakubana killed three young girls and injured ten others at a Taylor Swift-themed dance event in Southport. Within hours of the attack, far-right influencers began spreading false claims on X that Rudakubana was a Muslim or refugee. Although police later confirmed he was born in Cardiff to Christian parents and had no political or religious motive, posts promoting disinformation had already reached millions. Riots targeting Muslims and migrants erupted in cities across the UK in the days that followed.

According to Amnesty International, X’s recommender algorithm, which prioritizes content likely to provoke engagement, systematically elevated posts containing hate speech, falsehoods, and inflammatory language. The group pointed to public posts by Elon Musk and Tommy Robinson that received tens of millions of views despite containing unverified or misleading claims. Robinson’s posts alone reportedly received over 580 million impressions in two weeks.

The report also criticized X’s rollback of moderation practices following Elon Musk’s 2022 takeover. Amnesty noted that safety teams were downsized, banned accounts were reinstated, and human rights safeguards were weakened without sufficient oversight.

Although the Southport attacker had no ideological motive, social unrest escalated after misinformation linked him to immigration. Mobs vandalized mosques, attacked minority-owned businesses, and targeted asylum seekers across England and Northern Ireland. Last year, former Scottish First Minister Humza Yousaf urged the UK government to designate the English Defence League as a terrorist organization, citing its role in organizing riots in Merseyside and spreading Islamophobic rhetoric under the guise of public mourning.

UK regulators have acknowledged the danger. In an open letter last year, Ofcom urged platforms to act immediately to prevent similar violence, warning that forthcoming duties under the Online Safety Act would require stronger content safeguards.

Amnesty warned that despite the scale of the violence and public concern, little has changed. Recent online rumors about asylum seekers being transferred to the Britannia Hotel in Canary Wharf prompted renewed protests, underscoring the risks posed by unmoderated viral content.