Today's cybersecurity headlines are brought to you by ThreatPerspective


Ethical Hacking News

Amnesty's Reckoning: The Role of X in Fueling UK Violence



  • Amnesty International has accused the X platform of fueling racially charged violence following the Southport murders.
  • X's recommendation system prioritizes content that sparks outrage, provokes heated exchanges, and generates engagement without adequate safeguards to prevent harm.
  • Far-right and Islamophobic content from accounts like Europe Invasions was amplified on X, generating a significant number of impressions.
  • Elon Musk's tweets contributed to the discourse, with 46 tweets generating 808 million impressions.
  • Tommy Robinson's posts were also highlighted as a key driver of racist discourse, generating an estimated 580 million impressions.
  • X claims to be committed to keeping its platform safe for all users, but Amnesty argues that algorithmic design and policy choices are more significant contributors to the risks posed by the platform.
  • The controversy highlights the need for greater accountability and stronger enforcement of the Online Safety Act in the UK.


  • In a scathing critique, Amnesty International has accused Elon Musk's X platform of playing a central role in fueling racially charged violence following last year's Southport murders. The human rights organization's assertion is based on an in-depth analysis of the platform's algorithmic design and policy choices, which Amnesty claims contributed to heightened risks amid a wave of anti-Muslim and anti-migrant violence observed in several locations across the UK.

    According to Amnesty's report, X's recommendation system, which was open-sourced in 2023, systematically prioritizes content that sparks outrage and provokes heated exchanges, reactions, and engagement, without adequate safeguards to prevent or mitigate harm. The organization's analysis of the algorithm's source code found that engagement was prioritized over safety, with no mechanism to assess the potential for harm carried by the posts being ranked.
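
    To make the claim concrete, the sketch below illustrates the pattern Amnesty describes: candidate posts are ordered purely by predicted engagement (likes, replies, reposts), with no term that down-weights content likely to cause harm. This is our own illustrative Python, not X's open-sourced ranker; the weights, field names, and probabilities are assumptions chosen only to show the shape of an engagement-only scoring function.

```python
# Illustrative sketch only: not X's actual ranking code, just the engagement-only
# pattern described in Amnesty's report. All weights and names are hypothetical.
from dataclasses import dataclass

@dataclass
class Candidate:
    post_id: str
    p_like: float    # predicted probability the viewer likes the post
    p_reply: float   # predicted probability of a reply (often a heated exchange)
    p_repost: float  # predicted probability of a repost

# Hypothetical weights: the "conversation" signals dominate the score.
WEIGHTS = {"p_like": 1.0, "p_reply": 13.5, "p_repost": 20.0}

def engagement_score(c: Candidate) -> float:
    """Rank purely on expected engagement; there is no harm or safety term."""
    return (WEIGHTS["p_like"] * c.p_like
            + WEIGHTS["p_reply"] * c.p_reply
            + WEIGHTS["p_repost"] * c.p_repost)

def rank_timeline(candidates: list[Candidate]) -> list[Candidate]:
    # Highest predicted engagement first: an inflammatory post with many expected
    # replies and reposts outranks a neutral one regardless of its content.
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    posts = [
        Candidate("neutral_news", p_like=0.20, p_reply=0.01, p_repost=0.02),
        Candidate("inflammatory", p_like=0.05, p_reply=0.12, p_repost=0.08),
    ]
    for c in rank_timeline(posts):
        print(c.post_id, round(engagement_score(c), 2))
```

    Amnesty's point is the absence of any offsetting signal: a safety-aware ranker would subtract a predicted-harm term from a score like this, and the report found no such mechanism in the code it examined.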

    Amnesty cited instances where far-right and Islamophobic content from accounts such as Europe Invasions was amplified on X, generating a significant number of impressions. The organization also highlighted Elon Musk's own role in fueling this discourse, citing his 46 tweets during the UK riots, which generated a total of 808 million impressions.

    Furthermore, Amnesty pointed to another key driver of the racist discourse that contributed to the riots: Tommy Robinson, whose real name is Stephen Yaxley-Lennon. Robinson, a figure widely associated with Islamophobic and extremist content, published posts estimated to have generated over 580 million impressions, according to data from the National Police Chiefs' Council.

    The UK government has launched various reviews and investigations into the events leading up to and following the riots, with a focus on bringing participants to justice. However, Amnesty argues that X's role in fueling this violence must be acknowledged and addressed. "Without effective safeguards, the likelihood increases that inflammatory or hostile posts will gain traction in periods of heightened social tension," Pat de BrĂșn, Amnesty International's head of Big Tech accountability, warned.

    In response to these allegations, X has claimed that it is committed to keeping its platform safe for all users. The company stated that its safety teams use a combination of machine learning and human review to proactively take swift action against content and accounts that violate its rules. However, Amnesty's report suggests that X's algorithmic design and policy choices may be more significant contributors to the risks posed by the platform than the company is willing to admit.
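
    X's stated process, machine-learning classifiers combined with human review, maps onto a common triage pattern in content moderation. The sketch below is a generic, hypothetical illustration of that pattern; the thresholds and decision labels are assumptions and do not describe X's actual systems or policies.

```python
# Generic hybrid-moderation sketch: an ML classifier triages posts, and humans review
# the uncertain middle band. Thresholds and labels are assumptions, not X's real values.
from typing import Literal

Decision = Literal["remove", "human_review", "allow"]

AUTO_REMOVE_THRESHOLD = 0.95   # assumed: high-confidence violations are actioned automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # assumed: mid-confidence cases go to a reviewer queue

def triage(violation_probability: float) -> Decision:
    """Route a post based on an ML model's predicted probability of a rule violation."""
    if violation_probability >= AUTO_REMOVE_THRESHOLD:
        return "remove"
    if violation_probability >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"
    return "allow"

if __name__ == "__main__":
    for score in (0.98, 0.72, 0.10):
        print(score, "->", triage(score))
```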

    The controversy highlights the ongoing need for greater accountability and stronger enforcement of the Online Safety Act in the UK. Dame Chi Onwurah, chair of the Science, Innovation and Technology Committee, has stated that the act "just isn't up to scratch," citing a failure to legislate against the algorithmic amplification of harmful content.

    As the debate over social media regulation continues, it is essential to consider the role of platforms like X in fueling online violence. By examining the platform's algorithmic design and policy choices, we can better understand how these factors contribute to the spread of misinformation and hate speech. Ultimately, this knowledge will inform our efforts to create a safer and more accountable online environment for all users.



    Related Information:
  • https://www.ethicalhackingnews.com/articles/Amyrthys-Reckoning-The-Role-of-X-in-Fueling-UK-Violence-ehn.shtml

  • https://go.theregister.com/feed/www.theregister.com/2025/08/07/amnesty_x_uk_riots/


  • Published: Thu Aug 7 10:58:19 2025 by llama3.2 3B Q4_K_M

    © Ethical Hacking News. All rights reserved.

    Privacy | Terms of Use | Contact Us