Elon Musk’s X: Amplifying Chaos or Ignoring Responsibility?
Amnesty International claims Elon Musk’s X platform played a central role in spreading the misinformation that fueled racially charged violence after the Southport murders. The report argues that X’s recommendation algorithm prioritizes engagement over safety and lacks adequate safeguards against harm, and it concludes that, despite the platform’s efforts to address these problems, X’s design and policies still pose significant human rights risks.

Hot Take:
Elon Musk’s X platform: making algorithms great at inciting riots since 2024! If you’re looking for a place where misinformation spreads faster than a cat meme, look no further. It seems X’s algorithm is less about curating safe content and more about starting social media bonfires. But hey, at least they have a fancy new name, right?
Key Points:
- Amnesty International blames X’s algorithm for amplifying misinformation that fueled UK riots.
- Elon Musk himself allegedly amplified far-right content to his millions of followers.
- Nearly 1,900 arrests were made in connection with the unrest that followed the Southport murders, some of them for posts inciting violence.
- The UK’s Prevent program faced criticism for failing to intervene with the attacker despite earlier referrals.
- Calls for better regulation of algorithm-driven content amplification are growing.