Today, Mark Zuckerberg announced changes to Meta's platforms, including Facebook and Threads, initially for the US only, because EU law won't allow such changes. The short version: hate speech is back, baby!
You’ll see the details elsewhere, so I won’t repeat them here, but ultimately the goal is to strip safeguards and moderation from Meta’s platforms. That freedom is new for Facebook in the US, but it’s not new for Facebook itself: we saw it in Myanmar, where Facebook was instrumental in genocide.
That’s not just my opinion; it’s the opinion of UN investigators and of Amnesty International too. As Amnesty put it: “While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms.”
Amnesty International again:
Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism…
In one internal document dated August 2019, one Meta employee wrote: “We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook… are affecting societies around the world. We also have compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform.”
Hate speech is the oil of Meta’s business, and Zuckerberg doesn’t care about the human cost.
As tech journalist and long-time Meta critic Ed Zitron writes on Bluesky:
Meta will burn down everything in search of growth. They have been doing so in broad daylight for years. They will make people angry and sad and hateful (and have done so before) in search of growth in their dying platform. They will make everything worse to create growth. It’s a death cult