If it’s outrageous, it’s contagious. And dangerous.

This is the way the world ends. Not with a bang but an algorithm.

In the New York Times, Zeynep Tufekci describes YouTube’s radicalisation problem. No matter the starting point, it recommends increasingly extreme content.

YouTube has recently come under fire for recommending videos promoting the conspiracy theory that the outspoken survivors of the school shooting in Parkland, Fla., are “crisis actors” masquerading as victims. Jonathan Albright, a researcher at Columbia, recently “seeded” a YouTube account with a search for “crisis actor” and found that following the “up next” recommendations led to a network of some 9,000 videos promoting that and related conspiracy theories, including the claim that the 2012 school shooting in Newtown, Conn., was a hoax.
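Albright’s method is simple to sketch: seed with a search, then repeatedly follow the “up next” recommendations and record the links between videos. Here’s a minimal sketch in Python; the function names and the toy recommendation graph are mine, invented for illustration (a real crawl would scrape YouTube’s sidebar, which offers no official API for this):

```python
from collections import deque

# Toy recommendation graph standing in for YouTube's "up next" sidebar.
# A real crawl would scrape these edges; every ID here is invented.
FAKE_RECOMMENDATIONS = {
    "crisis-actor-search-result": ["conspiracy-1", "conspiracy-2"],
    "conspiracy-1": ["conspiracy-2", "conspiracy-3"],
    "conspiracy-2": ["conspiracy-3"],
    "conspiracy-3": ["conspiracy-1"],
}

def get_recommendations(video_id):
    """Stand-in for fetching a video's 'up next' recommendations."""
    return FAKE_RECOMMENDATIONS.get(video_id, [])

def crawl_recommendations(seed_ids, max_videos=9000):
    """Breadth-first crawl of the recommendation graph from seed videos.
    Returns the set of videos reached and the recommendation edges."""
    visited = set(seed_ids)
    edges = []
    queue = deque(seed_ids)
    while queue and len(visited) < max_videos:
        video = queue.popleft()
        for rec in get_recommendations(video):
            edges.append((video, rec))   # an "up next" link
            if rec not in visited:
                visited.add(rec)
                queue.append(rec)
    return visited, edges

videos, edges = crawl_recommendations(["crisis-actor-search-result"])
print(len(videos), "videos reached via", len(edges), "recommendation links")
```

A breadth-first crawl like this maps out the whole reachable cluster, which is how a single seed search can surface a network of thousands of interlinked conspiracy videos.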

What we are witnessing is the computational exploitation of a natural human desire.

We like conspiracies. We want to know the news THEY don’t want us to see, the products THEY tried to ban, the secrets THEY don’t want us to know. And such bullshit has been around for centuries.

What’s different is that previously, the bullshit wasn’t mainstream. The much-derided media “gatekeepers” ensured that this shit didn’t spread beyond very small groups of people. Extreme and unhinged voices were largely unable to get a platform.

Now, we don’t have gatekeepers. For younger people, YouTube and Facebook are their BBC and CNN, and there’s often an assumption that if it’s on these sites, it must be okay. And it’s not okay. It’s far from okay.

Extremist content isn’t just being uploaded; it’s staying up. Good luck reporting actual Nazis to Twitter, or actual Nazi propaganda to Facebook, or bigotry and hate speech on any social network.

Free speech über alles. Fuck the consequences.

The “if it’s outrageous it’s contagious” approach prioritises the worst of us. It has turned social media into a very dangerous weapon.
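The mechanism is depressingly simple. If a feed ranks purely by predicted engagement, and outrage reliably drives engagement, then outrageous content floats to the top. A toy illustration in Python; the items and weights are invented for the sake of the example, not any platform’s actual ranking model:

```python
# Toy feed ranker: items are scored purely on predicted engagement.
# The weights are made up; the point is only that if outrage
# correlates with engagement, outrage wins the ranking.
items = [
    {"title": "Local council meeting minutes", "outrage": 0.1, "novelty": 0.3},
    {"title": "What THEY don't want you to know", "outrage": 0.9, "novelty": 0.8},
    {"title": "Gardening tips for spring", "outrage": 0.0, "novelty": 0.4},
]

def predicted_engagement(item):
    # Outrage dominates the score, so outrageous content is "contagious".
    return 0.8 * item["outrage"] + 0.2 * item["novelty"]

feed = sorted(items, key=predicted_engagement, reverse=True)
for item in feed:
    print(f'{predicted_engagement(item):.2f}  {item["title"]}')
```

Run it and the conspiracy-bait lands at the top of the feed every time. Nobody had to choose that content; the objective function chose it.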

We’ll be reaping the whirlwind for a long time to come.

