This is absolutely terrifying: YouTube has a “conspiracy ecosystem”.
YouTube viewers who started searching for information on “crisis actors” — people who supposedly play roles as mass shooting survivors to push gun control — could soon find themselves tumbling down a rabbit hole of conspiracies about the Sept. 11, 2001 attacks, the JFK assassination and Pizzagate, the hoax about a supposed child molestation ring run by Democratic Party luminaries out of a Washington pizzeria.
“It’s a conspiracy ecosystem,” said Jonathan Albright, research director at Columbia University’s Tow Center for Digital Journalism. “It’s growing, not only in size but in depth.”
Exactly the same thing happens on Facebook.
The problem is “trending” content — the material surfaced by Facebook’s and YouTube’s recommendation algorithms, which then leads viewers on to more of the same.
As Frederic Filloux writes in his Monday Note newsletter:
For both YouTube (the world’s main provider of videos) and Facebook (the dominant vector of fake news), solving this problem would actually be easy: kill Trending Topics, which has a terrible track record. But neither tech giant will do that, because that’s where the advertising money is.
That money is mainstreaming extreme views. Some of the people who subscribe to the “crisis actor” bullshit are violent bigots; so if you watch some crisis-actor bullshit, you’re likely to be shown other content relevant to violent bigots. It’s not long before you’re in very disturbing territory.
As the columnist Christopher Mims notes:
Facebook is a unique enabler of extremism, full stop. “If it’s outrageous, it’s contagious” is literally the bedrock, fundamental modus operandi of its engagement-optimizing algorithms.