YouTube's Recommendation Algorithm Has Taken (Baby) Steps
Once upon a time, looking up a YouTube video on literally any subject meant you were instantly recommended a slew of conspiracy theory videos that YouTube's algorithm assumed you would also be interested in. ("Want to learn how to crochet a Baby Groot? Might we recommend this two-hour video explaining that the Sandy Hook shooter was a government agent?") YouTube was well aware of their reputation as a place everyone could meet to become collectively and aggressively dumber, so they made a concerted effort to alter their recommendation algorithm to make conspiracy videos less likely to pop up. The good news is that it worked! The bad news is that it only worked a little bit! Baby steps!
A study out of the University of California, Berkeley, found that you're now 40% less likely to get suggested videos with thumbnails of Hillary Clinton in a black cloak, lit only by the red glow of her third eye, with the word "ILLUMINATI?!" stamped diagonally across her face. The shift has also caused a 50% drop in the amount of time people actually spent watching the conspiracy theory videos that did show up in their recommendations.
The weird thing, though, is that the 40% figure was actually higher at first. YouTube initially saw a 50% decrease in conspiracy theory suggestions, which then jumped to 70%. But because these videos are inherently popular enough to spread without any help from the recommendation algorithm, the number eventually settled back down to 40%. That means conspiracy theory videos have an enormous built-in audience that YouTube's tinkering can't stem. People need their daily fix of mind-blowingly stupid lies spread by guys whose expertise on the subject begins and ends with something someone else said on a bodybuilding forum.
YouTube has managed to keep a lot of the flat earth and 9/11 truther stuff from mucking up recommendations, but other types of conspiracy theories are as popular as ever. They're usually the classic harmless ones, like the idea that aliens built the pyramids ... or that climate change is a lie.
Huh, that's a weird one not to have been kicked off a cliff yet. The researchers theorize that it likely has something to do with the fact that YouTube is clearly handpicking which conspiracy theories they'll allow in suggestions and which to expunge. Based on what? Who knows. Public outcry? Random lottery? Google's buddies in the oil industry (who they work with directly to help find more oil) telling them to leave it up? All important questions that we hope someone packs into a YouTube conspiracy video, just to see if it ends up in everyone's recommendations.
Luis can be found on Twitter and Facebook. Check out his regular contributions to Macaulay Culkin's BunnyEars.com and his "Meditation Minute" segments on the Bunny Ears podcast. And now you can listen to the first episode on YouTube!