Ways Big Tech Doesn't Fix Their Flaws (But Perpetuates Them)
According to a study we found, over 23% of Americans have an internet connection, and half of them use the internet for up to an hour a day (the most recent data we could find was from 1996; presumably it's gone up since then). While we wouldn't trade our internet connection for a private island without Wi-Fi, constant connectivity does have its downsides. Even worse, the internet can cause a special kind of madness when the same companies that claim to be addressing their flaws are actually just perpetuating them further.
Netflix Says They Don't Encourage Binging, But Absolutely Do
Recent events are forcing people to stay inside and watch Netflix for 12 hours a day, as opposed to the mere nine hours they put in during healthier times. Netflix, ostensibly, discourages binge-watching. They recently said it doesn't factor into their data tracking, and they've told the actors promoting Netflix Originals to refrain from using the term. Yes, they want you to watch their programming, but they also want you to go for nice walks and maybe even eat the odd vegetable.
But in 2017, before Netflix's Thought Crimes Department began policing the term, a Netflix press release bragged that 8.4 million subscribers had "binge raced" at least one show, tearing through an entire season within 24 hours of its release, then hastily clarified "before you assume that racers are just basement-dwelling couch potatoes, know that for these super fans, the speed of watching is an achievement to be proud of and brag about. The TV is their passion and Binge Racing is their sport." Truly, it takes the spirit of an elite athlete to power through that third bag of Doritos despite the growing heart palpitations. Netflix absolutely does encourage people to stay in front of a screen all day, which isn't a hard task given that 70% of Americans prefer to power through shows instead of pace themselves.
Binging has been linked to restless sleep, bad eating habits, cardiovascular problems, blood clots, mood disorders, loneliness ... it's all the usual problems of a sedentary lifestyle, plus late-night blue light exposure that shreds your circadian rhythm. The same problems come with long stretches of gaming, extended erotic Cats fanfic writing, or any other stationary activity, but Netflix pushes the habit harder than perhaps anyone. That's why they autoplay new episodes, that's why they autoplay trailers, and that's why they dragged their feet on providing the option to disable both. That's why they experimented with a random button to get you started on something -- anything -- and that's why they tried letting kids earn "patches" for watching episodes until angry parents shut that down. Shows are even written with the assumption that you finished the last episode seven seconds ago, not seven days ago.
It's nice to get a quiet moment once in a while, but ideally without training our kids for a lifetime of compulsion.
Until health concerns started rolling in, Netflix called binging a "universal value." Investors were told it was the crux of their business model and a key part of their long-term strategy, among other accolades. At least Hulu and Disney+ are using ads and features that explicitly court viewers happy to risk muscle atrophy, rather than saying "Remember not to do what our business encourages and needs you to do!" Don't get us wrong, we're as guilty of binging as anyone, but denying its side-effects is like KFC claiming that the customers who eat their food five times a week are all just bulking up for a bodybuilding competition.
Facebook Wants To Be The Ultimate Source Of News, But Can't Be
A Pew study found that Facebook is both one of our largest sources of news and one of our least trusted sources of news, which is like having your cake and wondering if it's full of poison too. Another recent study, in Nature, accused Facebook of spreading falsehoods more effectively than any other website that isn't MLKdid911.com. So it's difficult to look at Facebook's new News service, which promises to be the ultimate amalgamation of knowledge, with a boundless sense of optimism. Just a few months before Facebook rolled out tests, they started working with the Daily Caller -- which denies climate change and has ties to white supremacism, among other fun editorial stances -- on a fact-checking initiative, which is like inviting a fox to join your shiny new henhouse patrol.
Facebook has rolled out big ad campaigns, sleek videos, and impassioned press releases about how they're dedicated to combating misinformation, but a significant chunk of their userbase uses Facebook for the misinformation. If Facebook is where you go to agree with other people that COVID-19 was created in a lab funded by George Soros to discredit Donald Trump's presidency, enforcing the borders of reality will drive some people away. Around the same time that Facebook was pledging to be more responsible, the head of their news feed was giving a mealy-mouthed explanation to CNN about why they gave Alex Jones a massive platform to spread Sandy Hook hoaxes.
Jones was later banned, but Facebook is simultaneously aspiring to be a bastion of accuracy and every internet user's central hub of information, even if that user likes to talk about how vaccines will make your penis shrink. This is a company that managed to bungle their stance on Holocaust denial when Mark Zuckerberg said it should be allowed on Facebook because "I don't think they're intentionally getting it wrong." They had to walk that one back in a hurry.
News is also not fundamentally important to Facebook's business, and previous experiments -- like the time their inflated video metrics annihilated a bunch of websites and beloved comedy video creators -- could be considered, shall we say, oopsie doopsies. Facebook cares about news in the sense that they don't want their mishandling of it to make them look bad, but it's not paying their bills. Facebook makes money from selling ads, and users who think that colloidal silver will keep the New World Order from reading their thoughts are as valuable as the users who just want to read The Atlantic. So when Facebook says they care about fighting misinformation, just mentally add "when it affects our public image."
Google And Amazon Say They Don't Manipulate Search Results, Do Just That
When you Google, say, "Animal Crossing nude mod," you want to know that you're getting the best results. The details of Google's search algorithm are kept quiet, supposedly so it can't be gamed, but Google swears they "do not use human curation to collect or arrange the results on a page." Except, as a Wall Street Journal investigation determined, they absolutely do just that.
No, they're not surreptitiously promoting one method of seeing Iggly's sweet, supple penguin flesh over another, but they do respond to business interests and government pressure, and those responses have spiked since the 2016 election. They've promoted major advertisers over their less lucrative competitors (which Google said they would never do), blacklisted sites from appearing in certain searches (which Google also said they would never do), and helped Russia and Turkey, among other countries, censor search results (which Google never denied doing, because their official motto shifted from "Don't Be Evil" to "How Many Yachts Will Evil's Money Buy Us?" a while back).
Google, like Facebook, is torn between needing everyone to use their service and acknowledging that some people use their search engine to demand information on why abortion doctors are murdering toddlers. Employees and executives have "disagreed" on how active a role Google should take in giving people easy access to complete bullshit, but there's certainly no hesitation in stomping on the competition: employees at DealCatcher, a coupon site, woke up one day to find their traffic cut by 93% for no apparent reason, and Google never provided an explanation.
Amazon, meanwhile, changed their search engine to prioritise their most profitable products, rather than the products with the best reviews or strongest sales. This move wasn't publicised either, and it too followed a lengthy internal debate, with some of the stiffest opposition coming from lawyers worried about getting hammered by antitrust lawsuits. Some of the more extreme ideas were walked back, but the world's largest retailer is now explicitly favouring its in-house brands, sales of which are expected to jump from $2 billion in 2018 to $31 billion by 2022. We'll just have to wait and see how many small businesses get curb-stomped along the way.
YouTube Says It's Fighting Far-Right Content, Isn't
YouTube -- the site where you can watch a video about the history of Doom and then be recommended "Are SJW Game Developers DOOMing Western Civilisation? An Exclusive Look Inside The Conspiracy To Destroy Gamers' Rights!" -- has a bit of a problem with fringe content. In June 2019, YouTube began cracking down on far-right "content" creators -- Holocaust deniers, conspiracy theorists, and their screeching ilk. Then, in August, criticism snapped YouTube's twig of a backbone and they reinstated two far-right channels.
What were they, exactly? Oh, just one run by a neo-Nazi who has ties to the Christchurch shooter and thinks there's a global conspiracy to exterminate white people, and another run by a white supremacist who's calling the COVID-19 pandemic the "Holocough" because he claims immigrants are going to use the crisis to violently gain more power in Britain. So, you know, not exactly mainstream stances that two functional adults can have a healthy disagreement over. YouTube said suspending their accounts was the "wrong call."
After being badgered for comment on the basis of "Hey, what the hell, you guys," YouTube eventually stated "We realise that many may find the viewpoints expressed in these channels deeply offensive," "many" presumably including historians, political scientists, and anyone with a functional brain stem. They then essentially said "But hey, that's life, what are you gonna do?" Again left unstated was the fact that seeing content from an alternate universe is the only reason many people use their service. When PewDiePie, their most popular creator, is endorsing a channel that uses racial slurs while promoting white supremacist talking points, enforcing remedial history will cost you money.
Meanwhile, a growing number of studies are linking YouTube to far-right radicalisation, because vulnerable, impressionable people can start watching what they think is an innocuous video about history, then six months later be telling anyone who will listen about how feminism is going to cause North America to be overrun by foreign hordes. If that sounds like hand-wringing, here's the story of a 15-year-old who used YouTube to watch Call of Duty highlights, then followed its winding trail of recommendations into Holocaust denial.
So YouTube is cracking down on extremist content... as long as it's unprofitable. As for, say, PragerU, which denies climate change, argues that all Muslims are extremists, teaches that the Nazis were liberals who are now trying to destroy Christmas, and argues that colleges, with their elitist "books" and "accreditation," are just Marxist propaganda factories, well... they have 2.4 million subscribers and a large advertising budget, and while the world they describe may not really exist, the money they make for YouTube certainly does.
Mark wrote a book and is on, ugh, Twitter.