5 Things That Only Still Exist Because We're Used To Them
Everything is evolving: We stopped using candles because electric lights were cheaper and more versatile. We stopped using straight razors because disposables were more convenient, and you didn't have to worry about slitting your own throat if you sneezed. We stopped buying porno mags because the internet. But sometimes outdated ideas live on purely by inertia, like...
Lectures Are One Of The Worst Ways To Learn
The word lecture comes from the Latin legere, "to read." You may have noticed, however, that there is very little reading involved in a lecture: A teacher speaks, and students take notes. It should be called a "dicture," but everybody kept giggling. The original lecture was actually the practice of reading aloud to students while they literally copied down every word. That was done because, until very recently, books were prohibitively rare and valuable.
A lecture consisted of a professor reading a book to a class full of students who hurriedly made copies of it in real time, like an in-person BitTorrent. That gradually evolved into people lecturing about the subject, but it's still basically graduate-level storytime.
So why are we still doing it when books are (mostly) cheap, and information is freely available on the internet? Even back in the 1500s, people thought the printing press spelled the end of lecturing, but it persists today, so it must be the best way to learn, right?
A good deal of research has gone into more active learning approaches, and it turns out they can be more effective and require less background knowledge to cover the same material. (In fact, some studies have found that no method of teaching is less effective than lecturing, "including, in some cases, no teaching at all.") That doesn't mean "teaching" is an elaborate scam or anything -- just that maybe, on the structural level, kindergarten has it more together than your sophomore survey class.
Cars Have Engines In Front Because We're Afraid Of Horseless Carriages
The overwhelming majority of cars have a front-engine design. Mid- and rear-engine designs are reserved for fancy sports cars and time machines. Clearly these other designs have some advantages, but there must be some principle of physics that makes putting an engine somewhere else more expensive, right? Like cost equals distance over brand name or something?
Not really. All other things being equal, it makes the most sense for a big, heavy power source to be square in the middle. It's worth pointing out that car design is a complex alchemy of art and science, and there are well-designed cars with the engine located in the front, the back, or probably hovering 10 feet off to the side in some cases -- but playing purely by averages, mid-engine wins out.
The biggest losers are the mechanics who could have charged you for naps while lying under there pretending to work.
Mid-engine cars have better balance and power delivery, as well as a lower "moment of inertia" -- the car's mass is concentrated near its center, so it takes less effort to rotate, and the near-50:50 weight distribution helps it turn and handle better in general. In fact, many high-end manufacturers either don't make front-engine cars at all, or reserve them solely for low-end models.
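If you want the physics in one gulp, here's a crude back-of-the-envelope sketch in Python (every number is a made-up round figure for illustration, not anything off a spec sheet). Moment of inertia is just mass times distance-from-center squared, summed over the parts, so hauling a few hundred kilos of engine from over the front axle to the middle of the car makes the whole thing measurably easier to rotate into a turn:

```python
# Toy model: the car is a handful of point masses along its length,
# with x measured in meters from the midpoint of the wheelbase.
def moment_of_inertia(parts):
    """I = sum(m * x^2) -- lower means the car swings into turns more easily."""
    return sum(m * x ** 2 for m, x in parts)

# Made-up round numbers: body mass split over the axles, 2 m from center each.
body = [(400, -2.0), (400, 2.0)]        # (kg, meters from center)
front_engine = body + [(300, 1.5)]      # 300 kg engine ahead of center
mid_engine = body + [(300, 0.0)]        # same engine, dead center

print(moment_of_inertia(front_engine))  # 3875.0 kg*m^2
print(moment_of_inertia(mid_engine))    # 3200.0 kg*m^2 -- mid wins
```

Same total weight, same parts; the only thing that changed is where the heavy bit sits.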
In the early days of motorized cars, it was easier to convert horse-drawn carriages than to build an entirely new car from scratch. So the first adaptations of this technology put the engine in the same place as the horses it was replacing: in the front.
Everybody was just accustomed to power going in the front. You should be pulled along, as God wants, not pushed -- that's the devil's locomotion. So just to avoid a little intellectual discomfort for their consumers, manufacturers stuck with an objectively worse design. And anyway, that's why the Beetle is the world's best car. That's science. Don't argue.
Tipping Is A Remnant Of A Racist Practice
Tipping seems so obvious: Surely the Mesopotamians tipped their bartenders, or temple prostitutes, or whatever the ancient equivalent of a CrossFit trainer was (torturer?). But the truth is that for most of human history, common people bartered. Coin money was reserved for the ancient world's Rockefellers or Kardashians. So tipping a server with silver or gold would have been the equivalent of tossing in an appreciative Bugatti.
Tipping didn't come to America until after the Civil War, when it was considered "deeply un-American" and opposed by people as diverse as Trotsky, Twain, and Taft. Tipping created an automatic assumption of class division. That crap might fly in Europe, where classism is baked into the culture, but America was the land of equality. It was therefore considered rude to tip someone, since all men were considered equal. All white men, anyway.
Americans had no problem insulting black people with currency. In 1902, a newspaper writer described tipping thusly: "Negroes take tips, of course, one expects that of them -- it is a token of their inferiority. But to give money to a white man was embarrassing to me."
Tipping eventually did catch on in America, because it filled an economic niche created after the Civil War: a way to legally oppress black people. When the first minimum wage laws were passed, they couldn't discriminate based on race -- but most tipped jobs were held by black workers. So the racists of the day, using good ol' American ingenuity, created the "tipped minimum wage" we know and love today. This made it legal to pay anyone who earned part of their income from tips far less than the minimum wage other workers got. Shockingly, this loophole in the minimum wage law was mostly used to underpay black workers.
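For the curious, here's roughly how the modern federal version of that loophole shakes out -- a minimal sketch using the current federal floors ($7.25 regular, $2.13 tipped; many states set their own, higher numbers):

```python
# U.S. federal baseline only; state rules vary widely.
REGULAR_MIN = 7.25   # regular federal minimum wage, $/hour
TIPPED_MIN = 2.13    # what the employer actually has to pay a tipped worker

def tipped_hourly_pay(tips_per_hour):
    # If base pay plus tips falls short of the regular minimum,
    # the employer is legally required to make up the difference.
    return max(TIPPED_MIN + tips_per_hour, REGULAR_MIN)

print(tipped_hourly_pay(10.00))  # 12.13 -- a good night of tips
print(tipped_hourly_pay(1.00))   # 7.25  -- the employer tops you up, in theory
```

In other words, the customer directly subsidizes most of the server's wage, and the boss's legal obligation only kicks in when the tips dry up.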
In modern times, tipped minimum wage is much the same, only now people of all races can get in on that discriminatory action: Anyone in the service industry knows that certain types of people are tipped more, and others are tipped less. It would be illegal to pay your attractive employees more, or to pay your white workers more, but tipping ensures that's exactly what happens anyway.
So why do we keep doing it? Pure habit. Some restaurants have moved away from tipping, raising their servers' salaries to a livable wage. That means they have to increase their prices on paper, even if the total cost doesn't actually go up -- a $20 plate plus a 20 percent tip costs you exactly as much as a $24 plate with no tip line, but only one of those menus makes you flinch. And because customers are shortsighted, that's meant nothing but heartache for the restaurants.
If all restaurants made the switch simultaneously, this wouldn't be a problem. But if you could coordinate the actions of the entire restaurant industry, you could make the McRib permanently available, and that would obviously be the end of the civilized world.
Daylight Saving Time Costs Far More Than It Gives
Other than the annual grumble about a lost hour of sleep, most of us accept Daylight Saving Time as a necessary evil for the greater good. But is it? DST was initially adopted as a measure to save energy during wartime. It first became law during World War I, and then kept popping back up during other wars and energy emergencies. But like many other wartime measures -- say, suspending a huge chunk of our civil rights -- it eventually became a part of regular life.
But lighting no longer comprises a significant portion of household electricity usage; most of it now goes to air conditioning and electronics. So when people leave their workplaces to return home during a hotter time of day, they run their home air conditioning more, and the energy savings aren't so clear.

Until recently, most of Indiana was in the Eastern Time Zone and did not observe Daylight Saving Time. Its switch in 2006 provided the perfect test bed: When pre- and post-DST power usage were compared, it turned out that Indianians (Indianans? People from Indiana?) used more electricity under DST. In that state alone, Daylight Saving Time was estimated to cost the population an additional $9 million annually, and the pollution caused by the increased energy usage carries a social cost (mostly adverse health effects) of between $1.7 million and $5.5 million per year.
The 2005 Energy Policy Act specifically required follow-up research on the efficacy of DST, and it states that the extension should be repealed if the proposed gains are not realized. In classic bureaucratic form, however, Congress has overwhelmingly supported it, continually re-approving and extending DST in spite of scientific research telling them to do the opposite.
Why would our hard-working, honest representatives keep a law on the books that modern science has utterly failed to substantiate? Well, money, of course. When people have more evening daylight, they go out shopping, barbecuing, and visiting friends. This is worth millions to industries like retail stores, barbecue product manufacturers, and friend... industries?
The positive effect on those industries might be a valid argument for keeping DST if the negative outcomes were purely economic, but it's bad in so many other ways. The day of the spring-forward time jump is one of the most hazardous of the year for strokes and heart attacks. It drastically increases traffic accidents (as we already covered here). And by getting up an hour earlier, people are less rested for the entire summer, which in turn creates an increase in "cyberloafing" (read: wasting more time at work slacking off and reading humorous online articles).
No comment.
People also have higher levels of the stress hormone cortisol during DST, and are generally more depressed throughout the summer. It even seems to drive Australians to higher rates of suicide, and those crazy bastards manage to survive in a country with croc-eating pythons.
Grand Juries Actually Serve The Opposite Purpose They Were Intended For
In medieval England, one of the main ways to determine someone's guilt or innocence was "trial by ordeal." In this case, "ordeal" is a hilariously understated way of saying "nearly kill them through boiling water, drowning, or hot irons, and see if God saves them." So you can understand why an accusation even going to trial was a bit of a hassle for the accused. Rather than kill someone every time God was taking a nap, authorities decided to put a check on this system with grand juries, who would decide if a case even warranted going to trial.
As trials became slightly less barbaric, going to one still remained quite a burden for common folk, so the grand jury evolved with the times and was exported to England's colonies. After the U.S. split from England, the grand jury remained a right guaranteed to every American standing in a federal court by the Fifth Amendment. Grand juries were included in the Bill of Rights as a check on prosecutorial power -- remember, our courts operate on an adversarial system, and one of the two adversaries has the considerable resources of the government backing them up. While that's an admirable goal, the modern incarnation of the grand jury has all the drawbacks of an outdated, stupid system while also retaining all the drawbacks of a highly modernized bureaucratic waste of resources. As our founding fathers almost certainly would have put it: They suck balls and they don't protect anyone from shit.
In 2010, federal prosecutors pursued roughly 162,000 suspects in criminal proceedings, and grand juries declined to return an indictment in only eleven of those cases -- about 1 in 15,000. You are literally more likely to be struck by lightning than to have a grand jury refuse to indict you. Unless you are a police officer, that is. As former New York Chief Judge Sol Wachtler put it, "any prosecutor who wanted to could get a grand jury to indict a ham sandwich."
How are prosecutors able to boast such an impressive success rate? Turns out there is practically zero supervision of the grand jury process, and defendants aren't even allowed to be present (much less defend themselves). Grand juries are cloak-and-dagger affairs conducted under a level of secrecy normally reserved for nuclear launch codes and the Colonel's secret ingredient.
And even if everything were out in the open, the usual protections for defendants at trial do not apply to grand jury proceedings. The U.S. Supreme Court has held that the standard for evidence admissible to a grand jury is frighteningly low: The Fourth Amendment (freedom from unreasonable search and seizure) is inapplicable, hearsay counts as valid evidence, and evidence presented before a grand jury can affect the inevitable trial to come. This means that in some states, prosecutors can (and do) bring unreliable witnesses before grand juries and then use that unreliable testimony against the defendant (who never got to question the witness, because defendants aren't at grand jury proceedings).
Despite the fact that grand juries were created to be a bulwark against "ordeals" stemming from false accusations -- getting boiled to death can be a real hassle -- in the modern era they're at best a waste of resources, and at worst an unfair advantage handed to government prosecutors. They're a relic of medieval law that the framers of the Constitution hoped would serve as a shield against the unfounded prosecution of American citizens, but they unfortunately often result in unjust harassment by largely unrestricted prosecutors. The U.S. and Liberia are the only countries in the world that continue the practice. No offense to Liberia, but maybe that's not such a good thing.
John Martin is a history teacher who also does other stuff; you can buy things from him here. Luke Miller used to keep the skies safe as an Air Traffic Controller, but right now he writes dick jokes for Cracked.
For more ideas we might want to rethink, check out 5 Bad Ideas Humanity Is Sticking With Out Of Habit and 5 'Innocent' Things We Do (That Are Environmentally Catastrophic).