6 Ways Driverless Cars Are Going To Kill Lots Of People
You've probably read a few articles about driverless cars over the past couple of years. The technology is coming along quickly, with fleets of test cars already on the roads in some states. It seems like soon we'll achieve the American dream of stuffing our faces and texting all we want while still managing to avoid public transportation.
But the reality is quite different. We're diving into this technology a little too quickly and ignoring all the warning signs about how we are going to screw up on the way to Driverless Car Utopia.
Self-Driving Cars Are More Likely To Get In Accidents -- For Now
By the middle of the 21st century, once we are all puttering around in our Jetson-mobiles, driverless cars will probably live up to their promise of being safer -- up to 90 percent safer than cars that depend on our stupid human brains to control them, in fact. That is undeniably awesome, and would save more than a quarter of a million lives each decade, as well as hundreds of billions in healthcare costs. It will totally make up for driving around in what appears to be a Pokemon that walked in on its parents having sex.
Cannot. Unsee.
But until then, according to the studies we have available so far, driverless cars are far less safe than regular cars. One out of every 12 self-driving cars in California got in an accident over the course of just six months. And a longer study comparing all accidents involving autonomous cars between 2012 and 2015 to those involving regular cars in 2013 found that the former were five times more likely to get into crashes. Even after controlling for the fact that people don't usually report minor dings or fender benders, the self-driving cars were still twice as likely to crash.
And even though there are only a handful of them on the roads, they have already been responsible for one death. In May, a Tesla driver in Florida put his car in self-driving mode and popped a Harry Potter film in his portable DVD player. Then his car decided to make a simple lane change. Unfortunately, it couldn't detect the difference between a clear blue sky and the large white side of an 18-wheel truck, and bam, J.K. Rowling could add another innocent person to her kill list.
Never forget.
If they are going to be perfect one day, why so dangerous now? Well ...
Self-Driving Cars Always Follow The Law -- And That's Bad
Like some kind of ridiculous goody-two-shoes who brags about how well they did on their driving test and the fact that they have never gotten a traffic ticket, autonomous cars are programmed to absolutely always follow the law. Again, one day this means they will be a lot better than people, because they won't be so distracted by rocking out to Whitesnake that they blow through a stop sign.
"I promise this has a perfectly logical explanation, officer ..."
But for now, while they share the roads with cars that still have drivers in them, it means they can't react to ambiguity. NO ONE follows traffic laws all the time. How many times have you sped up to make a red light, or merged into traffic that was going above the speed limit? You have to be able to balance what is legal with what is safe in the moment. Driverless cars can't make judgment calls like that, but they're surrounded by people who do, which will lead to accidents. In fact, all of the crashes in the studies mentioned in the first entry involved some form of human error -- usually a regular car hitting an autonomous one -- because the human driver expected the self-driving car to act like a person would in that situation, not like a computer.
"It probably says 50 under there."
This is why some lawmakers are getting nervous about this new technology. After Uber launched four driverless cars in Pittsburgh in September, a Chicago alderman preemptively proposed an ordinance to ban them in that city. He said he doesn't want the streets of Chicago to be used "as an experiment that will no doubt come with its share of risks, especially for pedestrians." He's probably right; potential pedestrians don't need to add "getting run over" right after "getting shot" on their list of reasons not to walk around Chicago.
People Like Steering Wheels -- But They Make Cars More Dangerous
Right now, there is a fight brewing over how to introduce driverless cars to the populace. Google's driverless cars don't come with steering wheels, but most actual auto manufacturers want theirs to. This way, they can ease people into the idea: You could buy a car knowing that you could put it in self-driving mode on the highway but take over during the more complicated bits of driving. A study by Volvo found that virtually everyone -- 92 percent of respondents, at least -- would want a steering wheel in their first autonomous car. And a report from earlier this year by the U.S. Department of Transportation warned that federal regulations might mean cars without steering wheels weren't street legal.
But turning your dashboard into a giant distracting flat-screen TV is a-okay.
But despite your control-freak need to be able to take over for the car whenever you want, this actually makes cars more dangerous. At first, that sounds crazy. After all, couldn't that guy from Florida have saved himself if he had noticed the car was about to make a mistake and grabbed the wheel? Without a wheel, he (and you) would be at the total mercy of the car.
Forget 10 and two; correct driving procedure is now hands in your lap, clenched in terror.
In reality, the moment of the switchover is rife with problems, much like how the most dangerous part of flying a plane on autopilot is when the actual pilot takes back control. What if a person hasn't been paying perfect attention and doesn't know the exact traffic conditions, or how fast they are going, or that there's a serial killer in the back seat, etc.? And what happens as we spend more and more time letting our cars do all the work and get further out of practice? Plus, getting rid of our ability to break or bend the law brings us that much closer to Driverless Car Utopia. We just need to learn to let go, literally.
They Can Be Thwarted By Bad Weather And Incorrect GPS
One day, all the autonomous cars will be connected to a computer network that lets them know not just where all the roads are, but also where the hazards and other cars are. It will update quickly and pretty much always be current. But until we can get the boring cars piloted by slightly more sentient apes off the streets, individual driverless cars will rely on lasers, cameras, and GPS to know where they are headed. As anyone who has ever driven into a lake or had a quick trip to the train station turn into an 800-mile journey because they were thoughtlessly following their GPS will tell you, this is a BAD idea.
"No, I do NOT need to stop and ask for directions. The mall will be over the next sand dune. You'll see."
A study by MIT released in August of 2014 found that Google's self-driving cars relied so much on detailed maps that they couldn't get around 99 percent of America safely. Obviously, it's gotten a little bit better over the past two years, but it's slow going. People who live in the middle of nowhere are not going to get safe access to driverless cars at the same time as people in major cities, so if they want to be an early adopter, they are going to have to risk crashing into a cow or two.
Of course, that can happen in regular cars, too.
And the lasers and cameras the cars use to make decisions are even worse. As we already saw, they can't tell the difference between the side of a truck and a clear sky, with disastrous results. Google's cars can also be blinded by bright sunlight and completely confused by heavy rain, and the company hasn't even bothered testing them in snow yet. In other words, check the weather before you leave the house, and if the report is "any," you are better off just staying home.
Self-Driving Cars Will Be Programmed To Kill
Even in Self-Driving Car Utopia, there will be unavoidable accidents. No technology is perfect. In that split second, the car has to make a decision about whom to save. That means it has to decide whom to (possibly) kill. And that is why driverless cars will bring us into a brave new world of ethics. Think tanks are already working on this problem, because we're going to have to decide if we want to be driving around in the real-life version of Christine. Look at the image below. In each section, assume that the car swerving to hit the wall would kill the occupants of the car. In which situation would you want a theoretical car to swerve?
Now imagine that YOU are in that theoretical car. Does that change any of your answers? In a study of 2,000 people, 76 percent said that even if they were in a car with loved ones, the car should do what was necessary to save the most people. However, it's easy to be utilitarian in the hypothetical. When asked if they would actually BUY a car programmed to sacrifice them to save others, people changed their minds. Given the choice, people would buy the car that saves them, so that's what companies would want to produce. Unless the government gets involved and mandates that all driverless cars act in a utilitarian manner, good luck to any pedestrians, bicyclists, or people in other cars.
Make the kid cross first, just in case.
And when an accident does happen, whom do you sue (this is America, so you KNOW someone is paying for this shit)? Is it the car manufacturer's fault? The owner's? The pedestrian's? Or do we all just storm Google's headquarters, pitchforks in hand, for making us switch in the first place?
They Are Dream Targets For Hackers And Terrorists
So we've done it. We made it to Self-Driving Car Utopia. All those ridiculous steering-wheeled cars have been thrown into the Grand Canyon to rust. And at last, the autonomous cars can all connect to one big network so everything runs smoothly. Right? RIGHT?
Of course not.
It turns out that all of those benefits become equally huge downsides in the hands of bad guys. Reports have warned that terrorists could exploit this new technology by filling cars with explosives and sending them to a destination with no one inside. And the FBI has already studied how criminals could become more dangerous once they have the ability to shoot at police during car chases without having to watch the road.
And any time you make something computerized or put it on a network, you are opening it up to hacking. Think tanks are already worried about what could happen if hackers took over cars while people were inside them. The best-case scenario is that hackers take control safely but demand money to give it back -- what are known as "ransomware" attacks. More concerning would be taking over cars and using them to kidnap the occupants, or turning them into deadly weapons. Or what if hackers managed to take over the whole system and brought entire cities, or even countries, to a halt?
God help us, WHAT IF WE HAVE TO WALK SOMEWHERE?
But that would probably take some Swordfish-level hacking abilities, right? And since cars won't run on a network for decades, we're at least all safe from this until then, right? Nope! One security company found that you can "hack" a driverless car's systems using ... a laser pointer.
Maybe walking isn't so bad after all.