The Terrible Truths Apocalypse Movies Reveal About Us
Lately, you can't swing an undead cat without hitting the end of the world. Whether it's the pure escapism of The Walking Dead or high art like The Road, every story hits the same beats: it's every man for himself against the horde of zombies or cannibals or nuclear Nazi mutants, and the only ones who survive are the badasses with shotguns willing to shoot their best friends for a Pop-Tart. Even your general post-apocalyptic dystopia movies fall into the same lazy traps -- every city looks like a warehouse or a desert, not because that's an accurate depiction of such a society, but because warehouses and deserts are cheaper to film in.
But whether it's a case of art reflecting society or storytellers accidentally tapping into a human desire -- and running with it until it's as bloated and decayed as their crew of extras -- it says something about us that's pretty uncomfortable to think about. The apocalypse is just the new Old West, but instead of conquering the "savages," we just revel in being way better at savagery. We like the idea of the fall of civilization because it lets us elevate basic human decency to heroic decency and escape the complications of the modern world -- even if most of us would die without it. Which is a problem, because every disaster in the history of ever has shown that it doesn't play out that way at all.
This week, Cracked editors Jack O'Brien, Soren Bowie, and Jason Pargin (aka David Wong) discuss the rote ways in which apocalypse movies and TV shows always get the demise of civilization wrong. Throw on your headphones and click play above, go here to subscribe on iTunes, or download it here.