4 Famous Statistics It Turns Out Someone Just Made Up
They say 80 percent of statistics are made up on the spot. That still leaves up to 20 percent of statistics to be made up in advance, and this category of misinformation spreads especially widely, so beware.
No, you don’t swallow eight spiders a year, and no, mattresses don’t double in weight from all that skin you shed. Be just as much on guard against such other pieces of common knowledge as...
Myth: 70 Percent of Lottery Winners End Up Bankrupt
The Myth: If you hear that 70 percent of lottery winners go on to declare bankruptcy, you’ll delightedly squirrel that fact away, for one reason — sour grapes. You will never win the lottery (because you’ll never buy a ticket, and even buying a ticket wouldn’t raise your chances very much). What a relief then to realize that even if you did win, it would probably ruin your life.
The Reality: Sources attribute that stat to the National Endowment for Financial Education (NEFE), without any details on how that group came up with it. In 2018, responding to all this coverage, NEFE came forward and clarified the matter: They hadn’t come up with it.
They’d hosted a think tank in 2001, and someone there proposed that number, but the group has nothing backing it up. They don’t support it now, and they don’t even remember who it was that said it then. It might have been a financial planner, who was just explaining the potential risk of windfalls but didn’t have any data on lottery winners overall. It might have been a psychologist (the think tank included psychologists).
Another version of the stat you’ll see puts winners’ bankruptcy chances at 30 percent, or “nearly one-third.” Sources attribute this to the Certified Financial Planner Board of Standards, but again, no one is able to name the study that came up with that number, and you’ll see no mention of it on the organization’s own site. The earliest mention we could find of the stat was a book, citing a 2008 news article. Years later, the reporter behind that article was still repeating that stat but linking to random websites that repeated it in a single line, never to details.
You need details because winning the lottery means a lot of different things. If you win up to $150,000, like the winners in one Florida study, that will help you out, but you won’t be set for life. Expand your scope to include people who win millions, like one large Swedish study did, and you’ll find that winners are likely to end up happy. But there’s not much sense in applying either of those conclusions to people who win close to a billion, which is so rare that we’ll never have enough cases to draw general predictions of what that’s like.
We ourselves will gladly share with you stories of lottery tickets that bring people nothing but misery, but that’s not because winning will probably do that. It’s because stories where that does happen are so interesting.
That, and sour grapes.
Myth: The Brain Goes on Developing Till You’re 25
The Myth: At 18, kids have to make huge decisions that will affect the rest of their lives, which is nuts, considering they don’t know anything about anything. The situation becomes even shakier when we learn that rationality can be biologically measured, and 18-year-olds don’t yet have it. The rational part of your brain only finishes developing at 25. Perhaps then no one should get a bank account till they’re halfway through their 20s, and if you’re a 27-year-old thinking of dating a 24-year-old, forget it — you’re preying on a baby imbecile.
The Reality: Sure enough, the prefrontal cortex does go on growing once the teen years end. But there’s nothing significant about age 25. It just so happened that the first big study to use MRIs to analyze brain development used subjects that were 25 years old max. It noted that the brain develops even as someone ages from 24 to 25, but it said nothing about that being the peak, as it studied no one older than that.
Even if growth did peak at 25, size would be a dubious way of deciding when someone becomes rational or mature. Your volume of gray matter peaks when you’re seven years old. The part of your brain associated with perception and consciousness is thickest when you are two years old.
White matter peaks at age 30 and then declines, which might suggest that 30 is when you are fully mature. But then, does that mean someone who’s 40 is less mature or rational than someone who’s 30? Surely not. In fact, the neurons within this white matter go on creating new connections throughout your entire life. Your brain never stops developing. Even when your brain does get weaker, it continues to develop, in the form of increasingly large ventricles — hey, development isn’t always a good thing.
We do need to pick an age at which we consider people to be adults, but that’s something we have to simply decide, by consensus, rather than by measuring brain sizes. Maybe we’ll one day conclude no one’s an adult until they’re 21, or maybe society will shift and kids will all be fully rational at 17. Maybe we can set different ages for different stuff, like we do driving and voting, because different actions require different levels of maturity.
But we can’t base everything on waiting for brain development to end. If we did, we’d have to wait till many years after death, once total brain decomposition has completed.
Myth: Millions of Americans Think Chocolate Milk Comes From Brown Cows
The Myth: Or maybe some people’s brains never develop, based on this stat that circulated in 2017. A survey said 7 percent of Americans think chocolate milk comes from brown cows. Clearly, this shows Americans are uneducated. Or that Americans have lost touch with their rural roots. Or maybe it says something about diet, but whatever it says, it surely says something, right?
The Reality: Here’s our own personal estimate of how many Americans think chocolate milk gets its nature from the color of the cow: zero. Maybe you could ask someone what color cow makes chocolate milk, and they’d answer “brown,” but that doesn’t reveal what they think about the origins of chocolate. It just reveals something about people’s thought process when presented with certain types of questions, and if you point out what’s absurd about that answer, they’ll agree and laugh.
So, the real question here is what this survey was trying to find. Was it, in fact, a language survey on how people respond to trick questions? Was it a survey that aimed to test the educational system? Turns out it was a survey commissioned by a marketing firm that had been hired by the dairy industry. They never released the full survey results because the full results were never important. The important thing was to get everyone thinking about buying milk, and all they needed was one ridiculous stat to announce, and the media would do the rest.
Since they didn’t release the full survey, we don’t know how they worded the question, but we expect it was multiple choice, the way most surveys like this are. A certain percentage of survey respondents pick answers to such questions randomly, and surveys have to anticipate this and accordingly weed out those respondents. Some surveys include ridiculous options for precisely that reason. They should then exclude people who picked the ridiculous answers, not publicize them.
But if the survey presented such options as “black and white” and “brown,” “brown” isn’t even really a ridiculous answer. Many dairy cows are brown. Black and white is the most common (those are Holstein cows), but brown Jersey cows are the most common breed after that, while several other breeds are brown as well.
Oh, and even if a poll of a thousand people genuinely did find that 7 percent believed something, news articles shouldn’t extrapolate that to “about 17.3 million Americans,” as CNN did. While we’re urging people to learn how confectionary works, let’s also make sure we know how significant figures work.
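To see why that precision is unearned, here’s a rough back-of-the-envelope sketch in Python. The 1,000-person sample size and the adult-population figure are our own assumptions for illustration, not numbers from the survey:

```python
import math

# Hypothetical poll: 1,000 respondents, 7% picked the "brown cows" answer
n = 1000
p = 0.07
adults = 247_000_000  # rough U.S. adult population (our assumption)

# 95% confidence interval for a sample proportion: p +/- 1.96 * sqrt(p(1-p)/n)
margin = 1.96 * math.sqrt(p * (1 - p) / n)  # roughly +/-1.6 percentage points

low = (p - margin) * adults
high = (p + margin) * adults

print(f"margin of error: +/-{margin * 100:.1f} points")
print(f"implied range: {low / 1e6:.0f} to {high / 1e6:.0f} million people")
```

Even before worrying about trick-question effects, the sampling error alone puts the “true” number anywhere from roughly 13 to 21 million, so quoting “17.3 million” to three significant figures is false precision.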
Myth: Medieval Peasants Worked Less Than You Do
The Myth: If you feel particularly miserable over having to work for a living, here’s something to make you feel even more miserable: It turns out you’re working more than people were during what’s known as the most miserable and exhausting time in history. Apparently, before centuries of revolution and reform, medieval peasants worked less than you do now, totaling just 150 days of labor each year.
The Reality: The “150 days” stat spread thanks to a book called The Overworked American by Juliet Schor, which cited a 1986 paper by a historian named Gregory Clark. Clark estimated that peasants worked 12 hours a day, so 150 days a year would add up to a hair more than the number of hours the average American works a year, but it’s still 40 percent fewer days.
We’d love to dig into the details of that paper, but we can’t because we don’t have it, as it was a working paper that doesn’t appear to have ever been published publicly. What we do have is Gregory Clark, who’s still alive to comment on the issue. Clark says he’s since revised his estimate: the number of days a peasant worked would be more like 300 days. That’s 40 percent more days than the average modern American and more than double the hours.
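The hour math behind those comparisons is easy to check. Here’s a minimal sketch, assuming roughly 1,790 annual hours for the average American worker (an OECD-style ballpark, our assumption):

```python
# Rough check of the peasant-workload arithmetic
us_hours = 1790             # assumed annual hours for the average American

hours_per_day = 12          # Clark's estimate of a peasant's workday
old_estimate_days = 150     # the figure from the 1986 working paper
new_estimate_days = 300     # Clark's revised figure

old_hours = old_estimate_days * hours_per_day  # 1,800: a hair more than us
new_hours = new_estimate_days * hours_per_day  # 3,600: more than double

print(f"150-day peasant: {old_hours} hours vs. {us_hours} for the American")
print(f"300-day peasant: {new_hours} hours, about {new_hours / us_hours:.1f}x ours")
```

So even the old 150-day figure never meant peasants worked fewer hours than you, only fewer (and much longer) days.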
These are all estimates because we don’t have anything surer than that. We have records from the manors that employed peasants (manorial records, to use a word you will never again see), but they aren’t comprehensive enough. We can’t look at letters or diaries that peasants themselves wrote because peasants were illiterate. So, historians turn to other means, including determining peasants’ purchases and working backwards to calculate the minimum hours of labor that would have funded those purchases.
But let’s say peasants did work just 150 days. Other historians think that figure might not be so off, at least at certain points in history, if peasants worked fewer total hours than Clark estimated, packed into even longer days. Would you work all day for a couple seasons, if it meant longer vacations? Some of you would say no. Some of you might say yes.
The problem was, time off from working for their lord didn’t really mean free time. A peasant back then still had to labor every day just to go on living.
If you, today, have to work 150 days to pay off your own lord (your landlord), that wouldn’t mean you have 200 days free. You still need to be able to afford food, clothes, ping-pong balls and all the other essentials of life. For you, that means earning more money and then buying things. For a medieval peasant, it meant endless daily chores beyond what they did in the lord’s fields.
For starters, those chores included working on your own field, tending to crops and animals so you wouldn’t starve. You’d have to cook that food, and preserve that food. You would have to gather firewood in the winter so you wouldn’t freeze. You’d have to keep repairing your flimsy home so it wouldn’t fall down on you. You’d have to make clothes, then mend clothes, and also wash clothes. Washing meant making your own soap and fetching water from elsewhere, and for really thorough laundry, it meant setting a whole day aside for a trip to the communal river.
When you can buy mass-produced stuff, and use machines, you do a lot less work — particularly when you’re not working.
That’s not to say we’re all currently living the best lives possible. Life could be better. But to convince people life could be better, we don’t have to lie about some bygone Eden.
You know what else is a good argument for improving stuff, besides telling people the world was better before? Telling people the world was worse before. Because if something became better, it can get better still.
Follow Ryan Menezes on Twitter for more stuff no one should see.