My sophomore year in college, I took an Introduction to Humanities course (which I enjoyed very much), and was pleased when I walked into class one day and found that my physics professor was giving a guest lecture.
“Doc,” as we called him, was a major science wonk. He held PhDs in physics and mathematics and taught courses in both disciplines, as well as freshman chemistry. No matter what the class, his lectures frequently took unexpected side trips into anthropology, biology, astronomy – well, heck, you name it, and he knew it. Probably the only other person I’ve met who reads as many books per year is Dr. Mike Eades.
Doc’s guest lecture was about the need for general scientific literacy, and he told us at one point, “No matter what field you plan to go into, learn math. Math is how you know when you’re being lied to.”
I agree wholeheartedly. Trouble is, math just plain scares a lot of people. Stick a couple of x’s and y’s into an equation, and they get a case of brain-freeze that can otherwise be produced only by gulping a Slurpee.
After researching Fat Head, I’m convinced most reporters are prone to Slurpee Syndrome. When they report on this-or-that new study, they don’t want to strain their brains by poring over the actual data. So they just read the abstract and the author’s conclusion, then write the story. That’s how a lot of bad science becomes embedded in our consciousness.
There are notable exceptions, of course. Gary Taubes thoroughly analyzes the hard data – but Gary has a degree in physics from Harvard. That’s why I found it laughable when some media-darling doctors claimed his hypotheses about adaptive thermogenesis and homeostasis would violate the laws of thermodynamics. Yeah, right … the guy with the degree in physics forgot all about the laws of thermodynamics. Luckily for us, there were actual doctors on hand to set the record straight.
To restate Doc’s warning – with a little Mark Twain stirred in – if you can’t do the math, researchers with an agenda will use lies, damned lies and statistics to bamboozle you. Let’s look at how they do it.
Just flat-out lie about the results. Thirty years after the start of the famous Framingham study, the authors of a new report stated that “The most important overall finding is the emergence of the total cholesterol concentration as a risk factor for coronary heart disease in the elderly.” In plain English: high cholesterol kills old people.
Just one little problem: the hard data showed no correlation at all between heart disease and cholesterol in the elderly, as Dr. Uffe Ravnskov pointed out in his excellent book “The Cholesterol Myths.” If you look at the actual data points, they’re all over the place.
Perhaps hoping to clarify the issue, the American Heart Association put actual numbers on the claim: “The results of the Framingham study indicate that a 1% reduction of cholesterol corresponds to a 2% reduction in CHD risk.” But when Dr. Ravnskov crunched the data, he found that the Framingham subjects whose cholesterol decreased over time actually had a higher rate of heart disease, not a lower one.
With this apparent inability to recognize if numbers are going up or down, I respectfully suggest that the Framingham researchers resign their positions and go work for a congressional budget committee. (I also respectfully suggest that reporters who couldn’t be bothered with examining the data start covering school-board meetings.)
Scare people with percentages. When you see a percentage, you’re looking at the results of multiplication or division. But when you see the word “difference,” you are – if the researcher is honest – looking at simple subtraction. If a value goes from 20 to 22, it’s an increase of 10%, but the difference is 2. (Still with me? Good; you have a functioning brain.)
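To see the arithmetic in plain terms, here's a tiny Python sketch of the example above – same numbers, nothing assumed beyond them:

```python
# A value rises from 20 to 22 (the example from the text).
old, new = 20, 22

# "Percentage increase" uses division -- it rescales the change.
percent_increase = (new - old) / old * 100

# "Difference" is simple subtraction.
difference = new - old

print(f"Percent increase: {percent_increase:.0f}%")  # 10%
print(f"Difference: {difference}")                   # 2
```

Same two numbers, two very different-sounding results – which is the whole trick.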
Multiplication and division can produce big, impressive-sounding percentages that are in fact nearly meaningless. Here’s an example that helped enshrine the “cholesterol kills” theory:
After a major study with the acronym MRFIT was concluded, the researchers announced that people with high cholesterol were over 400% more likely to die of heart disease. Ohmigosh!! Get me into an Ornish program, now! I must reduce my cholesterol!
That’s a big, scary number. Let’s see how they came up with it.
Over the course of the study, 0.3% of the men whose cholesterol was below 170 died from heart disease. Meanwhile, 1.3% of the men whose cholesterol was over 265 died of heart disease. Over 265?! Dead man walking! Buy your casket now and save!
And in fact, since 1.3/0.3 ≈ 4.33, you could say the relative risk is over 400%.
Now flip the numbers and look at the actual difference. In the low cholesterol group, 99.7% did not die from a heart attack. Among the very high cholesterol group, 98.7% did not die from a heart attack. That’s a difference of 1.0%. In other words, if you go up the scale from low cholesterol to very high cholesterol (nearly 100 points higher), the real difference is that an extra 1 in 100 men died of heart disease. Not quite such a scary number, is it?
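Here's the same MRFIT arithmetic worked out in Python, using only the rates quoted above:

```python
# CHD death rates from the MRFIT example, as a percent of each group.
low_chol_deaths = 0.3    # cholesterol below 170
high_chol_deaths = 1.3   # cholesterol above 265

# Relative risk: division produces the big, scary "over 400%" number.
relative_risk = high_chol_deaths / low_chol_deaths   # about 4.33

# Absolute difference: subtraction produces the honest number.
absolute_difference = high_chol_deaths - low_chol_deaths   # 1.0 percentage point

# Flip it around: the share of each group that did NOT die of heart disease.
survived_low = 100 - low_chol_deaths     # 99.7%
survived_high = 100 - high_chol_deaths   # 98.7%
```

Division gives you a headline; subtraction gives you one extra death per 100 men.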
Wow people with percentages. Percentages work in the other direction, too. You’ve probably seen the Lipitor ads where Pfizer announces that this wonder drug reduces heart attacks by 36%. That sure sounds impressive … until you look at the actual difference.
In the study cited by Pfizer, men with known risk factors for heart disease took either Lipitor or a placebo. In the placebo group, barely more than 3% had a heart attack. In the Lipitor group, 2% had a heart attack. Use division, and you get that impressive 36% reduction. But the difference, once again, is 1 in 100, or 1%. Boy, that’s worth giving your liver a major smack-down.
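And the Lipitor math, sketched the same way. The text says the placebo group's rate was "barely more than 3%," so I'm assuming 3.1% here just to make the division come out near 36% – the exact figure isn't in the passage:

```python
placebo_rate = 3.1   # assumed: "barely more than 3%" had a heart attack
lipitor_rate = 2.0   # 2% had a heart attack

# Relative reduction: division gives the number in the ad.
relative_reduction = (placebo_rate - lipitor_rate) / placebo_rate * 100  # ~36%

# Absolute reduction: subtraction gives the real difference.
absolute_reduction = placebo_rate - lipitor_rate   # about 1 in 100
```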
And by the way, the difference in the heart-attack rate for women who take statins and women who don’t is: zero. You can multiply that difference, divide it, square it, triangle it, stick it inside a trapezoid, whatever … you still can’t come up with a reason for women to take statins – ever.
Count only the numbers you like. (Otherwise known as “cherry picking” your data.) In trials conducted on statins, researchers will happily announce that fewer people died of heart disease. (Not a whole lot fewer … see above.) But they’re curiously silent about how many people died overall. In fact, they refuse to release what’s called the “all-cause mortality” data. Gee, I wonder why?
If I conduct a five-year study in which I give one group a daily glass of orange juice, and another group a daily glass of rat poison, I can guarantee you that far fewer people in the rat-poison group will die from heart disease. But I doubt the all-cause mortality numbers would help sell rat poison as a new wonder drug.
(And in fact, studies of the first cholesterol-lowering drugs were stopped because too many subjects were dying of cancer … but at least they didn’t get heart disease.)
Here’s another example of cherry-picking data: A while back, newspaper headlines were practically screaming that smoking cigars is nearly as dangerous as smoking cigarettes – you know, high rates of cancer and all that. The American Cancer Society jumped all over the report. And since I enjoy two or three cigars per week while taking my five-mile walks, this one grabbed my interest.
As it turns out, the researchers compiled their data from men who smoke five or more cigars per day. Now, a good cigar can easily take an hour to smoke. So unless these guys were standing outside half the day, they were lighting up indoors and inhaling a lot of smoke. And I’m pretty sure anyone who smokes five cigars a day isn’t exactly a health nut. Lord only knows what else these guys put into their bodies.
Most cigar smokers don’t even average one per day. When other researchers compared men who smoke one cigar per day with nonsmokers, the difference in cancer rates was insignificant. (The cigar smokers were also more likely to have Austrian accents and be elected governor of California.)
Confound it! In a real, true, worthwhile study, you compare large groups of people who are statistically identical except for one variable: one group is taking a drug, or adding fiber to their diets, or meditating while listening to Yanni music. Everything else should be the same.
If everything else isn’t the same, you’ve got confounding variables, which means your study data is worthless. You can’t just compare x. You’ve got to compare x while making sure that a, b, c, y and z are virtually identical.
Dean Ornish claims a lowfat diet prevents heart disease. Why? Because he put people on his diet, and by gosh, they had fewer heart attacks than the control group. But Ornish also had the study group stop smoking, start exercising, and take classes in stress management. That’s pretty convenient, considering that smoking and stress are two of the biggest causes of heart disease.
Call me crazy, but I’m pretty sure if I took a group of smoking, stressed-out couch potatoes and had them stop smoking, start exercising, meditate while listening to Yanni, and take up chewing tobacco, their rate of heart disease would plummet. Then I could claim that chewing tobacco prevents heart disease. Heck, I’d probably get a lifetime pass to NASCAR races.
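You can watch a confounding variable steal the credit in a toy simulation. Everything here is made up – the risk numbers are pure assumption, and in this pretend world the diet does literally nothing – but it shows how the comparison goes wrong when two things change at once:

```python
import random

random.seed(42)

def heart_attack_risk(smokes):
    # Toy model (pure assumption): smoking drives the risk; diet does nothing.
    return 0.20 if smokes else 0.05

N = 10_000

# Control group: keeps smoking, eats as usual.
control = [random.random() < heart_attack_risk(smokes=True) for _ in range(N)]

# Study group: put on the diet AND told to quit smoking (the confound).
study = [random.random() < heart_attack_risk(smokes=False) for _ in range(N)]

control_rate = sum(control) / N   # roughly 0.20
study_rate = sum(study) / N       # roughly 0.05

# Naive conclusion: "the diet" cut heart attacks by 75% --
# even though, by construction, the diet did nothing at all.
```

The only honest version of this study changes one variable and holds the rest fixed.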
So the next time you see a newspaper headline announcing that some new study “proves” this or that will kill you – or save you – ask yourself a few questions: What exactly did they count? What didn’t they count? Are we looking at a percentage change, and if so, how did they calculate it? What’s the real difference? Did they control their variables? And – most importantly – do the raw numbers actually support the conclusion?
And Doc, if you’re still alive, I want you to know you were one of the best teachers I ever had, at any level. You told me once I was good at math and should look into programming computers for a living, which I thought was a silly idea at the time. Now it’s how I pay the mortgage. It’s also how I paid for Fat Head.
You were even more brilliant than I thought.
If you enjoy my posts, please consider a small donation to the Fat Head Kids GoFundMe campaign.