Pretty funny, eh? But according to the latest health scare, our bacon-loving lady could be on her way to an early grave. Here’s the headline:
Oh my gosh! I eat a lot of animal fat … I can feel my pancreas swelling up with tumors as I write. I’ve been issued a death sentence, and I know it’s accurate because – hold onto your seats, now – the article included the magic words “study finds” right there in the sub-headline.
And what an amazing study this has turned out to be. So far it has indicated that being overweight in middle age will kill you, a lack of physical activity can increase your odds of breast cancer, red meat will give you colon cancer, alcohol can lead to pancreatic cancer, and fruits and vegetables may protect against lung cancer … uh, but only in men. The study also achieved the amazing feat of indicating that dietary fat may lead to breast cancer – but red meat doesn’t.
Considering how many headlines this study has already produced – with more sure to follow – I’m going to suggest you memorize the name: The NIH-AARP Diet and Health Study. I’m also going to suggest that when you spot an article that cites this study, you bookmark it, download it, print it, and then use the pages to paper-train a puppy.
NIHARP (my shorthand) is one of those big, expensive studies that enables researchers to analyze data, publish research papers, give speeches, and otherwise pay their mortgages for years without ever seeking another grant. In fact, as the media likes to repeat over and over, this is THE LARGEST STUDY OF ITS KIND.
Wow, that must mean we’re looking at some rock-solid science here, right? Hardly.
Because NIHARP is typical in many ways of the studies that scared people away from fat, it’s worth taking a closer look. I downloaded quite a few study documents, including the original food survey, and I’ll try to explain the weaknesses of studies like this while keeping the statistical geek-speak to a bare minimum.
My girls have recently become huge fans of The Sound of Music. So, as the song says, let’s start from the very beginning …
Throughout 1995 and 1996, the investigators mailed a food-frequency questionnaire to 3.5 million members of the American Association of Retired Persons, all aged 50 to 69, who lived in six states (California, Florida, Pennsylvania, New Jersey, North Carolina, Louisiana), plus two metro areas (Detroit and Atlanta). The authors said they chose these areas because they have high concentrations of retired people. I’m guessing that if people retired in California or Florida, it was for the weather, whereas if they retired in Detroit, they couldn’t afford to move.
Here’s the first big problem with the study (the largest of its kind!): the survey itself. In order to determine what people eat, the investigators sent them a list of 120 foods and asked them to answer questions like this:
Over the last 12 months, how often did you eat the following foods? (Ignore any recent changes.)
Whole milk (4%), NOT in coffee, NOT on cereal: Never | 1-6 per year | 7-11 per year | 1 per month | 2-3 per month | 1-2 per week | 3-4 per week | 5-6 per week | 1 per day | 2-3 per day | 4-5 per day | 6+ per day. Portion size: less than ½ cup | ½ to 1 cup | more than 1 cup.
Breads or dinner rolls, NOT INCLUDING ON SANDWICHES: Never | 1-6 per year | 7-11 per year | 1 per month | 2-3 per month | 1-2 per week | 3-4 per week | 5-6 per week | 1 per day | 2-3 per day | 4-5 per day | 6+ per day. Portion size: less than 1 slice or roll | 1 or 2 slices or rolls | more than 2 slices or rolls.
Mayonnaise or mayonnaise-like salad dressing on bread: Never | 1-6 per year | 7-11 per year | 1 per month | 2-3 per month | 1-2 per week | 3-4 per week | 5-6 per week | 1 per day | 2-3 per day | 4-5 per day | 6+ per day. Portion size: less than 1 teaspoon | 1 to 3 teaspoons | more than 3 teaspoons.
Ground beef in mixtures such as tacos, burritos, meatballs, casseroles, chili, meatloaf: Never | 1-6 per year | 7-11 per year | 1 per month | 2-3 per month | 1-2 per week | 3-4 per week | 5-6 per week | 1 per day | 2-3 per day | 4-5 per day | 6+ per day. Portion size: less than 3 ounces | 3 to 7 ounces | more than 7 ounces.
Could you answer a survey like that accurately? I couldn’t. In fact, I didn’t. When I was working for the National Safety Council, some genius in management decided everyone in the company should fill out a survey like this one. On a whole lot of the questions, I needed a box labeled “I have no freakin’ idea.” But there wasn’t one. So I did what all my pals at work did: I guessed.
And I was 25 years old, not 65. My memory was sharp then and it still is, but I couldn’t tell you what I ate last Tuesday, never mind last February. Of the nearly 3 million people who received the NIHARP survey but didn’t return it, how many do you suppose looked at it and mumbled, “I have no freakin’ idea,” then tossed it in the trash?
But around 600,000 people did return the survey, which leads to the second problem: this is a self-selected group that doesn’t mirror the general population.
In the baseline data, it’s obvious that compared to the general population, the survey group is far more likely to be white (over 90 percent), well educated, and non-smoking. The authors admitted they were concerned about the low response rate (about 17 percent), but managed to discern that “a shifting and widening of the intake distributions among respondents compensated for the less-than-anticipated response rate.”
In other words, they declared this cross-section of the population varied enough for a study and decided to keep going. (Gotta pay that mortgage, you know.)
Here’s the third problem: the self-selected group was winnowed down even further by the investigators. Yes, it’s common practice to try to dump incomplete or suspicious data, but in explaining how they determined if a survey was sufficiently complete, they stated, “In calculating our initial cohort sample size of 350,000 we focused on a single nutrient, dietary fat.”
Hmmm … sounds to me like they already had an opinion about which nutrient would wind up being linked to cancer. If they could determine how much fat you ate, you were in. Why fat? Why not sugar, or white flour, or corn flakes?
Nearly ten years after the first survey, the authors mailed a similar questionnaire, along with others that asked about exercise, smoking and medications. Then they compared the respondents’ diets with their rates of various diseases, focusing primarily on cancer. That’s where they came up with all the crunchable numbers.
So how well do numbers like these crunch? That’s the fourth big problem: they don’t crunch very well. They’re more on the squishy side. In one of their many papers, here’s how the researchers evaluated the accuracy of their own food-intake data:
For the 26 nutrient constituents examined, estimated correlations with true intake (not energy-adjusted) ranged from 0.22 to 0.67 … When adjusted for reported energy intake, performance improved; estimated correlations with true intake ranged from 0.36 to 0.76.
So what does that statement mean? Here’s what a site that explains statistics in plain English has to say about correlation:
Correlations of less than 0.1 are as good as garbage. The correlation shown [in their example], 0.9, is very strong. Correlations have to be this good before you can talk about accurately predicting the Y value from the X value.
If you want to think of it visually, a correlation of 1.0 gives you a perfect trendline: if smoking absolutely, positively causes lung cancer and is absolutely, positively dose-dependent, then you could plot the number of cigarettes smoked per day against the incidence of lung cancer, and you’d get one of those lines that starts at zero in the lower left and zooms straight to the upper-right corner.
But for this study, the estimated correlation (after being adjusted upwards) is between 0.36 and 0.76. In other words, the investigators themselves estimate that the accuracy of their food survey is somewhere between lousy and decent. Well, decent might be stretching it. The same analysis of their own study included this statement:
However, previous biomarker-based studies suggest that, due to correlation of errors in FFQs and self-report reference instruments such as the 24HR, the correlations and attenuation factors observed in most calibration studies, including ours, tend to overestimate FFQ performance.
So the lousy-to-decent estimate might be overestimated. Kudos to them for saying as much. And yet from this data, they’re going to look for correlations between diets and diseases and write a slew of research papers on what they find.
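If you want to see what correlations like those actually buy you, here’s a little Python sketch I put together – not from the study, just an illustration – that generates data at a chosen correlation and shows how much of the variation in one variable is “explained” by the other:

```python
import math
import random

def correlated_pairs(r, n=100_000, seed=1):
    """Generate (x, y) pairs with Pearson correlation approximately r."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        x = rng.gauss(0, 1)
        # Standard trick: mixing x with independent noise in these
        # proportions yields corr(x, y) = r.
        y = r * x + math.sqrt(1 - r * r) * rng.gauss(0, 1)
        pairs.append((x, y))
    return pairs

def pearson(pairs):
    """Plain-vanilla Pearson correlation coefficient."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x, _ in pairs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for _, y in pairs) / n)
    return cov / (sx * sy)

for r in (0.36, 0.76):
    observed = pearson(correlated_pairs(r))
    # r squared is the share of one variable's variation "explained" by the other
    print(f"r = {observed:.2f}  ->  variance explained: {r * r:.0%}")
```

Even at the top of their adjusted range, r = 0.76 explains only about 58 percent of the variation; at the bottom, r = 0.36 explains about 13 percent. Squishy indeed.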
Which brings us to the fifth big problem: the associations you find when looking at data depend largely on the associations you seek. In a study like this, you gather a huge amount of data, then you ask the data some questions. How you ask the question affects the answer.
Some months ago, the researchers asked this data if there was an association between red meat and colon cancer, and wouldn’t you know it, the data answered “yes.” At least that’s the story that made the headlines. But the truth is, the question they asked went more like this: “Do people who eat a lot of steaks, hot dogs, hamburgers, sausage, pizza, cold cuts, bacon and deli sandwiches have a higher rate of colon cancer?”
Grouping all those foods together under the label “red meat” confounds the question – and it wasn’t necessary to confound the question. In the food survey, “steaks” is a separate item. If you really want to know if red meat causes cancer, why not simply ask, “Do people who eat a lot of steaks have a higher rate of colon cancer?” Maybe they did ask that question. Maybe they didn’t like the answer, so they asked it again and included pizza and hot dogs.
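You can watch the “ask the question enough ways and you’ll get a yes” effect in a toy simulation. Everything below is made up – 2,000 imaginary people, a disease that strikes 5 percent of them completely at random, and 200 arbitrary ways of grouping foods – yet some groupings will look “associated” with the disease anyway:

```python
import random

rng = random.Random(42)
n_people = 2000
n_groupings = 200   # arbitrary ways to bundle foods into a "red meat" group

# Disease strikes 5% of people completely at random -- diet plays no role at all.
disease = [rng.random() < 0.05 for _ in range(n_people)]

spurious = 0
for _ in range(n_groupings):
    # Randomly label each person a "high intake" eater of this made-up grouping.
    high = [rng.random() < 0.5 for _ in range(n_people)]
    n_high = sum(high)
    rate_high = sum(1 for h, d in zip(high, disease) if h and d) / n_high
    rate_low = sum(1 for h, d in zip(high, disease) if not h and d) / (n_people - n_high)
    # Declare an "association" if either group's rate is 30%+ above the other's.
    lo = max(min(rate_high, rate_low), 1e-9)
    if max(rate_high, rate_low) / lo > 1.3:
        spurious += 1

print(f"{spurious} of {n_groupings} random groupings look 'associated' with the disease")
```

That’s pure chance at work. Regroup the foods enough times and the data will eventually tell you whatever you want to hear.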
Here’s another strange grouping: the food survey lumped butter and margarine together as a single food item. I nearly jumped out of my skin when I read that one. Talk about confounding the data! Butter is natural. Margarine is a processed frankenfood. The only similarity is that people spread them on toast. You may as well lump cigarettes and carrot sticks together because they have the same shape.
Even when researchers ask well-designed questions, there’s the “don’t ask, don’t tell” problem: there may be associations lurking in the data that no one is looking for. When Ancel Keys cherry-picked six countries and went looking for an association between fat and heart disease, he found it. But the same overall data showed a much stronger association between sugar and heart disease … and an even stronger association between television ownership and heart disease.
Which brings us to the sixth problem: Associations are only useful for providing clues. They don’t identify the cause. There’s a strong association between obesity and type II diabetes. Does that mean being fat causes diabetes? Nope. It could mean diabetes makes you fat. Or, more likely, it could mean obesity and diabetes are both caused by excess insulin. You get the idea.
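The obesity-diabetes example is easy to simulate. In the toy model below (my crude illustration, not real physiology), a hidden factor drives both variables; obesity never appears in the formula for diabetes, nor vice versa, yet the two end up strongly correlated:

```python
import math
import random

rng = random.Random(7)
n = 50_000

# The hidden driver -- think "excess insulin" (my stand-in, purely illustrative).
hidden = [rng.gauss(0, 1) for _ in range(n)]
# Obesity and diabetes risk are each driven by the hidden factor, plus noise.
# Note that neither one appears in the other's formula.
obesity = [z + rng.gauss(0, 0.5) for z in hidden]
diabetes = [z + rng.gauss(0, 0.5) for z in hidden]

def pearson(xs, ys):
    """Plain-vanilla Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs) / n)
    sy = math.sqrt(sum((y - my) ** 2 for y in ys) / n)
    return cov / (sx * sy)

print(f"correlation between obesity and diabetes: {pearson(obesity, diabetes):.2f}")
```

A researcher staring at that correlation might conclude one causes the other – unless he thought to ask about the hidden factor.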
Considering that the “animal fat will kill you!” message has been around for more than 30 years, it’s highly likely that people who eat a lot of pizza, hot dogs, hamburgers, bacon, sausage and deli sandwiches are the “non-adhering” types Dr. Mike Eades wrote about a while back. (Or, as I call them, “people who don’t give a @#$%.”)
Those same people may also consume more sugar, more white flour, more high-fructose corn syrup, more cough syrup, etc. – which is not much of a stretch, when you consider that pizza, hot dogs, hamburgers and deli sandwiches are all served with a load of starch. But as far as I can tell, the NIHARP investigators aren’t asking questions about sugar and starch. So far, they seem interested in discovering that animal fat is dangerous, while fruits and vegetables will save your life.
The next time you see yet another paper from this study (the largest of its kind!) generate yet another round of alarmist headlines about the possible dangers of animal fats (and you will), keep this in mind about The NIH-AARP Diet and Health Study:
What we’re looking at is 1) a survey study with a low response rate that 2) required old people to accurately recall what they’d eaten in the past year (twice), which then provided data that is 3) almost certainly polluted by self-selection and confounding variables, and is 4) being analyzed by researchers who indicated from the beginning that their main concern is dietary fat, all for the purpose of 5) identifying associations, which don’t tell us very much anyway.
Other than that, it’s a fine piece of work. Now go fry up some bacon, and don’t worry about your pancreas. But try to avoid throwing the pan out the window.
(Hat tip to Mike Eades for Twittering the video. I nearly did a spit-take with my coffee.)
If you enjoy my posts, please consider a small donation to the Fat Head Kids GoFundMe campaign.