Archive for the “Media Misinformation” Category
Here’s part of what I wrote in a recent post titled Body Types and Brains:
I remember one of my roommates in college looking at the single spiral notebook I took to all my classes and saying, “That’s all the notes you take? How the heck are you getting A’s in everything? You hardly write anything down!”
“Uh, well,” I mumbled, “if the professor says something and it makes sense, I just remember it. I don’t really have to write much of it down.”
That particular roommate was a party animal. I partied right along with him, but only on Thursdays (dollar pitcher night), Fridays (quarter beer night) and Saturdays (student parties all over campus and near-campus).
I had another roommate for half of my senior year who was a studying machine. He not only took copious notes in class, he’d rewrite them all in neat penmanship later. The notes he took in class were “too messy,” you see. He attended the occasional party, but never drank much. We graduated with identical grade-point averages: just a fraction under a perfect 4.0/4.0, which means we both got a B in a class at some point. I remember him once asking me, “Why is it that you can party like [the first roommate] and then get the same grades I do?”
“Uh … I don’t know,” I said. “I channel both of you?”
Actually, I explained why I got those grades in the Body Types and Brains post:
I got those grades largely because I’m a “brain mesomorph,” so to speak. Brain mesomorphs can pick pretty much any method of studying and still do well, as long as they don’t do something to screw up that genetic gift – like, say, don’t study at all.
Well, I’ve changed my mind. I now believe I got (almost) straight A’s because I had the discipline to take a few notes in class, study a bit to master the material, turn in my papers on time, and limit my heavy beer-drinking to three nights per week. In fact, I think everyone could pull straight A’s in college if they were just willing to do the same.
To prove my theory, I’m going to re-enroll in college as a one-man experiment. This time around, I’ll drink copious amounts of beer six nights per week, skip the note-taking entirely, not bother studying, and turn in half-assed first drafts of my papers a week late. I suspect this will lead to no better than a C average, perhaps even worse.
If that’s the result, I’ll announce that I’ve proved my theory: anyone who doesn’t do extremely well in college simply isn’t willing to take a few notes, study a bit, and limit the partying to no more than three nights per week. Those B and C students have no one to blame but themselves.
Say what? You think my theory is bogus and my experiment is stupid?
Yes, of course they are. Academic achievement was easy for me, and screwing up on purpose to get average grades proves absolutely nothing about why other people get average grades.
As part of an extra-credit program in high school, I tutored another student who was struggling with freshman algebra. (I was a junior, which means I was taking trigonometry at the time.) This kid certainly put out the effort – more than I ever had to – but had a difficult time wrapping his brain around mathematical concepts. I felt sorry for him … because even at age 17, I had enough common sense not to blame people for being less than genetically gifted.
Unlike this nincompoop:
A woman who intentionally gained 50 pounds wants to demonstrate a point she believes about overweight people: They have only themselves to blame for being heavy.
“People have always said to me, all of my life, ‘You’re lucky to be skinny,’ and what I wanted to prove was that there are no excuses for being overweight,” British reality star Katie Hopkins told TODAY.
Ahh, I see. You’ve always been skinny, so of course you know all about what causes obesity. Are you by any chance related to MeMe Roth? Your “before” picture suggests as much:
Hmmm, maybe you should get in touch with Heath Squier of Julian Bakery and ask him how to puff out your belly to look a teensy bit fat, then claim you were 35 pounds heavier. Anyway …
Known across the pond for her acerbic, outspoken comments, Hopkins created a Twitter frenzy when she declared on a British talk show: “I don’t believe you can be fat and happy. I think that’s just a cop out.”
Critics immediately accused Hopkins of “fat shaming” and failing to understand the psychological, as well as physical, factors behind weight gain.
Hopkins then fought back against those who called her ignorant and wrong by eating. A lot. She consumed 6,500 calories every day by stuffing herself with calorie-rich burgers, fries, pasta and cupcakes, recording everything in a food journal. At times, she brought herself to tears because of how much she ate.
“I didn’t cry at childbirth. I didn’t cry at my wedding, but I cried over this because I was just so disgusting,” she said.
So to gain weight, you had to stuff yourself with 6,500 calories per day and eat until you were disgusted and in tears – in other words, waaaaay beyond what your appetite would dictate – just like all fat people do. Geez, and to think some critics actually doubted you understand the physical factors behind weight gain.
Hopkins admits the next step of her experiment has proved to be much more difficult. She’s committed to losing the 50 pounds she gained within three months. She has drastically changed her diet and upped her exercise level, all to prove that being thin is as simple as eating less and moving more.
So it’s a simple matter of eating less and moving more! Well, hell, why didn’t anyone ever tell me that during all those years I was making myself ravenous on low-calorie, low-fat diets and spending hours and hours on a treadmill? Clearly I didn’t try hard enough.
“I’ve learned a lot about how it feels to be big, how difficult it is to be big, how horrible it is to have fat sitting on the top of your thighs, and how much more challenging it is just to do everyday life when you’re bigger,” she said.
Hopkins said she still has 35 pounds left to lose in the next two months.
And I bet she’ll do it – because she’s been skinny her whole life and that’s the shape her body will want to resume. (Simple math says she already lost 15 pounds in the first month; i.e., nearly four pounds per week.) To quote again from my Body Types and Brains post:
Mesomorphs look well built without setting foot in a gym … Yup. I’ve known people like that. In order to stay lean and muscular, all they really have to do is not screw up.
So this naturally-thin bubblehead screwed up on purpose by jamming 6,500 calories per day of junk food down her throat, thus overwhelming her body’s resistance to gaining weight, and by gosh, she got fat. So that means anyone who’s fat must be screwing up just like she did. Uh-huh … and if I go back to college and party away all my evenings instead of studying and then wind up with average grades as a result, that means anyone who gets C’s in college is a screw-up who parties too much. Same (ahem) logic.
Ms. Hopkins, you were born on the metabolic finish line and think you won a race. Not only that, you think you’re an expert on how the race is won – because you tied your ankles together and proved how difficult it is to run a race in that condition.
What you actually proved is that you’re a flippin’ moron.
Whoops … there I go, making judgments about someone born with a low I.Q.
Sorry about that.
As you’ve probably heard by now, Tom Hanks recently announced that he has type 2 diabetes. Since Hanks has gained and lost weight for various movie roles, doctors quoted in media articles of course blamed his diabetes on “extreme weight fluctuations.” I’m reasonably sure the doctors weren’t suggesting Hanks developed diabetes by becoming a skinny guy for his roles in Philadelphia and Castaway, so what they’re saying is that he developed diabetes by getting fatter for other roles.
Maybe, maybe not. I vote not. His weight may have fluctuated for various movie roles, but I’ve never seen him on screen and thought, “Wow, Tom Hanks is really getting fat.”
According to what I could find online, Hanks beefed up to 225 pounds for his role as a former baseball player in A League of Their Own. That may sound like a lot of weight to carry around, but take a look at a picture of him from that film:
Sure, he’s got some thickness around the belly there, but that’s all it is: some thickness around the belly. We’re not looking at what I’d call an obese guy in that picture. I wouldn’t even call him a fat guy. He looks like what he was portraying in that film: an ex-jock who’s gotten a bit soft. Despite coming across on camera as a guy with a long-and-lean body type, Tom Hanks has more muscle on him than you might think. Here he is again in Castaway, when his character first landed on the island:
Once again, he’s got a belly … but look at the thickness of his arms and legs. His calves are nearly the size of my thighs. There’s a lot of weight in those legs. When I saw that film in a theater and there was a scene of him dancing around to celebrate making fire, I remember being impressed with the size of his leg muscles.
If you’ve seen Castaway (good movie), you know that partway through the film, we suddenly jump ahead in time and are shocked to see a totally ripped Tom Hanks – now as a guy who’s been barely surviving on fish and coconuts for years:
If you’d asked me at the time to guess how much he weighed while shooting that section of the film, I would have said 150 pounds. But Hanks weighed 170 pounds in those scenes. He’s six feet tall, so his BMI at that point was just over 23 – in the middle of the “normal” range, despite having almost no body fat. So once again, we’re talking about a guy who is heavier than he looks, thanks to surprisingly thick muscles.
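For the curious, those BMI figures are easy to check. Here’s a quick sanity check in Python, using the standard US-unit formula (BMI = 703 × weight in pounds ÷ height in inches squared); the function name is just mine for illustration:

```python
# Sanity check of the BMI figures mentioned above, using the
# standard US-unit formula: BMI = 703 * weight_lb / height_in ** 2.
def bmi(weight_lb: float, height_in: float) -> float:
    return 703 * weight_lb / height_in ** 2

print(round(bmi(170, 72), 1))  # ripped Castaway weight at 6 feet -> 23.1
print(round(bmi(225, 72), 1))  # A League of Their Own weight -> 30.5
```

Run the same math on his A League of Their Own weight and you get a BMI just over 30, which the charts officially label “obese,” even though the picture above shows a guy who’s merely a bit soft. So much for BMI.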
I don’t think temporarily weighing 225 pounds for a film shot in 2000 is the reason he has diabetes in 2013. According to an article on CNN, here’s what his doctor told him:
“I went to the doctor and he said, ‘You know those high blood sugar numbers you’ve been dealing with since you were 36? Well, you’ve graduated,’ ” Hanks told Letterman. ” ‘You’ve got Type 2 diabetes, young man.’ “
He’s had high blood sugar since age 36? Hanks is 57 now. Around age 36, he was shooting Sleepless in Seattle, Philadelphia and Forrest Gump. Those were all after A League of Their Own, and he was a lean guy in all of them – positively skinny, in fact, for his role as a lawyer with AIDS in Philadelphia.
So clearly it’s possible to be quite lean and still have chronically high blood sugar, which is what leads to type 2 diabetes. Here’s another quote from the CNN article:
The “Forrest Gump” star said his condition is manageable through diet, and Letterman said that he too has high blood sugar.
So David Letterman also has high blood sugar, which would classify him as pre-diabetic. Here’s a picture of him from about a year ago:
Does that look like a fat guy to you? Has David Letterman gone through “extreme weight fluctuations” as part of his career? I don’t think so. That’s why I’m opposed to the CDC, the USDA, the AMA and pretty much all the other health “experts” obsessing over how much people weigh and constantly warning us about the obesity epidemic. The real epidemic is the number of people walking around with chronically high blood sugar.
I’m happy to report that at least one major media outlet didn’t jump on the “he got diabetes because he was fatter in two movie roles” bandwagon. Here’s a quote from the U.K. Telegraph:
But the link with diabetes isn’t as clear as Hanks, and the doctors who have been wheeled out stateside to support his theory, have made out.
While some studies have suggested that yo-yo dieting might destabilise metabolism and lead to chronic weight gain, with increased risk of heart disease or diabetes, others have been inconclusive. Studies in animals have shown that yo-yo dieting is far better for the body than remaining obese.
Even the link between obesity and Type 2 diabetes isn’t as clear cut as many make out. There are many of relatively normal weight who go on to develop the disease, suggesting that in some cases it can be just “one of those things”.
From his description, it sounds as though Tom Hanks had impaired glucose tolerance (pre-diabetes) for years.
Including long stretches during which he was a lean guy.
I’m glad to hear Hanks plans to manage his diabetes through diet – but I hope it’s not the diet recommended by the American Diabetes Association. If it is, someday we’ll be reading more articles about Tom Hanks suffering from the effects of diabetes. I’d much rather read reviews of many more masterful acting performances yet to come.
Thank goodness for Health.com. According to an article I read online today, some of my dietary habits are draining me of energy. Let’s take a look:
Who doesn’t wish for more energy at least a few dozen times a day?
I don’t. (I hope I’m not alone in that regard. If most people are wishing they had more energy two to three times every hour they’re awake, those zombie movies aren’t as far-fetched as I thought.)
Of course, you know that a good night’s sleep, regular exercise, and effective stress management can give you a much-needed boost. But to further figure out why you’re slumping, you need to pinpoint the energy-sucks in your diet. (Hint: Those low-carb meals aren’t doing you any favors.)
Dangit! And here I thought my energy level was pretty high for a guy coming up on his 55th birthday. During the daylight hours last weekend, I spent my time sawing logs, tossing the sawed logs aside to saw more logs, and weed-whacking my way through some briar. After the sun went down, I programmed some updates to a software package I sell to law firms. Oh, and I also played 72 holes of disc golf while taking work breaks from the logs. Now that I know I did all that in an energy-depleted state, I feel kind of foolish.
Anyway, here are the energy-draining mistakes Health.com says I may be making:
You go long stretches without eating
Guilty as charged.
Food Fix: Snack early, snack often
Every time you go more than two hours or so without eating, your blood sugar drops — and that’s bad news for your energy.
Hmmm … as I write, it’s been six hours since my last meal. So out of curiosity, I pulled the glucose meter out of my desk drawer and checked my blood sugar. It’s 90 mg/dl. I’m pretty sure that’s not considered low. Once or twice per week, I do a 24-hour intermittent fast – dinner one day to dinner the next. I’ve checked my glucose at the 23-hour mark. It’s always in the 80-90 mg/dl range. So I’m thinking if your blood sugar drops to the point where you feel drained just two hours after a meal, it may have something to do with what you eat.
Food supplies the body with glucose, a type of sugar carried in the bloodstream. Our cells use glucose to make the body’s prime energy transporter, adenosine triphosphate (ATP). Your brain needs it. Your muscles need it. Every cell in your body needs it.
Time to dig out the books on metabolism again. I was under the impression most of the cells in our bodies can also burn fatty acids or ketones for fuel.
But when blood sugar drops, your cells don’t have the raw materials to make ATP. And then? Everything starts to slow down. You get tired, hungry, irritable and unfocused.
Tired, hungry, irritable, unfocused … yes, I remember that feeling. I experienced it rather often when I was on a low-fat diet and depended on regular infusions of carbohydrates to keep my blood sugar up. Back in those days, I would have been a sucker for advice such as:
Grab a bite every two to four hours to keep blood sugar steady.
I had to take a couple of business calls and answer some emails while writing, so now it’s going on seven hours since my last meal. According to Health.com, that means I’m at least three hours overdue for a snack. I’d better check my blood sugar again. Hang on a second …
… Uh-oh. My glucose has plummeted to 89. Anyway, on to the next mistake and fix.
Your breakfast is too “white bread”
Energy, thine enemy is a sugary breakfast: pancakes, white toast, muffins and the like. Instead, start your day with soluble fiber (found in oatmeal, barley and nuts).
“It dissolves in the intestinal tract and creates a filter that slows the absorption of sugars and fats,” explains Dr. David Katz, founder of the Yale Prevention Research Center and author of “Disease Proof.”
In fact, research shows that choosing a breakfast with either soluble fiber or insoluble fiber — the kind in whole-grain breads and waffles — actually protects against blood sugar spikes and crashes later in the day.
Well, there’s my problem. I don’t eat whole-grain breads or waffles for breakfast. If I eat breakfast at all, it’s eggs and some kind of meat. But I often skip breakfast because I’m just not hungry. Part of the reason I’m not hungry is that my glucose is always in the 80-90 range when I wake up. Since Health.com has informed me that going without eating for more than four hours will cause low glucose, I’m considering setting up the video camera in our kitchen so I can catch myself raiding the refrigerator while sleep-walking.
A smart start: cereals with at least 5 grams of fiber a serving and whole-grain breads with 2g per slice.
Yeah, start your day with cereal or bread. Then grab a snack within the next two to four hours, because your blood sugar will be dropping. I wonder if there’s a connection?
The next two mistakes the article lists are eating the wrong kinds of vegetables and avoiding red meat entirely. No complaints there. But here’s the final mistake and suggested fix:
You’ve cut one too many carbs
Food Fix: Hello, whole-wheat pasta and potatoes!
Carbs help your body burn fat without depleting muscle stores for energy.
So if you keep raising your glucose every two to four hours so every cell in your body can burn glucose for energy without even tapping your glycogen stores, your body ends up burning fat. Makes sense.
The ideal diet is 50 to 55% complex carbohydrates, 20 to 25% protein and 25% fat.
In a Tufts University study, women on a carbs-restricted diet did worse on memory-based tasks compared with women who cut calories but not carbs. And when the low-carb group introduced them back into their diet, their cognitive skills leveled out.
I see. So here’s the advice in a nutshell:
1. Start your day with cereal, bread or waffles.
2. When your blood sugar plummets two hours or so after eating the cereal, bread or waffles, have a snack to raise your blood sugar.
3. When your blood sugar drops two hours or so after eating the snack that raised your blood sugar, have another snack to raise your blood sugar. By constantly raising your blood sugar to make sure you burn glucose for fuel, you end up burning fat.
4. Repeat steps 2 and 3 until your next meal — which should be 50-55% carbohydrates to make sure your body produces enough blood sugar.
5. When your blood sugar drops two hours or so after eating a meal that’s 50-55% carbohydrate (to make sure you produce enough blood sugar), have a snack to raise your blood sugar.
6. Research at Tufts University shows that after conditioning yourself to require a carbohydrate snack every two hours or so to keep your blood sugar from plummeting, cutting back on carbohydrates will cause your blood sugar to plummet — which means you’ll do worse on memory-based tasks. So don’t cut back on the carbohydrates.
7. If you accidentally forget to eat carbohydrates every two hours or so and your blood sugar plummets and causes you to do worse on memory-based tasks, eat more carbohydrates to raise your blood sugar and level out your cognitive abilities. But don’t forget to have a snack two hours later to raise your blood sugar after it starts dropping, or you’ll become stupid again.
That advice makes no sense to me. But that’s probably because it’s now been seven hours since my last meal, and my glucose has plummeted to 89 mg/dl.
p.s. – After I wrote this post, we had dinner: a chef salad with lettuce, onions, eggs, cauliflower, bacon, bits of cheddar cheese, tomatoes from the garden, Italian sausage chunks and a bacon grease/white wine vinegar dressing. My glucose an hour later is 105 mg/dl. I don’t expect to need a snack two hours from now.
It’s not exactly diet-related, but how’s this for a classic case of confusing correlation with causation? An article on the NBC News site reported on a study of what people were drinking before ending up in an emergency room:
Many people who end their Friday or Saturday nights in a hospital emergency room have been drinking alcohol. In fact, public health experts estimate that about one-third of all injury-related ER visits involved alcohol consumption.
I consider that good news. It means if you avoid getting @#$%-faced, you’re less likely to end up in an emergency room. Better choices, better results.
But what, exactly, are people drinking? What types of alcohol and even what brands? Is there a direct link between advertising and marketing and later injury?
I’m already convinced there’s a direct link between advertising and marketing and later injury. I can’t tell you how many drunk people I’ve seen collide with billboards. Good thing most of them were walking.
Until now, those questions have been unanswerable, frustrating alcohol epidemiology researchers.
Sounds to me as if those alcohol epidemiology researchers are easily frustrated.
“Honey, what’s wrong? Why are you slamming the drawers in your file cabinet so hard?”
“Because, dangit, I can’t determine if there’s a direct link between alcohol advertising and later injury! It’s driving me nuts! Make me a martini, will you?”
But if results of a pilot study conducted by researchers from Johns Hopkins Bloomberg School of Public Health hold up, there may soon be a way to connect the dots.
Whenever media health reporters write about connecting the dots, I brace myself for a head-bang-on-desk moment. You may want to get out the desk pad before we continue.
When the Hopkins researchers surveyed ER patients who’d been drinking, they found that Budweiser was the number one brand consumed, followed by Steel Reserve Malt Liquor, Colt 45 malt liquor, Bud Ice (another malt liquor), Bud Light, and a discount-priced vodka called Barton’s.
Wait a minute … they went to an emergency room and surveyed drunk people who had injured themselves? I’m surprised they didn’t report the number one brand of alcohol consumed by injured drunks is called @#$% Off!
Though Budweiser has 9.1 percent of the national beer market, it represented 15 percent of the E.R. “market.” The disparity was even more pronounced for Steel Reserve. It has only .8 percent of the market nationally, but accounted for 14.7 percent of the E.R. market. In all, Steel Reserve, Colt 45, Bud Ice, and another malt liquor, King Cobra, account for only 2.4 percent of the U.S. beer market, but accounted for 46 percent of the beer consumed by E.R. patients.
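To put those disparities in perspective, here’s a quick sketch in Python of each brand’s over-representation ratio — its share of the E.R. “market” divided by its share of the national market — using only the percentages quoted above (the function name is mine, for illustration):

```python
# Over-representation ratio: ER share divided by national market share,
# computed from the percentages quoted in the article above.
def overrep_ratio(er_share_pct: float, national_share_pct: float) -> float:
    return er_share_pct / national_share_pct

print(round(overrep_ratio(15.0, 9.1), 1))   # Budweiser: about 1.6x
print(round(overrep_ratio(14.7, 0.8), 1))   # Steel Reserve: about 18.4x
```

Steel Reserve shows up roughly eighteen times as often among injured drunks as its market share would predict — which tells you a lot about what cheap, high-alcohol brands are bought for, and nothing at all about advertising.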
Before we continue, I feel obligated to remind you I suggested getting out the desk pad. This is your last warning.
“Some products are marketed to certain groups of people in our society,” explained Traci Toomey, the director of the University of Minnesota’s alcohol epidemiology program, who was not involved in the study. Higher-alcohol malt liquor, for example, is heavily advertised in African-American neighborhoods. “So we might want to put some controls on certain products if we find they are tied to greater risk.”
Head. Bang. On. Desk.
We might want to put controls on certain products if they’re tied to higher risk? As if that will mean fewer drunk-person injuries? Genius. Pure genius.
I don’t doubt that Budweiser, Colt 45 and Steel Reserve are tied to greater risk of ending up in the emergency room in poor communities. But it’s not because of the marketing or the higher alcohol content. The reporter (and perhaps the researchers) apparently thinks it works like this:
1. Evil distributors of high-alcohol malt liquors decide to target poor communities with irresistible advertising and marketing campaigns.
2. Swayed by the irresistible marketing, poor people buy malt liquor.
3. Because the malt liquor has a higher alcohol content, poor people accidentally get @#$%-faced.
4. After accidentally getting @#$%-faced, the poor injure themselves because they’re @#$%-faced.
Boy, if only we had some controls on those products. Take away the cheap malt liquor, those people would stay home and play pinochle … perhaps while sipping a fine white wine with a subtle hint of citrus and a color reminiscent of an autumn sunrise.
Now here’s how it actually works:
1. Poor people decide to get @#$%-faced.
2. Wanting to spend as little of their limited funds as possible to get @#$%-faced, poor people choose cheap beer, cheap malt liquor and cheap vodka, thus getting more bang for their buck.
3. Recognizing that the biggest market for cheap alcohol is in poor neighborhoods, distributors advertise in those neighborhoods, hoping to sway people who have already decided to get @#$%-faced to drink their particular brand when getting @#$%-faced.
Now here’s how it will work if we put some controls on those products:
1. Poor people decide to get @#$%-faced.
2. Thanks to controls instituted by do-gooders, the cheaper alcohols are no longer available.
3. Poor people buy just as much alcohol and get just as @#$%-faced as before, but have less money to spend on things like food, clothes, shoes, gas, entertainment, etc.
I don’t drink beer very often, but when I do, it’s usually Guinness Extra Stout. (Did I sound like the guy in those Dos Equis commercials just now?) The alcohol content (7.5%) is higher than the alcohol content in Colt 45 malt liquor (6%). So why isn’t Guinness Extra Stout tied to more emergency-room visits in urban hospitals? I’m sure you can guess: The stuff isn’t cheap, so it’s not a big seller in poor communities. If Guinness were as cheap as Colt 45, we’d see more poor people getting @#$%-faced on Guinness.
According to the article, the study was conducted at a hospital in Baltimore in a poor, mostly-black neighborhood. The results were predictable and ultimately meaningless. It would have been more interesting if the researchers had gone to an emergency room in Beverly Hills or Martha’s Vineyard and asked injured people what they were drinking. Then the headline would have been something like Martinis, Single-Malt Scotch and White Wine With a Subtle Hint of Citrus Most Popular Among E.R. Injured.
Then we’d need some controls on those products.
“If you don’t read the newspaper, you’re uninformed. If you read the newspaper, you’re misinformed.” – Mark Twain
This post will be about a magazine, not a newspaper, but close enough.
While digging through my research files over the weekend, I stumbled across a handful of diet and health articles from TIME magazine online. If you’ve ever wondered why people are so confused about diets and calories and weight gain, just poke through old issues of TIME. Despite all appearing in the same publication, the articles contradict each other. Perhaps that’s TIME magazine’s version of objective reporting: don’t just quote both sides in a debate — argue both sides yourself.
Let’s start with a 2012 article titled It’s the Calories, Stupid. That article was about a study conducted by George Bray, who concluded that macronutrients make no difference for weight loss … even though the carbohydrate content was the same in the diets he tested. I wrote about that study in this post, but here’s what TIME magazine had to say:
When it comes to weight gain, it’s all about the calories.
That might seem obvious, but popular diets continue to suggest that lowering or increasing certain dietary components — carbs or protein, say — is the key to weight loss. A clever new study by researchers at Pennington Biomedical Research Center in Baton Rouge shows, however, that it’s not what you eat but how much that matters when it comes to body weight.
Okay, so it’s all about calories, period. Calories in, calories out. Got it. Thanks, TIME magazine.
I found that article so enlightening, I moved on to another titled For Weight Loss Success, Think About When, Not Just What, You Eat:
Timing is everything for losing weight.
Say what? I’m pretty sure you just informed me it’s all about the calories, stupid. George Bray said so.
In a study published in the International Journal of Obesity, the scientists monitored 420 overweight participants on a 20-week weight loss program in Spain. The volunteers were split into two groups: early-eaters and late-eaters. Since lunch is considered the largest meal in Spain–about 40% of the day’s calories are consumed in the mid-day meal–half the participants ate lunch before 3 p.m. while the remainder ate lunch after 3 p.m.
The late-eaters lost less weight overall, and shed pounds at a slower rate than those eating earlier. Those eating lunch later were more likely to skip breakfast or eat fewer calories, while the timing of breakfast and dinner didn’t influence weight loss effectiveness for either group.
Let me get this straight: the late-lunch eaters skipped breakfast and ate fewer calories, but they lost less weight. So now you’re telling me it’s all about the calories, stupid, unless you consume your calories later in the day. Then you won’t lose as much weight despite eating less. Got it. Thanks, TIME magazine.
I guess at the very least, we can agree that if Americans would just consume fewer calories (preferably at an early lunch), the obesity rate would decline. To confirm that, I read the article titled Americans Are Eating Fewer Calories, So Why Are We Still Obese?
The good news: we’re eating fewer calories. The bad news: that’s not translating into lower obesity rates.
Two federal studies on the amount of calories Americans eat show that we are eating less than we did about a decade ago, and that we’re also limiting the amount of fast food we consume.
But if Americans are eating less fast food overall, why are obesity rates still so high?
Uh … maybe it’s because we’re eating too many of our fewer calories during a late lunch?
As encouraging as the calorie data are, the decreases aren’t significant enough to make a dent in the upward trend of obesity.
I bet you’re about to say it’s because we don’t exercise enough.
Refining that message may require delving deeper in what Americans are eating, and addressing the balance between the amount of calories that we eat and the amount we burn off daily through physical activity.
Pardon the interruption while I gripe about word choice. If you measure something, you quantify it as an amount. If you count something, you quantify it as a number. We don’t measure calories; we count them. It drives me batty when reporters write about the amount of calories we consume instead of the number of calories we consume. Anyway …
And while eating less is a good way to start addressing the obesity epidemic, it may be that slimming the national waistline means we also have to boost the amount of exercise we get every day.
So we’re eating less but not getting any slimmer, and that probably means we don’t exercise as much as we used to. To confirm that suspicion, I checked an article titled You Can Run But You Can’t Hide:
More Americans are exercising more often, but so far, we’re really not losing much weight. According to one researcher: “To tackle obesity, we need to do this. But we probably also need to do more … Just counting on physical activity is not going to be the solution.”
Okay, uh, so … it’s all about the calories, stupid, but when you consume those calories makes a big difference, and we obviously need to consume fewer calories to lose weight, even though Americans are consuming fewer calories without making a dent in obesity rates, but that’s because we don’t exercise enough, even though we’re exercising more than we used to without making a dent in obesity rates …
Okay, got it. Thanks, TIME magazine.
You’ve got to give the anti-meat hysterics credit for their creativity. Since they can’t prove directly that eating meat will kill you, they’ve become quite adept at stringing unrelated results together into what (almost) looks like a chain of causality.
As I explained in my Big Fat Fiasco speech, this technique is referred to as teleoanalysis. In a nutshell, it works like this: we can’t prove that A causes C, but if we can find evidence that A is linked to B and B is linked to C, we’ll go ahead and declare that A causes C.
Teleoanalysis is partly what has kept the Lipid Hypothesis alive. Studies have failed over and over to prove that a high-fat diet causes heart disease – and in fact, low-fat diets have failed to reduce heart disease in clinical trials over and over. So the anti-fat hysterics trotted out a version of teleoanalysis that looks like this:
- High-fat diets (A) raise cholesterol (B)
- Raised cholesterol (B) is associated with heart disease (C)
- Therefore, a high-fat diet must cause heart disease
If this sounds logical to you, consider my own favorite version of teleoanalysis:
- Drinking lots of water (A) causes frequent urination (B)
- Frequent urination (B) is associated with diabetes (C)
- Therefore, drinking lots of water causes diabetes
With that in mind, let’s take a look at yet another Meat Kills! study that’s making a splash in the media. Here are some quotes from a BBC article online:
A chemical found in red meat helps explain why eating too much steak, mince and bacon is bad for the heart, say US scientists.
A study in the journal Nature Medicine showed that carnitine in red meat was broken down by bacteria in the gut.
This kicked off a chain of events that resulted in higher levels of cholesterol and an increased risk of heart disease.
Can you spot the teleoanalysis? Here it is:
- Red meat (A) contains carnitine, which when digested kicks off a chain of events leading to higher cholesterol (B)
- Higher cholesterol (B) is associated with heart disease (C)
- Therefore, red meat causes heart disease
Here’s the abstract for the study referenced in the BBC article:
Intestinal microbiota metabolism of choline and phosphatidylcholine produces trimethylamine (TMA), which is further metabolized to a proatherogenic species, trimethylamine-N-oxide (TMAO). We demonstrate here that metabolism by intestinal microbiota of dietary l-carnitine, a trimethylamine abundant in red meat, also produces TMAO and accelerates atherosclerosis in mice. Omnivorous human subjects produced more TMAO than did vegans or vegetarians following ingestion of l-carnitine through a microbiota-dependent mechanism. The presence of specific bacterial taxa in human feces was associated with both plasma TMAO concentration and dietary status. Plasma l-carnitine levels in subjects undergoing cardiac evaluation (n = 2,595) predicted increased risks for both prevalent cardiovascular disease (CVD) and incident major adverse cardiac events (myocardial infarction, stroke or death), but only among subjects with concurrently high TMAO levels. Chronic dietary l-carnitine supplementation in mice altered cecal microbial composition, markedly enhanced synthesis of TMA and TMAO, and increased atherosclerosis, but this did not occur if intestinal microbiota was concurrently suppressed. In mice with an intact intestinal microbiota, dietary supplementation with TMAO or either carnitine or choline reduced in vivo reverse cholesterol transport. Intestinal microbiota may thus contribute to the well-established link between high levels of red meat consumption and CVD risk.
Allow me to interpret that gobbledygook:
Humans who eat meat have more carnitine-eating bacteria in their guts and therefore produce more TMAO than vegetarians. TMAO is associated with heart disease. If we pump mice full of carnitine, they also produce lots of TMAO and get heart disease. So humans should cut back on meat.
More teleoanalysis. It’s just another version of this argument, which helped to establish the Lipid Hypothesis: lard raises cholesterol, and rabbits get both high cholesterol and heart disease if they’re force-fed lard, so humans shouldn’t eat lard.
The only problem is that lard consumption was plummeting while heart-disease rates were skyrocketing.
The abstract also mentions the “well-established link” between meat consumption and heart disease. Since vegetarians are often more health-conscious in general and therefore less likely to consume sodas, donuts, candy and other junk, I’d expect them to have lower rates of heart disease than meat-eaters who consume the standard western (crap-filled) diet. But is that association consistent?
As I mentioned in another post about yet another Meat Kills! study, here’s a quote from a study titled Mortality In British Vegetarians:
The mortality of both the vegetarians and the nonvegetarians in this study is low compared with national rates. Within the study, mortality from circulatory diseases and all causes is not significantly different between vegetarians and meat eaters.
And here’s the conclusion from a study titled Dietary protein and risk of ischemic heart disease in women:
Our data do not support the hypothesis that a high protein intake increases the risk of ischemic heart disease. In contrast, our findings suggest that replacing carbohydrates with protein may be associated with a lower risk of ischemic heart disease.
In that study, the women who consumed the most protein ate 16.1% more red meat than women who consumed the least protein, but had lower rates of heart disease.
No consistency, no validity.
Enjoy your steak.