Archive for the “Real Food” Category
Pardon the delay in posting and responding to comments. I was on Dauphin Island off the coast of Alabama for a wedding last week. I couldn’t ask The Older Brother to sit in, since it was his Middle Son getting married.
Anyway … in my last post, I commented briefly on a video presentation of a study that, in some people’s minds, nailed the coffin-lid shut on the Carb-Insulin hypothesis. I replied that I don’t believe the hypothesis is dead, but needs some revising. Based on personal experience, lots of reading and listening to podcasts, conversations with other people and so forth, I’ve been slowly revising it in my own head for years. So let me reach up there between my ears and pluck out some thoughts, then see if I can work them into a coherent post.
More Carbohydrates => Higher Insulin => Fat Storage
That’s the Carbohydrate-Insulin hypothesis in a nutshell. The more carbohydrates you eat, the more insulin you produce, and the more insulin you produce, the fatter you become. Or to express it in reverse for those trying to lose weight: the fewer carbohydrates you eat, the less insulin you produce, and the less insulin you produce, the leaner you become.
Simple is certainly appealing. But I happen to know the linear equation of more carbs = more body fat isn’t true in my case.
But wait … didn’t you finally lose weight after going low-carb?!
Why, yes, I did. And it was easy. Unlike when I tried low-fat/low-calorie diets based on cereals, pasta, bread and rice, I dropped the pounds fairly quickly and wasn’t hungry. Like a lot of people, I figured if low is good, lower is better. So I stayed on a very-low-carb diet for a long time.
But after reading The Perfect Health Diet, I put real-food starches like potatoes and squashes back into my diet. After listening to Jimmy Moore’s podcast with the guys who designed the Carb Nite protocol, I started enjoying a high-carb Saturday night (but with a Mexican dinner, not donuts). After reading about the gut microbiome, I started eating tiger nuts for the fiber and resistant starch. After reading a book called Natural Hormone Enhancement, I decided to mix things up even more. Some days my diet resembles The Perfect Health Diet. Some days it resembles an Atkins induction diet, all meats and eggs and green vegetables. Some days I skip breakfast. Some days I fast until dinner. Saturday is still the high-carb night.
I average more carbs per day now than I did a few years ago, but haven’t gotten any fatter. So more carbs = more body fat clearly isn’t true for me, at least not as a linear relationship.
Does that mean insulin doesn’t drive fat accumulation? Nope, not at all. I don’t think we’ve seen the final word on the research, but let’s just say I’ll be stunned if it turns out insulin has little to do with gaining weight.
Insulin inhibits lipolysis — the breakdown and release of fat from fat cells. Any book on metabolism will tell you so. That’s one of insulin’s many jobs, and it’s a crucial one. When you eat a meal that raises your blood sugar, insulin brings the blood sugar down partly by storing fat and keeping it stored. That way your cells burn the glucose first.
Take a look at this graph from a study by Dr. Jeff Volek. It shows the relationship between the concentration of insulin in our blood and the ability to release fat.
Here’s a quote from text accompanying the graph in the Volek paper:
Adipose tissue lipolysis is exquisitely sensitive to changes in insulin within the physiological range of concentrations. Small to moderate decreases in insulin can increase lipolysis several-fold, the response being virtually immediate. Insulin also stimulates lipogenesis [creating new body fat] by increasing glucose uptake and activating lipogenic and glycolytic enzymes. Small reductions in insulin levels, such as that easily achieved with dietary carbohydrate restriction, remove the normal inhibition on fat breakdown.
I’ve seen several studies in which giving diabetics higher concentrations of insulin made them fatter. In a study of the effects of obesity on rats, the researchers stated matter-of-factly that they made the rats obese by pumping them full of insulin. When they stopped pumping the rats full of insulin, the rats returned to their normal weights. So yes, high insulin levels encourage fat accumulation and inhibit fat breakdown. And yes, your body releases insulin when you eat carbs.
But it’s not the temporary spike in insulin after a meal that makes you fat. That’s when insulin is doing exactly what it’s supposed to do: partitioning nutrients, shuttling glucose into cells, storing fat so glucose is burned first when glucose is high, etc. Later, insulin is supposed to drop and allow fat to flow from the fat cells. Lower insulin also allows glycogen to be converted to glucose. It’s all about keeping glucose within a safe range.
In a lecture I watched online, a biochemist described insulin rising as the building/storing phase and insulin dropping as the burning/using phase. Both phases are necessary for good health. The problem is that for metabolically damaged people, insulin stays high when it ought to be low. They spend too much time in the building/storing phase, and not enough time in the burning/using phase.
During his presentation on hyperinsulinemia on the cruise, Dr. Ted Naiman showed a chart of the insulin responses of normal vs. obese/insulin-resistant people to the same meal. The obese people not only had a much higher initial insulin spike, their insulin levels stayed higher for several hours. Take another look at Dr. Volek’s graph. It doesn’t take much extra insulin to inhibit lipolysis rather dramatically.
But those are metabolically damaged people. (We’ll get to what I believe causes the damage shortly.) For metabolically healthy people, a high-carb meal will certainly raise insulin temporarily — as it should — but that doesn’t necessarily mean insulin will stay high. When I first started hearing from paleo types that tubers have been part of the human diet for eons and are perfectly fine foods, they usually pointed to the Kitavans – native people who live on a high-carb diet (mostly sweet potatoes), but aren’t fat or diabetic.
So I looked up some articles and a study of the Kitavans. Yup, they eat a lot of sweet potatoes and they’re not fat or diabetic. But here’s the interesting part: their average insulin level is 24 pmol/L. If you check Volek’s chart, you’ll see that’s down in the range where fat breakdown occurs. (By contrast, one study puts the average insulin level for American adults at around 60 pmol/L.) So for the Kitavans at least, a high-carb diet of whole unprocessed foods doesn’t lead to high insulin levels throughout the day. In other words, they don’t become insulin resistant. I’m sure we could find plenty of other paleo people who ate natural starches without becoming fat and diabetic. Quite a few Native Americans, for example, grew squashes and beans.
No doubt the potatoes and other starches I eat now temporarily spike my insulin. So why haven’t I gotten any fatter? Well, I don’t have any way of checking my fasting insulin level at home, but I’d wager a large sum it’s no higher now than it was a few years ago, when I rarely ate starch. I’d also wager a large sum that when I was living on low-fat cereals, low-fat pasta, whole-wheat bread with margarine and other vegetarian delights, my fasting insulin was much higher.
So the first revision of the “alternative hypothesis” I carried around in my head looked something like this:
Damaging Diet => Chronically High Insulin (Insulin Resistance) => Fat Storage.
What is or isn’t a damaging diet certainly varies among individuals. Back in this post, I recounted a section from Denise Minger’s excellent book Death By Food Pyramid in which she wrote about the huge variations in how much amylase we produce. People who produce little amylase experience much more dramatic blood-sugar surges when they consume starch than people who produce a lot of amylase. The low-amylase producers are also eight times as likely to become obese.
I doubt that’s a coincidence. Excess glucose damages cells. It makes sense that cells would protect themselves against high-glucose assaults by developing resistance to the insulin that’s trying to shove glucose through the door. So perhaps for some people, it really is as simple as too many carbs => insulin resistance.
But having said that, I doubt many type II diabetics got that way by eating potatoes and fruit. I think it’s much more likely that the carb culprit was processed carbs. It isn’t just that they spike blood sugar (although they certainly do). These “acellular” carbohydrates also produce inflammation – and inflammation is a likely driver of insulin resistance.
Which brings us to a major non-carb culprit: the crap oils that have been displacing natural fats in our diets for decades. We didn’t just start eating more breads and cereals after the Food Pyramid came around. We also started replacing butter and lard with soybean oil, cottonseed oil and other industrial horrors that drive inflammation. If inflammation in turn drives insulin resistance, then the “heart healthy” diets people started adopting in the 1980s were a double whammy: too many processed carbs, combined with industrial oils. Pass the (ahem) “whole wheat” toast with margarine, please, because I’m being good to my heart.
The second revision of the “alternative hypothesis” I carry around in my head took it from this:
Damaging Diet => Chronically High Insulin (Insulin Resistance) => Fat Storage.
… to this:
Damaging Diet => Hormonal Disruption => Fat Storage.
Yes, insulin resistance is a form of hormonal disruption, and yes, I believe chronically high insulin drives fat accumulation. But other hormonal disruptions can make us fat too. I’ve mentioned seeing a documentary called The Science of Obesity that featured a woman who’d been lean her entire life, then started blowing up. She cut her calories to 1500 per day and still got fatter. Doctor after doctor accused her of lying about her diet.
But finally an endocrinologist ran some tests and found she had a small tumor on her brain. The tumor was preventing her brain from sensing the hormone leptin. Since leptin tells the brain how big our fat stores are, her brain concluded that she had no fat stores and needed to build them up. Fat stores are, after all, a crucial part of our fuel system. So each time she restricted her calories more, her body responded by slowing her metabolism more.
Few obese people have a brain tumor, but once again, a bad diet can lead to leptin resistance. Inflammation may cause leptin resistance directly, and chronically high insulin can block the leptin signal from reaching the brain. So we’re back to the same likely suspects: processed carbs and crap oils.
A baked potato with butter contains neither, which is one reason I now eat the occasional baked potato with butter. I may have surprised a few people on the low-carb cruise by eating the potato that came with my dinner on several nights. Then again, I saw others in our group doing likewise. Like I said, the low-carb movement is becoming more of a real-food movement, at least among the people I know.
But I don’t just eat the potato because I think I can get away with it. I eat the potato because I believe I’m better off with it than without it. Yup, you just heard me say that … er, write that.
Once again, the reason has to do with hormones. Going down to near-zero on the carbs and staying there can cause hormonal disruptions in some people. In the Natural Hormone Enhancement book I mentioned above, author Rob Faigin praised low-carb diets as a way to jump-start weight loss, but cautioned that going very-low-carb permanently can reduce testosterone and raise cortisol in men. He cited several studies to back up the point. Here’s one I just dug up.
He also cited evidence that going permanently low-carb can lead to a slower thyroid. I know Dr. Ron Rosedale insists the change in thyroid hormones is a healthy adaptation, but come on … if you’re trying to lose weight, do you really want a “healthy” slower thyroid?
Faigin’s solution is to mix it up: a VLC diet five days per week to promote weight loss, then high-carb (but not processed carbs) with reduced fat two days per week to prevent hormonal disruptions. The Carb Nite protocol is based on a similar idea. Paul Jaminet’s solution, of course, is to eat some “safe starches” daily while still keeping carbs on the lowish side overall. I can’t say if one solution is better than the other. It probably depends on the individual. Like I said, I mix things up and go with different diets on different days.
Having said all that, I would never encourage type II diabetics to run out and eat potatoes. During a Q & A session on a previous low-carb cruise, Denise Minger put it something like this: a low-carb diet is an effective treatment for type II diabetes, but that doesn’t mean metabolically healthy people have to give up fruit and potatoes to avoid diabetes. In other words, just because someone with a broken leg needs crutches, it doesn’t mean we must all use crutches to avoid breaking our legs. On the other hand, just because people can eat potatoes and fruit without becoming diabetic, it doesn’t mean diabetics should eat potatoes and fruit. In other words, just because walking without crutches won’t break your leg, it doesn’t mean people with broken legs don’t need crutches.
So to wrap up a very long post:
I don’t believe obesity is as simple as the more carbs we eat, the higher the insulin, and the higher the insulin, the more fat accumulation. Losing weight also isn’t as simple as the fewer carbs we eat, the lower the insulin, and the lower the insulin, the leaner we become. Cutting carbs can certainly promote weight loss (as it did for me), but when most of us go low-carb, we not only cut out the acellular processed carbs completely, we also embrace real fats and give up the crap oils. We eat bacon and fry our eggs in real butter. So I suspect the benefits are partly the result of reducing inflammation, which in turn reduces insulin resistance and perhaps leptin resistance.
To keep the benefits coming, it’s not necessary (or even advisable for many people) to stay at near-zero-carb levels permanently. For non-diabetics, I believe it’s better for overall hormonal health to mix it up, adding in some real-food starches, or cycling VLC days with higher-carb days.
To me, the golden nugget of the “alternative hypothesis” is that getting fat isn’t about calories; it’s about hormones. When our government told everyone to eat plenty of grains and cut the arterycloggingsaturatedfat!, following that advice created hormonal disruptions for many, many people. The cure is to 1) eat real, unprocessed food and 2) reduce the carbs to a level appropriate for your metabolism.
Well, it’s sure been an eventful year in Illinois politics, what with the veto-proof Democratic legislature and the Republican governor putting together a surprise last-minute deal for an honest-to-goodness balanced budget that will get the $100+ billion pension debt paid down over the next ten years, AND address the unfunded state retiree health benefit obligations ($56 B), while knocking down the $5+ billion backlog of bills to vendors dating back over a year now, and simultaneously restoring state services to the indigent, and even finally opening our state museum and public parks again.
HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA!
Man, if you could see the look on your face! Sometimes, I just crack myself up.
Actually the unfunded pension liability rose over $6 billion last year to over $111 billion (in a record up market), retiree health beneficiaries are one year closer to insolvency, and state vendors (including social service NFP’s) are still registering red on the “How Screwed Are We?” meter, but at least according to the budget — …
Oh wait, there is no budget.
I don’t mean a budget for this year. I mean the fiscal year 2015 budget, that started July 1, 2015 and is ending in less than two months. They haven’t finished passing a budget for that. It’s not looking so good for 2016 either.
Not to worry — welfare checks and state worker checks (including the legislators who haven’t passed a law to pay anything) are still going out. Just not the ones you’d get if you, say, sold the state some office supplies; or rent a building to them; or provide care to the mentally disabled. Little stuff like that.
You would be forgiven for thinking that our elected officials, who are demonstrably incapable of discharging even their most basic, simple tasks, are just absolutely useless. You couldn’t be more wrong — they’re much worse than useless.
They may not be able to do things like pass a budget and allocate funds for things like taking care of poor people, funding schools, building roads, and sundry other basics that even libertarians like me understand people now want government to do (not agree, of course, but understand); but that doesn’t mean they aren’t busy.
Sorry. I know I didn’t give you a “Politics!” trigger warning, but that’s not the real point of this post. Here’s the point:
As I confidently predicted here and reiterated here, the bureaucrats have completed their inevitable march to addressing one of the most dangerous health scourges facing our nation…
… yes, after three years, the $100,000 a year, state-employed lick-spittle turds who are being funded by the USDA to get raw milk out of the market apparently wore down the mom-and-pop operators who had to take time off (lose income) every time they (re-)proposed new regulations.
Remember kids — regulators never get you with brains, competence, or results. They always win by exhaustion.
As elaborated in my prior posts, they can’t just make raw milk illegal. When they want to take away something the Bigs (Ag, Pharma, Banking, or in this case Milk) don’t want to have to compete with, they just regulate you to death.
[Here’s the short version if you didn’t read those previous posts:
“after over a hundred people showed up to politely but loudly protest the state’s heavy-handed actions, I noted:
‘I’ve heard from a couple of folks who think the regulators got an education on raw milk… Maybe the bureaucrats would change things up substantially. Maybe even remove impediments to raw milk while setting a few common-sense protocols, as it fits in with the buy local/real foods programs the state and others talk up.’
Feeling I had a better understanding of bureaucratic sausage-making than those good, honest people, I ended with…
‘I’m guessing they’ll lay low for a few months or more, and then pass pretty much all of those rules as is, maybe without the 100 gallon limit. Or maybe they’ll bump the limit to 500 gallons. But they didn’t learn anything, and they’re there to pass those rules.’
It’s what they do.”]
The first posts were after a 2013 hearing. The follow-up was from 2014. Our betters had to lay in the weeds for over another year, but then they did exactly what I said they’d do. It’s like Gravity.
Right again. Dammit.
So starting in July, when I go to Linda’s farm — where I can always walk around and see the cows my milk comes from, and see the operation, and walk through the barn she milks in, there will be a few other things in place.
For my protection, of course.
Like, she’ll have to get a permit from the insolvent Illinois government. But first, she’ll have to complete an inspection by the incompetent Illinois government. She’ll have to take samples and pay for a lab to test the milk for a few weeks to get the permit, then do regular ongoing tests. Any day anyone buys milk, she’ll have to store a sample of the milk for two weeks. If the department doesn’t like the way her barn looks, they can shut her down until she makes it look nice to them and they re-inspect her. Getting an inspection rescheduled could be difficult, as the state doesn’t have a budget, so it can’t hire more inspectors; and even if it could, it has no money to pay them.
[They can also shut her down if one of her free-ranging egg chickens walks through the milk barn. Hey, it sounds harsh, but you have to be cautious about the whole “avian flu” thing that used to wipe out whole geographic areas of birds and spread disease until we started safely housing hundreds of thousands of chickens in legal, government approved and inspected warehouses; cutting their beaks off; and force feeding them antibiotics. Hmmm, I may have that backwards.]
Every time I buy a gallon of her delicious “creamy milk” (as The Grandkids call it), she’ll have to write my name, address, and phone number in a log that she has to keep for six months and make available to the egregiously misnamed Department of Public Health. She’ll have to have a placard up (in letters at least 2 inches high) that states:
“Warning: Milk that is not pasteurized is sold or distributed here. This dairy farm is not inspected routinely by the Illinois Department of Public Health.”
Wooooooo. Scary. It’s supposed to be, anyway.
Also, she’ll have to provide me with “Department-approved consumer awareness information.” It will say things like:
“WARNING: This product has not been pasteurized and, therefore, may contain pathogens that cause serious illness, especially in children, the elderly, women who are pregnant and persons with weakened immune systems.”
Plus, it’s now illegal for any raw milk producer to sell yogurt or cheese made with their raw milk, even if they pasteurize it as part of the process. Wouldn’t want any of these folks being able to earn a value-added premium for their products.
One of the last items in the new reg states that the Department can suspend or revoke the dairy farm permit whenever:
“the Department has reason to believe that a public hazard exists”
So since “the Department” is being funded by the USDA, and the USDA’s position is that there is absolutely no such thing as a safe glass of raw milk, somewhere down the line, you can bet “the Department” will determine that they have reason to believe that anyone producing and selling raw milk constitutes a public hazard.
I’ll say it again,
“It’s what they do.”
I feel so much safer.
Tom should be back next week, hopefully with highlights of the Low Carb Cruise. Thanks for stopping by.
The Older Brother
Hiya, Fat Heads!
Been awhile since I’ve got to sit in The Big Chair — trying to remember what all these buttons do.
As Tom mentioned, The Middle Son and his amazing girlfriend told The Wife and me a couple of months ago that they were going to get married. We were thrilled. Then they told us where they wanted to get married. Here’s a hint from this post from about a year ago:
“I’d been adamant for the last several years that I wasn’t coming back. Don’t get me wrong, I love it here. House facing the Gulf (we actually have two houses this time to accommodate all 15 people), The Wife and I doing most of the cooking, everyone else doing most of the cleaning, hanging out on the beach, watching the shrimp boats go out with the dolphins trolling behind them for the freebies that fall out of the nets.
It’s just that we’ve done it several times and I was done. I kept arguing that I didn’t want to have a one destination bucket list. This year, The Wife pointed out that this would be the first time The Grandkids would be able to come, too, and wouldn’t it be great to see them at the ocean for the first time.
n.b., folks — there’s no actual defense against that one.”
Yep. Back to Dauphin Island. Turns out there are other things besides “The Grandkids’ first time” that there’s no defense against. It’s becoming a family joke. One of the folks I work with suggested maybe I should look into buying a burial plot down there, since that seems to be where I always end up anyway.
It will be a great and joyous time, and it’s coming up fast — the end of this month. Tom and Chareva and their girls are coming, lots of the rest of the family, a few good friends — around forty people or so at last count.
And I’m never going back. This time I mean it (Ha!).
As Tom also mentioned, my responsibilities in preparing for the occasion essentially consist of showing up. This is an approach I mastered early on, and one I urge the young men in the Economics class where I am a guest speaker to adopt every semester. The key, as I serendipitously discovered with The Wife (who was at the time The Fiancee), is to take a job about 700 miles away shortly after you’ve bamboozled your betrothed into accepting your proposal. So then you essentially can’t be involved in any of the decision-making for the wedding – photographer, venue, dresses, tuxes, food, entertainment, etc., etc., etc.
But, as I explain to them, “guess what — YOU DON’T GET TO MAKE ANY OF THOSE DECISIONS, ANYWAY, because it’s not your day. It’s hers!”
You get the exact same amount of decision-making power, but you don’t get dragged all over to various vendors, shops, and venues, and then have to give your opinion before being told the correct answer. You just have to fly in a couple of days ahead of the wedding, get your tux fitted, do the bachelor party, then show up for the wedding.
It’s a beautiful system. Pass it on.
Anyway, it’s to the point where Spring looks like it may stick around now, and I took a trip out to Linda’s farm last week and thought I’d share some pics. I’ve been dropping in once in a while to get some eggs, but things just seemed to pop into full season this past week. Here’s the front pasture, really greening up now.
Linda and her sister Kim took the “pick up the old grocery store produce once in awhile and compost it” approach we were doing and really got serious about it. Here’s the current work area, which should be next year’s compost…
… and here’s part of this year’s compost from their efforts last season. There’s another three or four mounds this size off to the side. Black Gold!
Linda’s hedge trimmers/weed eaters have had their annual maintenance and are all primed up for the season.
Here’s Tartar, our cow who’s now given us our third calf after getting out of the “freezer” and into the “breeder” column by surprising us with her first calf a couple of winters ago.
Here’s this year’s calf. It’s a heifer and Linda named her “Tofu.” She got a name because I think we’re planning on keeping her as a breeder also. The Oldest Son has been wanting to get in on a share of a cow, and this will give us two breeders for four families (1/2 a cow each per year, hopefully) instead of three families splitting one cow a year.
Here’s last year’s bull, who will be heading to the freezer in late fall after getting to spend the Spring and Summer on pasture.
Linda’s second set of “bacon” is also coming along nicely.
After three months of maybe being able to get a couple of dozen eggs every other week or so, Linda’s egg layers are in full production mode. I’ve been getting 6 or more dozen a week, and she’s got other customers.
Our next batch of 100 day-old Freedom Ranger chicks arrived via Post Office the first week of April, so these guys have about another week in the coop/brooder until they get moved into the “tractors” on the pasture, where Linda moves them daily and they can get sunshine, organic feed, bugs, new grass and fresh water every day, and generally “express their chicken-ness” until mid-summer. Then The Oldest Son and I show up, bring the Whiz-Bang Chicken Plucker out of the barn, and start re-stocking the freezer.
Finally, we’re on the verge of being able to get real milk again. A couple of Linda’s milk cows calved recently, and will have “extra” pretty soon. This one should be having her calf any minute!
So, Spring is finally here and we’re looking forward to this year’s supply of beef, pork, chicken, eggs, and milk — knowing and respecting where every bite and drop came from.
The Older Brother
I’m almost embarrassed by the number of people in cyberspace who refer to Fat Head Pizza. Yeah, it’s a delicious low-carb pizza crust that tastes like real pizza crust, but I had nothing to do with it. I didn’t even write the post with the recipe. The Older Brother’s Oldest Son wrote it up after modifying a recipe he found at Cooky’s Creations, then the Older Brother posted it.
That post (which you can view here if you missed it) still draws views and comments — 359 of them at last count. It may be the most-viewed post on the entire blog … and like I said, I didn’t even write it.
Now Fat Head Pizza has been re-purposed as Fat Head Crackers. My embarrassment continues. If I keep getting credit for things I didn’t create, I’ll have to start hanging out with Al Gore.
Anyway, the crackers version of the pizza crust appeared here on a recipe (and other stuff) site called Ditch The Carbs. When I was alerted to the crackers, I went to the site and poked around. There are lots of excellent no-sugar, no-grain recipes there (not attributed to Fat Head), so I thought I’d mention it.
And now, since it’s 66 degrees outside (a mere week after we were sledding down our back hill), I need to step outside for a round of disc golf. Enjoy your weekend.
Duck Dodgers (who posts comments here now and then) wrote a long post on the Free the Animal blog titled How Wheat Went From Superfood To Liability.
Don’t worry; he’s not encouraging you to toddle down to the Olive Garden for a bowl of pasta and stop for some (ahem) “whole-wheat” bread on the way home. His point, as briefly as I can state it, is that ancient wheat was a nourishing food — which we turned into garbage through modern milling and refining.
I enjoy Duck’s Free the Animal guest posts because he fires arrows at the sacred cows of paleo and low-carb.
What?! You enjoy that?!
Yes, I do. We don’t learn in an echo chamber. We learn by being challenged, and by being willing to change our minds. At one time, I believed all the horsehocky about saturated fat clogging our arteries, red meat causing cancer, etc. I changed my mind because people challenged my beliefs. Thank goodness they did.
I encourage you to read the entire post. Go ahead, I’ll wait …
Okay, with that out of the way (and in case you didn’t read the post), I’ll pluck some quotes and add my own comments. As you’ll see, I think Duck makes some excellent points, but I’m still not persuaded ancient wheat was a superfood.
So, how did cultures regard wheat and whole grains before the industrial revolution? According to the historical literature, wheat was not some kind of sub-par caloric filler or cheap energy. Every culture had its superfood and wheat was, hands down, the superfood of Western civilization. Whole wheat is not just calories and nutrients. It contains all sorts of phenolics, carotenoids, sterols, β-glucan, resistant starch, inulin, oligosaccharides, lignans, and other phytonutrients. Much of the health benefits of wheat are believed to come from these phytonutrients.
Economist Thomas Sowell once said that when his students declared this or that to be good or bad, his next question was: compared to what?
Duck makes a convincing case that ancient wheat was far better than the refined garbage people eat today. But was a wheat-based diet healthy compared to a hunter-gatherer diet?
Anthropologist Jared Diamond famously called the switch to agriculture the worst mistake in the history of the human race, based largely on observations of human remains. Some quotes from his article in Discover:
In some lucky situations, the paleopathologist has almost as much material to study as a pathologist today. For example, archaeologists in the Chilean deserts found well preserved mummies whose medical conditions at time of death could be determined by autopsy. And feces of long-dead Indians who lived in dry caves in Nevada remain sufficiently well preserved to be examined for hookworm and other parasites.
Usually the only human remains available for study are skeletons, but they permit a surprising number of deductions. To begin with, a skeleton reveals its owner’s sex, weight, and approximate age. In the few cases where there are many skeletons, one can construct mortality tables like the ones life insurance companies use to calculate expected life span and risk of death at any given age. Paleopathologists can also calculate growth rates by measuring bones of people of different ages, examine teeth for enamel defects (signs of childhood malnutrition), and recognize scars left on bones by anemia, tuberculosis, leprosy, and other diseases.
One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5′ 9” for men, 5′ 5” for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5′ 3” for men, 5′ for women.
At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor.
A six-inch crash in height, with a rise in dental defects and infectious diseases (bearing in mind that the Dickson Mounds natives were growing maize, not wheat). Other anthropologists have made similar observations. When we took up farming, our health declined.
To be clear, Diamond doesn’t argue that grains induced those problems directly. He writes that perhaps when humans became farmers, the crops squeezed out a more varied and nutrient-dense hunter-gatherer diet, leading to malnutrition. But it’s clear that switching from a hunter-gatherer diet to a grain-based agricultural diet didn’t make us taller or healthier. Quite the opposite.
Back to Duck’s post:
Hippocrates, the father of Western medicine, not only recommended bread as a health-promoting staple, but he was keenly interested in experimenting with different preparations of wheat.
If wheat was so deleterious, you’d think that Hippocrates would have noticed it and warned against its consumption instead of recommending it for the prevention of disease.
Hippocrates was not alone. Avicenna recommended bread as a key staple of the diet. Paracelsus believed that wheat had mystical properties, and Aristotle thought foods made from wheat suit our bodies best. And, what we see over and over again in the historical literature is that wheat was once considered to be the most nutritious and most important edible plant in the entire vegetable kingdom. Bread was known as the Staff of life—it was the de facto superfood for agriculturalists.
Setting aside the appeal to authority, I’d ask the Sowell question again: compared to what? If Hippocrates was getting good results with his patients by having them substitute wheat for pork and green vegetables, then I’d say he was onto something. But we don’t seem to have that information. Maybe the wheat replaced swill.
Much of Duck’s post quotes doctors from previous centuries who recommended wheat as a health food. Okay, fair enough. That’s interesting at the very least. But given how often established medical opinion has turned out to be wrong over the centuries, I wouldn’t consider it solid evidence that ancient wheat was a superfood and didn’t cause health problems.
In both of his Wheat Belly books, Dr. William Davis blames the gliadin portion of gluten for causing many, if not most, of what he considers to be wheat’s deleterious effects. The ability of gliadin to increase gut permeability has been well established in recent years and is not, as far as I know, controversial. (If you Google “gliadin intestinal permeability,” you can read from now until you retire.)
Duck’s main point in his post is that milling and refining wheat turned it into health-sapping garbage. I agree wholeheartedly. But unless ancient wheat didn’t contain gliadin or we were somehow protected against the effects on gut permeability, I suspect wheat has always had the ability to induce auto-immune reactions. Perhaps those reactions weren’t linked to wheat because everyone ate the stuff.
I’m reminded of something I read in The Emperor of All Maladies, a hefty book about the history of cancer: when a doctor first floated the idea that smoking causes lung cancer, the vast majority of other doctors and researchers scoffed. They continued scoffing for years. As the author (an oncologist) explains, it’s been historically difficult for doctors to accept that something causes a disease if 1) nearly everyone is exposed to it, and 2) most of them never develop the disease.
At one time, nearly everyone smoked. Doctors smoked. The banker smoked. Your neighbor smoked. Your in-laws smoked. It was considered normal behavior. Heck, everyone does it, and few of them develop cancer, so it can’t be the smoking. Move along, let’s find the real cause.
When reading that passage, I thought, Hmm, just like with wheat. Everyone eats wheat, so it can’t be bad for us.
At a dinner some years ago, a friend I hadn’t seen in ages asked why I was skipping the bread and pasta. When I told him, he was incredulous. What?! How can wheat possibly be bad for us? Almost everyone eats wheat! People have been eating wheat since biblical times!
Well, yes. But from what I remember of the Bible, healing the sick was one of the real crowd-pleasing portions of the Jesus show.
True, we’ve been eating wheat for as long as we’ve been civilized. We’ve also had diabetes, cancer, heart disease, psoriasis, asthma, arthritis and schizophrenia for as long as we’ve been civilized. Wheat may have caused or contributed to all of them – even if, as with smoking and lung cancer, no single one of those diseases afflicted most people.
Back to Duck:
In his book, Wheat Belly: Lose the Wheat, Lose the Weight, and Find Your Path Back to Health, Dr. William Davis claimed that modern hybrids of wheat are to blame for all modern health issues. However, this is not supported by the scientific literature—nor is it supported by France’s lower levels of chronic diseases despite considerably higher wheat intakes.
Ahh, those wacky French. Truth is, I’m not sure what to make of them. They’re twice as likely to smoke as Americans, but have lower rates of heart disease … yet I wouldn’t cite them as proof that smoking doesn’t cause heart disease. I suspect that the American diet of HFCS, refined flour and industrial seed oils creates a perfect storm for inducing disease, which the French avoid by shunning the HFCS and seed oils and embracing natural animal fats. They might still be better off without the wheat.
Or perhaps someday we’ll learn that the French are healthier than us because spending an hour with your mistress before heading home for dinner with the wife and kids prevents nearly all chronic diseases. Chareva disagrees with that hypothesis and offered evidence that anyone who tests it will end up sleeping in a chicken coop.
Duck’s hypothesis is more interesting, despite not involving mistresses:
By 1953, Newfoundland had enacted mandatory fortification of white flour. By 1954, Canada and a number of US states had enacted the Newfoundland Law. Southern states in particular were eager to enact the law, to reduce pellagra, which had become prevalent during the Great Depression. These states typically mandated fortification of flour, bread, pasta, rice and corn grits.
In 1983, the FDA significantly increased the mandated fortification levels—coinciding with the beginning of the obesity epidemic. 1994 was the first year that obesity and diabetes statistics were available for all 50 states. Notice a pattern?
Fortifying flour may have ended the deficiencies of the Great Depression, but it appears to have significantly worsened chronic diseases.
Furthermore, wheat flour fortification may explain the popularity of non-celiac gluten sensitivity we see today in fortified countries (it was extremely rare prior to fortification). As it turns out, iron fortificants have been shown to promote significant gastric distress, even at low doses and pathogenic gut profiles in developing countries. Non-celiac gluten sensitivity is virtually unheard of in unfortified countries, like France, which consume 40% more wheat than Americans.
That’s the most eye-opening section of the post as far as I’m concerned. Before reading the brief history that Duck cites here, it never occurred to me that fortifying grain could make it worse. If gliadin didn’t cause gut permeability back in the day (still a big IF in my book), that could be the explanation.
As far as modern wheat goes, I’ve said this before, and I’ll say it again: Norman Borlaug, who was awarded the Nobel Prize for his part in developing semi-dwarf wheat, was a good man. He set out to prevent mass starvation, and he succeeded. Given a choice between semi-dwarf wheat or watching my kids die of starvation, I’ll take the wheat every damned time.
That being said, I still believe semi-dwarf wheat is something those of us who aren’t starving should avoid. Duck makes a good case that milling, refining and fortifying wheat turned it into a health hazard. But the changes in semi-dwarf wheat likely threw gasoline on that fire. Here’s a quote from Wheat Belly Total Health:
One important change that has emerged over the past 50 years, for example, is increased expression of a gene called Glia-α9, which yields a gliadin protein that is the most potent trigger for celiac disease. While the Glia-α9 gene was absent from most strains of wheat from the early 20th century, it is now present in nearly all modern varieties.
Now let’s mill it, refine it, and fortify it. Awesome.
Dr. Davis believes the change in the gliadin gene is the reason celiac disease has increased by 400% in the past 50 years — and that’s a genuine increase, by the way, not a case of better diagnosis. Researchers realized as much when they compared blood samples from 50 years ago to recent blood samples. The modern samples were four times as likely to contain antibodies triggered by celiac disease.
Duck, on the other hand, believes fortification is the likely culprit. It’s an interesting possibility.
Back to Duck:
Nor does Dr. David Perlmutter’s book, Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar–Your Brain’s Silent Killers, explain how humanity enjoyed its highest levels of intellectual achievement while largely eating wheat and other grains as staple foods—enjoying unprecedented population growth and longevity as well.
I can explain that one. In a previous post, I mentioned Conquests and Cultures, by Thomas Sowell. One of the book’s main points is that economic specialization is required for cultures to advance. If pretty much everyone has to hunt and gather food, there will be no pianos, printing presses, telescopes or steam engines. There’s no doubt that agriculture led to economic specialization, and thus civilization and intellectual achievement.
But that doesn’t prove eating grains had a positive or even a neutral effect on our brains. It simply means that in a civilization where farming allows most people to do something else, Mozart becomes a composer and Voltaire becomes a writer. In a paleo society, Mozart is the hunter who sings those amazing songs around the campfire, and Voltaire is the hunter whose clever stories amuse his pals during the long walks home from a hunt. They may have had genius IQs, but we’ll never know. We do know that human brains have, in fact, been shrinking since their peak size roughly 20,000 years ago.
Another point Sowell makes in Conquests and Cultures is that civilizations advance through cross-pollination of ideas, technologies and resources. Throughout history, cross-pollination was often the result of large-scale conquest. (Sowell doesn’t ignore or excuse the brutality of conquest, by the way.) Conquering an inhabited territory requires a large army (another example of economic specialization), which requires a large population, which requires agriculture.
In Europe and the Middle East, the “crop of conquest” was wheat. In the Western Hemisphere, it was maize that enabled the Aztecs and Mayans to build cities and raise armies large enough to establish empires. But again, that doesn’t prove the conquerors were healthier or smarter than the tribes they subjugated. It only proves that farming enabled them to raise and feed large armies.
Okay, time to wrap up. This is already a long post about a long post. To summarize:
Duck believes ancient wheat was a nutritious food, not a health hazard. Maybe, but I remain skeptical. Maybe ancient wheat was good, maybe it was neutral, maybe it was bad but not nearly as bad as the stuff sold today. I still think it’s likely wheat has been provoking auto-immune reactions in susceptible people since the dawn of civilization.
But whether wheat was good or bad back in ancient times, the refined and fortified garbage sold today is a health hazard. On that we totally agree. So unless you want to go out and find some ancient wheat (which Duck explains how to do in his post) and give it a try, my advice remains the same:
Don’t Eat Wheat.
A couple of podcasters who interviewed me recently asked if I believe we’re at a tipping point. I do. I’m seeing a major shift in what the public at large considers a healthy diet, thanks largely to the Wisdom of Crowds effect. It seems that more and more people are rejecting the decades-old anti-fat message and embracing real food – fat and all.
I’ve sometimes wondered if I’m just experiencing the Red Toyota Effect, which works like this: While shopping for a car, you make up your mind that you want a red Toyota … and soon after, you start noticing them all over the place, which leads you to think, “Holy moly! Everyone’s buying red Toyotas all of a sudden!” In fact, the red Toyotas were always there. You’re just noticing them now because owning a red Toyota is on your mind.
Sure, I’ve got diet on my mind. I write about diet, I think often about diet, I hang out in social media sites where the subject is diet. But I don’t believe I’m experiencing the Red Toyota Effect. I think there’s a real shift happening out there.
For starters, I keep seeing more mainstream media articles declaring that – surprise! — saturated fat doesn’t cause heart disease after all. Here are some quotes from an article in the U.K. Telegraph with the headline No link found between saturated fat and heart disease:
For the health conscious reader who has been stoically swapping butter for margarine for years the next sentence could leave a bad taste in the mouth.
Scientists have discovered that saturated fat does not cause heart disease while so-called ‘healthy’ polyunsaturated fats do not prevent cardiovascular problems.
In contrast with decades-old nutritional advice, researchers at Cambridge University have found that giving up fatty meat, cream or butter is unlikely to improve health.
They are calling for guidelines to be changed to reflect a growing body of evidence suggesting there is no overall association between saturated fat consumption and heart disease.
Earlier this month Dr James DiNicolantonio of Ithaca College, New York, called for a new public health campaign to admit ‘we got it wrong.’ He claims carbohydrates and sugar are more responsible.
Admit we got it wrong … yeah, that would be awesome. Despite my optimism about a big shift within the public at large, I don’t expect a We Got It All Wrong announcement from the USDA anytime soon. They are, however, slooooowly backing away from some of the advice they’ve been handing down for the past 35 years. Here are some quotes from a Forbes article titled Fat Is Back: Time To Stop Limiting Dietary Fats, Experts Say:
The latest version of the Dietary Guidelines for Americans – the government-sanctioned recommendations about what we should and shouldn’t eat – will include a game-changing edit: There’s no longer going to be a recommended upper limit on total fat intake. This hasn’t gotten as much press as the other big change – that cholesterol will no longer be considered a “nutrient of concern,” meaning that we can now eat eggs without feeling guilty.
But as the authors of a new paper in the Journal of the American Medical Association point out, the true game-changer in the new recommendations is that we won’t have to worry so much about the total fat content of our food. And this makes a lot of sense, since in many ways, fats are much better for us than what they’ve typically been replaced with in low-fat diets – refined carbs and added sugars.
For people who lived through the low-fat/no-fat craze that started in the 80s, this is big news. The change in fats recommendations has been coming for some time now, as studies have consistently shown that low-fat diets are in no way the beacon they once seemed to be, and can in fact be quite unhealthy over the long-term.
The USDA (ahem) “experts” are willing to admit that cholesterol is no longer a “nutrient of concern,” but can’t quite bring themselves to say saturated fat is okay. However – and this is huge, since so many people get their dietary advice from registered dieticians – the Academy of Nutrition and Dietetics has already jumped ahead of the USDA. The organization’s official commentary on the latest USDA guidelines first praises the USDA for its efforts, then disputes much of what the USDA has to say.
Dr. Stan De Loach (who has been recommending a high-fat, real-food diet to patients in Mexico for years) summarized the points made by the Academy of Nutrition and Dietetics:
1. Cholesterol contained in food items is NO LONGER a nutrient of interest or concern. That is, limiting cholesterol (egg yolks, for example) in the food plan makes no sense because there is no trustworthy scientific evidence that it may produce negative or harmful effects on the human body or cardiovascular system.
2. NO scientific consensus or concrete scientific evidence exists that could justify the recommendation that the quantity of dietary salt (sodium) be limited. This long-standing recommendation to not consume salt freely has been overturned. Moreover, the Report mentions that probably and certainly “there are persons who are NOT consuming a SUFFICIENT amount of sodium.”
3. “Not a single study included in this revision of the dietary recommendations meant to prevent cardiovascular disease was able to identify saturated fat as an element in the diet that has an unfavorable or adverse association to cardiovascular disease.” The experts recommend de-emphasizing saturated fat as a nutrient of interest or concern.
4. The lipid/lipoproteins LDL and HDL are NOT appropriate nor adequate for use as markers of the impact of diet on the risks of cardiovascular disease, for example, in the scientific studies that attempt to measure diet’s impact on the risks for cardiovascular disease.
5. “The consumption of carbohydrates carries a GREATER risk for cardiovascular disease than that of saturated fats.”
6. “It is likely that the impact of carbohydrate consumption on the risks for cardiovascular diseases is positive (that is, their consumption INCREASES the risks).”
7. “Therefore, it seems to us that the scientific evidence summarized and synthesized by the Committee suggests that the most effective simplified recommendation to reduce the incidence of cardiac disease would be a simple reduction in the consumption of carbohydrates, replacing them with polyunsaturated fats.” Polyunsaturated fats tend to reduce the levels of cholesterol in the blood. Avocados, fish (tuna, trout, herring, salmon), some varieties of nuts (peanuts, walnuts, sunflower seeds, sesame), some mayonnaises, some salad dressings, olive oil, etc., contain polyunsaturated fats.
8. “The strongest scientific evidence indicates that a reduction in the consumption of added sugars (carbohydrates) will improve the health of the American public.”
Okay, ya can’t win ‘em all, at least not right away. The dieticians want carbs replaced with polyunsaturated fats. But this is still huge. Look at the basic message: Stop worrying about cholesterol, saturated fat and salt. Start focusing on reducing sugars and refined carbohydrates. If this keeps up, people will soon believe you can eat food that tastes good and still be healthy. Dr. Ornish must be terrified.
It isn’t just that people are no longer accusing saturated fat of a crime it didn’t commit, either. There’s also been a huge rise in the demand for quality food, food that hasn’t been processed into nutritional oblivion. Food manufacturers are wondering what the bleep happened and trying to adjust, as this article in Fortune magazine online explains:
Try this simple test. Say the following out loud: Artificial colors and flavors. Pesticides. Preservatives. High-fructose corn syrup. Growth hormones. Antibiotics. Gluten. Genetically modified organisms.
If any one of these terms raised a hair on the back of your neck, left a sour taste in your mouth, or made your lips purse with disdain, you are part of Big Food’s multibillion-dollar problem. In fact, you may even belong to a growing consumer class that has some of the world’s biggest and best-known companies scrambling to change their businesses.
“Their existence is being challenged,” says Edward Jones analyst Jack Russo of the major packaged-food companies. In some ways it’s a strange turn of events. The idea of “processing”—from ancient techniques of salting and curing to the modern arsenal of artificial preservatives—arose to make sure the food we ate didn’t make us sick. Today many fear that it’s the processed food itself that’s making us unhealthy.
It’s pretty simple what people want now: simplicity. Which translates, most of the time, to less: less of the ingredients they can’t actually picture in their head.
Steve Hughes, a former ConAgra executive who co-founded and now runs natural food company Boulder Brands, believes so much change is afoot that we won’t recognize the typical grocery store in five years. “I’ve been doing this for 37 years,” he says, “and this is the most dynamic, disruptive, and transformational time that I’ve seen in my career.”
So it’s definitely not the Red Toyota Effect. This change is real, and it’s coming to a Kroger near you. In fact, I recently found – for the first time ever – dry-roasted almonds in a Kroger where the only ingredients were almonds and salt. A sign above that section of the store bragged about the lack of additives in the several varieties of nuts, which you can buy in bulk.
As the Fortune magazine article explains:
Shoppers are still shopping, but they’re often turning to brands they believe can give them less of the ingredients they don’t want—and for the first time, they can find them in their local Safeway, Wegmans, or Wal-Mart. Kroger’s Simple Truth line of natural food grew to an astonishing $1.2 billion in annual sales in just two years.
The search for authenticity has led organic food sales to more than triple over the past decade and increase 11% last year alone to $35.9 billion, according to the Organic Trade Association. Data provider Spins found that sales of natural products across nearly every category are growing in mainstream retailers, while more than half of their conventional counterparts are in decline.
Perhaps more frightening for Big Food, shoppers are doing something else as well: They’re skipping the middle aisles altogether.
The war on fat is ending, with fat emerging as the victor. Cholesterol is no longer a “nutrient of concern.” The low-salt nonsense is being abandoned by doctors, nutritionists and even the CDC. Consumers are avoiding foods with ingredients they can’t pronounce, and Big Food is both scared and scrambling to adjust.
Yes, we’re at a tipping point. Let’s hope the nation tips right over into better health.