Archive for the “Real Food” Category

I’ve mentioned this story a couple of times before, but given the topic of this post, it bears repeating:

The Older Brother and I had a conversation some years back as our dad was fading from Alzheimer’s.  The Older Brother noted that while our great-grandfather was sharp until nearly age 100, our grandmother developed Alzheimer’s in her 80s, and our dad had (in retrospect) started succumbing in his late 60s.  Seeing the progression, The Older Brother said, “Well, we’re screwed.”  (That’s the family-friendly version of his analysis.)

I replied that Alzheimer’s is probably a form of diabetes, not a genetic destiny.  We can avoid or delay it for decades by eating a good diet.

Turns out a good diet might even reverse the condition to an impressive degree.  Here's part of the abstract of a 2014 paper, published in the journal Aging, describing a pilot program:

This report describes a novel, comprehensive, and personalized therapeutic program that is based on the underlying pathogenesis of Alzheimer’s disease, and which involves multiple modalities designed to achieve metabolic enhancement for neurodegeneration (MEND). The first 10 patients who have utilized this program include patients with memory loss associated with Alzheimer’s disease (AD), amnestic mild cognitive impairment (aMCI), or subjective cognitive impairment (SCI). Nine of the 10 displayed subjective or objective improvement in cognition beginning within 3-6 months, with the one failure being a patient with very late stage AD.

Six of the patients had had to discontinue working or were struggling with their jobs at the time of presentation, and all were able to return to work or continue working with improved performance. Improvements have been sustained, and at this time the longest patient follow-up is two and one-half years from initial treatment, with sustained and marked improvement.

Sustained and marked improvement?  Six of 10 patients able to return to work?  Why wasn’t this all over the news?!  Perhaps because there’s no miracle drug involved.  The therapeutic program employed here was mostly about diet and other lifestyle changes.

The paper opens with a long discussion of the biology of Alzheimer’s and the history (not an impressive one) of drug therapies.  Let’s skip those and get into the therapies employed with these patients.  Here are two examples:

Patient One

A 67-year-old woman presented with two years of progressive memory loss. She held a demanding job that involved preparing analytical reports and traveling widely, but found herself no longer able to analyze data or prepare the reports, and therefore was forced to consider quitting her job. She noted that when she would read, by the time she reached the bottom of a page she would have to start at the top once again, since she was unable to remember the material she had just read.

She was no longer able to remember numbers, and had to write down even 4-digit numbers to remember them. She also began to have trouble navigating on the road: even on familiar roads, she would become lost trying to figure out where to enter or exit the road. She also noticed that she would mix up the names of her pets, and forget where the light switches were in her home of years.

Sounds a lot like my dad around the same age.  Long before we realized he was suffering from Alzheimer’s, my mom complained to me that my dad just wanted to vegetate in front of the TV at night and didn’t read anymore – which seemed odd, given that he used to devour books and could quote facts from them years after reading them.  His driving also became so erratic, we had to talk him into giving it up before he killed someone.  Later, of course, we realized he’d stopped reading because he couldn’t remember what he’d just read.

Here’s the therapy for Patient One:

As noted above, and following an extended discussion of the components of the therapeutic program, the patient began on some but not all of the system: (1) she eliminated all simple carbohydrates, leading to a weight loss of 20 pounds; (2) she eliminated gluten and processed food from her diet, and increased vegetables, fruits, and non-farmed fish; (3) in order to reduce stress, she began yoga, and ultimately became a yoga instructor; (4) as a second measure to reduce the stress of her job, she began to meditate for 20 minutes twice per day; (5) she took melatonin 0.5mg po qhs; (6) she increased her sleep from 4-5 hours per night to 7-8 hours per night; (7) she took methylcobalamin 1mg each day; (8) she took vitamin D3 2000IU each day; (9) she took fish oil 2000mg each day; (10) she took CoQ10 200mg each day; (11) she optimized her oral hygiene using an electric flosser and electric toothbrush; (12) following discussion with her primary care provider, she reinstated HRT (hormone replacement therapy) that had been discontinued following the WHI report in 2002; (13) she fasted for a minimum of 12 hours between dinner and breakfast, and for a minimum of three hours between dinner and bedtime; (14) she exercised for a minimum of 30 minutes, 4-6 days per week.

No simple carbs.  Ditch the gluten.  Exercise, more sleep, and some over-the-counter supplements.  Now here are the results:

She began System 1.0, and was able to adhere to some but not all of the protocol components. Nonetheless, after three months she noted that all of her symptoms had abated: she was able to navigate without problems, remember telephone numbers without difficulty, prepare reports and do all of her work without difficulty, read and retain information, and, overall, she became asymptomatic. She noted that her memory was now better than it had been in many years. On one occasion, she developed an acute viral illness, discontinued the program, and noticed a decline, which reversed when she reinstated the program. Two and one-half years later, now age 70, she remains asymptomatic and continues to work full-time.

Big Pharma, eat your hearts out.  No drug has come close to those results.

Let’s look at one more case history.  Here’s what the paper says about Patient Two:

A 69-year-old entrepreneur and professional man presented with 11 years of slowly progressive memory loss, which had accelerated over the past one or two years. In 2002, at the age of 58, he had been unable to recall the combination of the lock on his locker, and he felt that this was out of the ordinary for him…. He noted that he had progressive difficulty recognizing the faces at work (prosopagnosia), and had to have his assistants prompt him with the daily schedule. He also recalled an event during which he was several chapters into a book before he finally realized that it was a book he had read previously. In addition, he lost an ability he had had for most of his life: the ability to add columns of numbers rapidly in his head.

Here’s his therapy:

The patient began on the following parts of the overall therapeutic system: (1) he fasted for a minimum of three hours between dinner and bedtime, and for a minimum of 12 hours between dinner and breakfast; (2) he eliminated simple carbohydrates and processed foods from his diet; (3) he increased consumption of vegetables and fruits, and limited consumption of fish to non-farmed, and meat to occasional grass-fed beef or organic chicken; (4) he took probiotics; (5) he took coconut oil 1 tsp bid; (6) he exercised strenuously, swimming 3-4 times per week, cycling twice per week, and running once per week; (7) he took melatonin 0.5mg po qhs, and tried to sleep as close to 8 hours per night as his schedule would allow; (8) he took herbs Bacopa monniera 250mg, Ashwagandha 500mg, and turmeric 400mg each day; (9) he took methylcobalamin 1mg, methyltetrahydrofolate 0.8mg, and pyridoxine-5-phosphate 50mg each day; (10) he took citicoline 500mg po bid; (11) he took vitamin C 1g per day, vitamin D3 5000IU per day, vitamin E 400IU per day, CoQ10 200mg per day, Zn picolinate 50mg per day, and α-lipoic acid 100mg per day; (12) he took DHA (docosahexaenoic acid) 320mg and EPA (eicosapentaenoic acid) 180mg per day.

And his results:

He began on the therapeutic program, and after six months, his wife, co-workers, and he all noted improvement. He lost 10 pounds. He was able to recognize faces at work unlike before, was able to remember his daily schedule, and was able to function at work without difficulty. He was also noted to be quicker with his responses. His life-long ability to add columns of numbers rapidly in his head, which he had lost during his progressive cognitive decline, returned. His wife pointed out that, although he had clearly shown improvement, the more striking effect was that he had been accelerating in his decline over the prior year or two, and this had been completely halted.

Ditch the processed foods, eat real foods.  Exercise and get enough sleep.  Take some supplements to replace the nutrients that were plentiful in hunter-gatherer diets, but are missing in modern diets.  Next thing you know, the guy can add columns of numbers in his head again.

I think we’re seeing why Alzheimer’s was rare in hunter-gatherer societies.  It isn’t some harsh sentence handed down by fate or genes.  It’s a condition caused by (in many cases, anyway) the same garbage diet that makes people fat and diabetic.

So no, I don’t believe The Older Brother and I will succumb to the disease that caused our dad to fade away in front of our eyes.  I expect to be blogging and making wisecracks at age 97 … with The Older Brother sitting in when I need a vacation.


Pardon the delay in posting and responding to comments. I was on Dauphin Island off the coast of Alabama for a wedding last week. I couldn’t ask The Older Brother to sit in, since it was his Middle Son getting married.

Anyway … in my last post, I commented briefly on a video presentation of a study that, in some people's minds, nailed the coffin-lid shut on the Carb-Insulin hypothesis. I replied that I don't believe the hypothesis is dead, but needs some revising. Based on personal experience, lots of reading and listening to podcasts, conversations with other people and so forth, I've been slowly revising it in my own head for years. So let me reach up there between my ears and pluck out some thoughts, then see if I can work them into a coherent post.

More Carbohydrates => Higher Insulin => Fat Storage

That’s the Carbohydrate-Insulin hypothesis in a nutshell. The more carbohydrates you eat, the more insulin you produce, and the more insulin you produce, the fatter you become. Or to express it in reverse for those trying to lose weight: the fewer carbohydrates you eat, the less insulin you produce, and the less insulin you produce, the leaner you become.

Simple is certainly appealing. But I happen to know the linear equation of more carbs = more body fat isn’t true in my case.

But wait … didn’t you finally lose weight after going low-carb?!

Why, yes, I did. And it was easy. Unlike when I tried low-fat/low-calorie diets based on cereals, pasta, bread and rice, I dropped the pounds fairly quickly and wasn’t hungry. Like a lot of people, I figured if low is good, lower is better.  So I stayed on a very-low-carb diet for a long time.

But after reading The Perfect Health Diet, I put real-food starches like potatoes and squashes back into my diet. After listening to Jimmy Moore’s podcast with the guys who designed the Carb Nite protocol, I started enjoying a high-carb Saturday night (but with a Mexican dinner, not donuts). After reading about the gut microbiome, I started eating tiger nuts for the fiber and resistant starch. After reading a book called Natural Hormone Enhancement, I decided to mix things up even more. Some days my diet resembles The Perfect Health Diet. Some days it resembles an Atkins induction diet, all meats and eggs and green vegetables. Some days I skip breakfast. Some days I fast until dinner. Saturday is still the high-carb night.

I average more carbs per day now than I did a few years ago, but haven’t gotten any fatter. So more carbs = more body fat clearly isn’t true for me, at least not as a linear relationship.

Does that mean insulin doesn't drive fat accumulation? Nope, not at all. I don't think we've seen the final word on the research, but let's just say I'll be stunned if it turns out insulin has little to do with gaining weight.

Insulin inhibits lipolysis — the breakdown and release of fat from fat cells. Any book on metabolism will tell you so. That’s one of insulin’s many jobs, and it’s a crucial one. When you eat a meal that raises your blood sugar, insulin brings the blood sugar down partly by storing fat and keeping it stored. That way your cells burn the glucose first.

Take a look at this graph from a study by Dr. Jeff Volek. It shows the relationship between the concentration of insulin in our blood and the ability to release fat.

Here’s a quote from text accompanying the graph in the Volek paper:

Adipose tissue lipolysis is exquisitely sensitive to changes in insulin within the physiological range of concentrations. Small to moderate decreases in insulin can increase lipolysis several-fold, the response being virtually immediate. Insulin also stimulates lipogenesis [creating new body fat] by increasing glucose uptake and activating lipogenic and glycolytic enzymes. Small reductions in insulin levels, such as that easily achieved with dietary carbohydrate restriction, remove the normal inhibition on fat breakdown.

I’ve seen several studies in which giving diabetics higher concentrations of insulin made them fatter. In a study of the effects of obesity on rats, the researchers stated matter-of-factly that they made the rats obese by pumping them full of insulin. When they stopped pumping the rats full of insulin, the rats returned to their normal weights. So yes, high insulin levels encourage fat accumulation and inhibit fat breakdown. And yes, your body releases insulin when you eat carbs.

But it’s not the temporary spike in insulin after a meal that makes you fat. That’s when insulin is doing exactly what it’s supposed to do: partitioning nutrients, shuttling glucose into cells, storing fat so glucose is burned first when glucose is high, etc. Later, insulin is supposed to drop and allow fat to flow from the fat cells.  Lower insulin also allows glycogen to be converted to glucose.  It’s all about keeping glucose within a safe range.

In a lecture I watched online, a biochemist described insulin rising as the building/storing phase and insulin dropping as the burning/using phase. Both phases are necessary for good health. The problem is that for metabolically damaged people, insulin stays high when it ought to be low. They spend too much time in the building/storing phase, and not enough time in the burning/using phase.

During his presentation on hyperinsulinemia on the cruise, Dr. Ted Naiman showed a chart of the insulin responses of normal vs. obese/insulin-resistant people to the same meal. The obese people not only had a much higher initial insulin spike, their insulin levels stayed higher for several hours. Take another look at Dr. Volek’s graph. It doesn’t take much extra insulin to inhibit lipolysis rather dramatically.

But those are metabolically damaged people. (We’ll get to what I believe causes the damage shortly.) For metabolically healthy people, a high-carb meal will certainly raise insulin temporarily — as it should — but that doesn’t necessarily mean insulin will stay high. When I first started hearing from paleo types that tubers have been part of the human diet for eons and are perfectly fine foods, they usually pointed to the Kitavans – native people who live on a high-carb diet (mostly sweet potatoes), but aren’t fat or diabetic.

So I looked up some articles and a study of the Kitavans. Yup, they eat a lot of sweet potatoes and they’re not fat or diabetic. But here’s the interesting part: their average insulin level is 24 pmol/L. If you check Volek’s chart, you’ll see that’s down in the range where fat breakdown occurs. (By contrast, one study puts the average insulin level for American adults at around 60 pmol/L.)  So for the Kitavans at least, a high-carb diet of whole unprocessed foods doesn’t lead to high insulin levels throughout the day. In other words, they don’t become insulin resistant. I’m sure we could find plenty of other paleo people who ate natural starches without becoming fat and diabetic. Quite a few Native Americans, for example, grew squashes and beans.

No doubt the potatoes and other starches I eat now temporarily spike my insulin. So why haven’t I gotten any fatter? Well, I don’t have any way of checking my fasting insulin level at home, but I’d wager a large sum it’s no higher now than it was a few years ago, when I rarely ate starch. I’d also wager a large sum that when I was living on low-fat cereals, low-fat pasta, whole-wheat bread with margarine and other vegetarian delights, my fasting insulin was much higher.

So the first revision of the “alternative hypothesis” I carried around in my head looked something like this:

Damaging Diet => Chronically High Insulin (Insulin Resistance) => Fat Storage.

What is or isn’t a damaging diet certainly varies among individuals. Back in this post, I recounted a section from Denise Minger’s excellent book Death By Food Pyramid in which she wrote about the huge variations in how much amylase we produce. People who produce little amylase experience much more dramatic blood-sugar surges when they consume starch than people who produce a lot of amylase. The low-amylase producers are also eight times as likely to become obese.

I doubt that’s a coincidence. Excess glucose damages cells. It makes sense that cells would protect themselves against high-glucose assaults by developing resistance to the insulin that’s trying to shove glucose through the door. So perhaps for some people, it really is as simple as too many carbs => insulin resistance.

But having said that, I doubt many type II diabetics got that way by eating potatoes and fruit. I think it’s much more likely that the carb culprit was processed carbs. It isn’t just that they spike blood sugar (although they certainly do). These “acellular” carbohydrates also produce inflammation – and inflammation is a likely driver of insulin resistance.

Which brings us to a major non-carb culprit: the crap oils that have been displacing natural fats in our diets for decades. We didn’t just start eating more breads and cereals after the Food Pyramid came around. We also started replacing butter and lard with soybean oil, cottonseed oil and other industrial horrors that drive inflammation. If inflammation in turn drives insulin resistance, then the “heart healthy” diets people started adopting in the 1980s were a double whammy: too many processed carbs, combined with industrial oils. Pass the (ahem) “whole wheat” toast with margarine, please, because I’m being good to my heart.

The second revision of the “alternative hypothesis” I carry around in my head took it from this:

Damaging Diet => Chronically High Insulin (Insulin Resistance) => Fat Storage.

To this:

Damaging Diet => Hormonal Disruption => Fat Storage.

Yes, insulin resistance is a form of hormonal disruption, and yes, I believe chronically high insulin drives fat accumulation. But other hormonal disruptions can make us fat too. I’ve mentioned seeing a documentary called The Science of Obesity that featured a woman who’d been lean her entire life, then started blowing up. She cut her calories to 1500 per day and still got fatter. Doctor after doctor accused her of lying about her diet.

But finally an endocrinologist ran some tests and found she had a small tumor on her brain. The tumor was preventing her brain from sensing the hormone leptin. Since leptin tells the brain how big our fat stores are, her brain concluded that she had no fat stores and needed to build them up. Fat stores are, after all, a crucial part of our fuel system. So each time she restricted her calories more, her body responded by slowing her metabolism more.

Few obese people have a brain tumor, but once again, a bad diet can lead to leptin resistance. Inflammation may cause leptin resistance directly, and chronically high insulin can block the leptin signal from reaching the brain. So we’re back to the same likely suspects: processed carbs and crap oils.

A baked potato with butter contains neither, which is one reason I now eat the occasional baked potato with butter. I may have surprised a few people on the low-carb cruise by eating the potato that came with my dinner on several nights. Then again, I saw others in our group doing likewise. Like I said, the low-carb movement is becoming more of a real-food movement, at least among the people I know.

But I don’t just eat the potato because I think I can get away with it. I eat the potato because I believe I’m better off with it than without it. Yup, you just heard me say that … er, write that.

Once again, the reason has to do with hormones. Going down to near-zero on the carbs and staying there can cause hormonal disruptions in some people. In the Natural Hormone Enhancement book I mentioned above, author Rob Faigin praised low-carb diets as a way to jump-start weight loss, but cautioned that going very-low-carb permanently can reduce testosterone and raise cortisol in men. He cited several studies to back up the point. Here’s one I just dug up.

He also cited evidence that going permanently low-carb can lead to a slower thyroid. I know Dr. Ron Rosedale insists the change in thyroid hormones is a healthy adaptation, but come on … if you’re trying to lose weight, do you really want a “healthy” slower thyroid?

Faigin’s solution is to mix it up: a VLC diet five days per week to promote weight loss, then high-carb (but not processed carbs) with reduced fat two days per week to prevent hormonal disruptions. The Carb Nite protocol is based on a similar idea. Paul Jaminet’s solution, of course, is to eat some “safe starches” daily while still keeping carbs on the lowish side overall. I can’t say if one solution is better than the other. It probably depends on the individual. Like I said, I mix things up and go with different diets on different days.

Having said all that, I would never encourage type II diabetics to run out and eat potatoes. During a Q & A session on a previous low-carb cruise, Denise Minger put it something like this: a low-carb diet is an effective treatment for type II diabetes, but that doesn’t mean metabolically healthy people have to give up fruit and potatoes to avoid diabetes. In other words, just because someone with a broken leg needs crutches, it doesn’t mean we must all use crutches to avoid breaking our legs. On the other hand, just because people can eat potatoes and fruit without becoming diabetic, it doesn’t mean diabetics should eat potatoes and fruit. In other words, just because walking without crutches won’t break your leg, it doesn’t mean people with broken legs don’t need crutches.

So to wrap up a very long post:

I don’t believe obesity is as simple as the more carbs we eat, the higher the insulin, and the higher the insulin, the more fat accumulation. Losing weight also isn’t as simple as the fewer carbs we eat, the lower the insulin, and the lower the insulin, the leaner we become. Cutting carbs can certainly promote weight loss (as it did for me), but when most of us go low-carb, we not only cut out the acellular processed carbs completely, we also embrace real fats and give up the crap oils. We eat bacon and fry our eggs in real butter. So I suspect the benefits are partly the result of reducing inflammation, which in turn reduces insulin resistance and perhaps leptin resistance.

To keep the benefits coming, it’s not necessary (or even advisable for many people) to stay at near-zero-carb levels permanently. For non-diabetics, I believe it’s better for overall hormonal health to mix it up, adding in some real-food starches, or cycling VLC days with higher-carb days.

To me, the golden nugget of the “alternative hypothesis” is that getting fat isn’t about calories; it’s about hormones. When our government told everyone to eat plenty of grains and cut the arterycloggingsaturatedfat!, following that advice created hormonal disruptions for many, many people. The cure is to 1) eat real, unprocessed food and 2) reduce the carbs to a level appropriate for your metabolism.


Greetings Fatheads,

Well, it's sure been an eventful year in Illinois politics, what with the veto-proof Democratic legislature and the Republican governor putting together a surprise last-minute deal for an honest-to-goodness balanced budget that will get the $100+ billion pension debt paid down over the next ten years, AND address the unfunded state retiree health benefit obligations ($56 B), while knocking down the $5+ billion backlog of bills to vendors dating back over a year now, and simultaneously restoring state services to the indigent, and even finally opening our state museum and public parks again.

PSYCH!

HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA! HA!

Man, if you could see the look on your face! Sometimes, I just crack myself up.

Actually the unfunded pension liability rose over $6 billion last year to over $111 billion (in a record up market), retiree health beneficiaries are one year closer to insolvency, and state vendors (including social service NFPs) are still registering red on the “How Screwed Are We?” meter, but at least according to the budget — …

Oh wait, there is no budget.

I don't mean a budget for this year. I mean the fiscal year 2016 budget, which started July 1, 2015 and is ending in less than two months. They haven't finished passing a budget for that. It's not looking so good for fiscal 2017 either.

Not to worry — welfare checks and state worker checks (including for the legislators who haven't passed a law to pay anything) are still going out. Just not checks for you if you, say, sold the state some office supplies, rent a building to it, or provide care to the mentally disabled. Little stuff like that.

You would be forgiven for thinking that our elected officials, who are demonstrably incapable of discharging even their most basic, simple tasks, are just absolutely useless. You couldn’t be more wrong — they’re much worse than useless.

They may not be able to do things like pass a budget and allocate funds for things like taking care of poor people, funding schools, building roads, and sundry other basics that even libertarians like me understand people now want government to do (not agree, of course, but understand); but that doesn’t mean they aren’t busy.

Sorry. I know I didn’t give you a “Politics!” trigger warning, but that’s not the real point of this post. Here’s the point:

As I confidently predicted here and reiterated here, the bureaucrats have completed their inevitable march to addressing one of the most dangerous health scourges facing our nation…

… yes, after three years, the $100,000-a-year, state-employed lick-spittle turds who are being funded by the USDA to get raw milk out of the market apparently wore down the mom-and-pop operators, who had to take time off (lose income) every time the regulators (re-)proposed new regulations.

Remember kids — regulators never get you with brains, competence, or results. They always win by exhaustion.

As elaborated in my prior posts, they can’t just make raw milk illegal. When they want to take away something the Bigs (Ag, Pharma, Banking, or in this case Milk) don’t want to have to compete with, they just regulate you to death.

[Here’s the short version if you didn’t read those previous posts:

“after over a hundred people showed up to politely but loudly protest the state’s heavy-handed actions, I noted:

‘I’ve heard from a couple of folks who think the regulators got an education on raw milk… Maybe the bureaucrats would change things up substantially.  Maybe even remove impediments to raw milk while setting a few common-sense protocols, as it fits in with the buy local/real foods programs the state and others talk up.’

Feeling I had a better understanding of bureaucratic sausage-making than those good, honest people, I ended with…

‘I’m guessing they’ll lay low for a few months or more, and then pass pretty much all of those rules as is, maybe without the 100 gallon limit.  Or maybe they’ll bump the limit to 500 gallons.  But they didn’t learn anything, and they’re there to pass those rules.’

It’s what they do.]

The first posts were after a 2013 hearing. The followup was from 2014. Our betters had to lay in the weeds for over another year, but then they did exactly what I said they’d do. It’s like Gravity.

Right again. Dammit.

So starting in July, when I go to Linda’s farm — where I can always walk around and see the cows my milk comes from, and see the operation, and walk through the barn she milks in, there will be a few other things in place.

For my protection, of course.

Like, she'll have to get a permit from the insolvent Illinois government. But first, she'll have to pass an inspection by the incompetent Illinois government. She'll have to take samples and pay a lab to test the milk for a few weeks to get the permit, then do regular ongoing tests. Any day anyone buys milk, she'll have to store a sample of that milk for two weeks. If the department doesn't like the way her barn looks, they can shut her down until she makes it look nice to them and they re-inspect her. Getting an inspection rescheduled could be difficult, as the state doesn't have a budget, so it can't hire more inspectors; and even if it could, it doesn't have any money to pay them.

[They can also shut her down if one of her free-ranging egg chickens walks through the milk barn. Hey, it sounds harsh, but you have to be cautious about the whole “avian flu” thing that used to wipe out whole geographic areas of birds and spread disease until we started safely housing hundreds of thousands of chickens in legal, government-approved and inspected warehouses; cutting their beaks off; and force-feeding them antibiotics. Hmmm, I may have that backwards.]

Every time I buy a gallon of her delicious “creamy milk” (as The Grandkids call it), she’ll have to write my name, address, and phone number in a log that she has to keep for six months and make available to the egregiously misnamed Department of Public Health. She’ll have to have a placard up (in letters at least 2 inches high) that states:

“Warning: Milk that is not pasteurized is sold or distributed here. This dairy farm is not inspected routinely by the Illinois Department of Public Health.”

Wooooooo. Scary. It’s supposed to be, anyway.

Also, she’ll have to provide me with “Department-approved consumer awareness information.” It will say things like:

“WARNING: This product has not been pasteurized and, therefore, may contain pathogens that cause serious illness, especially in children, the elderly, women who are pregnant and persons with weakened immune systems.”

Plus, it’s now illegal for any raw milk producer to sell yogurt or cheese made with their raw milk, even if they pasteurize it as part of the process. Wouldn’t want any of these folks being able to earn a value-added premium for their products.

One of the last items in the new reg states that the Department can suspend or revoke the dairy farm permit whenever:

“the Department has reason to believe that a public hazard exists”

So since “the Department” is being funded by the USDA, and the USDA’s position is that there is absolutely no such thing as a safe glass of raw milk, somewhere down the line, you can bet “the Department” will determine that they have reason to believe that anyone producing and selling raw milk constitutes a public hazard.

I’ll say it again,

“It’s what they do.”

I feel so much safer.

Tom should be back next week, hopefully with highlights of the Low Carb Cruise. Thanks for stopping by.

Cheers!

The Older Brother

Share

Comments 26 Comments »

Hiya, Fat Heads!

Been awhile since I’ve got to sit in The Big Chair — trying to remember what all these buttons do.

As Tom mentioned, The Middle Son and his amazing girlfriend told The Wife and me a couple of months ago that they were going to get married. We were thrilled. Then they told us where they wanted to get married. Here’s a hint from this post from about a year ago:

I’d been adamant for the last several years that I wasn’t coming back. Don’t get me wrong, I love it here. House facing the Gulf (we actually have two houses this time to accommodate all 15 people), The Wife and I doing most of the cooking, everyone else doing most of the cleaning, hanging out on the beach, watching the shrimp boats go out with the dolphins trolling behind them for the freebies that fall out of the nets.

It’s just that we’ve done it several times and I was done. I kept arguing that I didn’t want to have a one-destination bucket list. This year, The Wife pointed out that this would be the first time The Grandkids would be able to come, too, and wouldn’t it be great to see them at the ocean for the first time.

n.b., folks — there’s no actual defense against that one.

Yep. Back to Dauphin Island. Turns out there are other things besides “The Grandkids’ first time” that there’s no defense against. It’s becoming a family joke. One of the folks I work with suggested maybe I should look into buying a burial plot down there, since that seems to be where I always end up anyway.

It will be a great and joyous time, and it’s coming up fast — the end of this month. Tom and Chareva and their girls are coming, lots of the rest of the family, a few good friends — around forty people or so at last count.

And I’m never going back. This time I mean it (Ha!).

As Tom also mentioned, my responsibilities in preparing for the occasion essentially consist of showing up. This is an approach I mastered early on, and every semester I urge the young men in the Economics class where I am a guest speaker to adopt it. The key, as I serendipitously discovered with The Wife (who was at the time The Fiancee), is to take a job about 700 miles away shortly after you’ve bamboozled your betrothed into accepting your proposal. Then you essentially can’t be involved in any of the decision-making for the wedding – photographer, venue, dresses, tuxes, food, entertainment, etc., etc., etc.

But, as I explain to them, “guess what — YOU DON’T GET TO MAKE ANY OF THOSE DECISIONS, ANYWAY, because it’s not your day. It’s hers!”

You get the exact same amount of decision-making power, but you don’t get dragged all over to various vendors, shops, and venues, and then have to give your opinion before being told the correct answer. You just have to fly in a couple of days ahead of the wedding, get your tux fitted, do the bachelor party, then show up for the wedding.

It’s a beautiful system. Pass it on.

Anyway, it’s gotten to the point where Spring looks like it may stick around now, and I took a trip out to Linda’s farm last week and thought I’d share some pics. I’ve been dropping in once in a while to get some eggs, but things just seemed to pop into full season this past week. Here’s the front pasture, really greening up now.

Linda and her sister Kim took the “pick up the old grocery store produce once in a while and compost it” approach we were doing and really got serious about it. Here’s the current work area, which should be next year’s compost…

… and here’s part of this year’s compost from their efforts last season. There’s another three or four mounds this size off to the side. Black Gold!

Linda’s hedge trimmers/weed eaters have had their annual maintenance and are all primed up for the season.

Here’s Tartar, our cow who’s now given us our third calf after getting out of the “freezer” and into the “breeder” column by surprising us with her first calf a couple of winters ago.

Here’s this year’s calf. It’s a heifer, and Linda named her “Tofu.” She got a name because we’re planning on keeping her as a breeder, too. The Oldest Son has been wanting to get in on a share of a cow, and this will give us two breeders for four families (1/2 a cow each per year, hopefully) instead of three families splitting one cow a year.

Here’s last year’s bull, who will be heading to the freezer in late fall after getting to spend the Spring and Summer on pasture.

Linda’s second set of “bacon” is also coming along nicely.

After three months of maybe being able to get a couple of dozen eggs every other week or so, Linda’s egg layers are in full production mode. I’ve been getting 6 or more dozen a week, and she’s got other customers.

Our next batch of 100 day-old Freedom Ranger chicks arrived via Post Office the first week of April, so these guys have about another week in the coop/brooder until they get moved into the “tractors” on the pasture, where Linda moves them daily and they can get sunshine, organic feed, bugs, new grass and fresh water every day, and generally “express their chicken-ness” until mid-summer. Then The Oldest Son and I show up, bring the Whiz-Bang Chicken Plucker out of the barn, and start re-stocking the freezer.

Finally, we’re on the verge of being able to get real milk again. A couple of Linda’s milk cows calved recently, and will have “extra” pretty soon. This one should be having her calf any minute!

So, Spring is finally here and we’re looking forward to this year’s supply of beef, pork, chicken, eggs, and milk — knowing and respecting where every bite and drop came from.

Cheers!

The Older Brother

Share

Comments 10 Comments »

I’m almost embarrassed by the number of people in cyberspace who refer to Fat Head Pizza.  Yeah, it’s a delicious low-carb pizza crust that tastes like real pizza crust, but I had nothing to do with it.  I didn’t even write the post with the recipe.  The Older Brother’s Oldest Son wrote it up after modifying a recipe he found at Cooky’s Creations, then the Older Brother posted it.

That post (which you can view here if you missed it) still draws views and comments — 359 of them at last count.  It may be the most-viewed post on the entire blog … and like I said, I didn’t even write it.

Now Fat Head Pizza has been re-purposed as Fat Head Crackers.  My embarrassment continues.  If I keep getting credit for things I didn’t create, I’ll have to start hanging out with Al Gore.

Anyway, the crackers version of the pizza crust appeared here on a recipe (and other stuff) site called Ditch The Carbs.  When I was alerted to the crackers, I went to the site and poked around.  There are lots of excellent no-sugar, no-grain recipes there (not attributed to Fat Head), so I thought I’d mention it.

And now, since it’s 66 degrees outside (a mere week after we were sledding down our back hill), I need to step outside for a round of disc golf.  Enjoy your weekend.

Share

Comments 57 Comments »

Duck Dodgers (who posts comments here now and then) wrote a long post on the Free the Animal blog titled How Wheat Went From Superfood To Liability.

Don’t worry; he’s not encouraging you to toddle down to the Olive Garden for a bowl of pasta and stop for some (ahem) “whole-wheat” bread on the way home. His point, as briefly as I can state it, is that ancient wheat was a nourishing food — which we turned into garbage through modern milling and refining.

I enjoy Duck’s Free the Animal guest posts because he fires arrows at the sacred cows of paleo and low-carb.

What?! You enjoy that?!

Yes, I do. We don’t learn in an echo chamber. We learn by being challenged, and by being willing to change our minds. At one time, I believed all the horsehocky about saturated fat clogging our arteries, red meat causing cancer, etc. I changed my mind because people challenged my beliefs. Thank goodness they did.

I encourage you to read the entire post. Go ahead, I’ll wait …

Okay, with that out of the way (and in case you didn’t read the post), I’ll pluck some quotes and add my own comments. As you’ll see, I think Duck makes some excellent points, but I’m still not persuaded ancient wheat was a superfood.

So, how did cultures regard wheat and whole grains before the industrial revolution? According to the historical literature, wheat was not some kind of sub-par caloric filler or cheap energy. Every culture had its superfood and wheat was, hands down, the superfood of Western civilization. Whole wheat is not just calories and nutrients. It contains all sorts of phenolics, carotenoids, sterols, β-glucan, resistant starch, inulin, oligosaccharides, lignans, and other phytonutrients. Much of the health benefits of wheat are believed to come from these phytonutrients.

Economist Thomas Sowell once said that when his students declared this or that to be good or bad, his next question was: compared to what?

Duck makes a convincing case that ancient wheat was far better than the refined garbage people eat today. But was a wheat-based diet healthy compared to a hunter-gatherer diet?

Anthropologist Jared Diamond famously called the switch to agriculture the worst mistake in the history of the human race, based largely on observations of human remains.  Some quotes from his article in Discover:

In some lucky situations, the paleopathologist has almost as much material to study as a pathologist today. For example, archaeologists in the Chilean deserts found well preserved mummies whose medical conditions at time of death could be determined by autopsy. And feces of long-dead Indians who lived in dry caves in Nevada remain sufficiently well preserved to be examined for hookworm and other parasites.

Usually the only human remains available for study are skeletons, but they permit a surprising number of deductions. To begin with, a skeleton reveals its owner’s sex, weight, and approximate age. In the few cases where there are many skeletons, one can construct mortality tables like the ones life insurance companies use to calculate expected life span and risk of death at any given age. Paleopathologists can also calculate growth rates by measuring bones of people of different ages, examine teeth for enamel defects (signs of childhood malnutrition), and recognize scars left on bones by anemia, tuberculosis, leprosy, and other diseases.

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5′ 9” for men, 5′ 5” for women. With the adoption of agriculture, height crashed, and by 3000 B.C. had reached a low of only 5′ 3” for men, 5′ for women.

At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor.

A six-inch crash in height, with a rise in dental defects and infectious diseases (bearing in mind that the Dickson Mounds natives were growing maize, not wheat).  Other anthropologists have made similar observations.  When we took up farming, our health declined.

To be clear, Diamond doesn’t argue that grains induced those problems directly. He writes that perhaps when humans became farmers, the crops squeezed out a more varied and nutrient-dense hunter-gatherer diet, leading to malnutrition.  But it’s clear that switching from a hunter-gatherer diet to a grain-based agricultural diet didn’t make us taller or healthier. Quite the opposite.

Back to Duck’s post:

Hippocrates, the father of Western medicine, not only recommended bread as a health-promoting staple, but he was keenly interested in experimenting with different preparations of wheat.

If wheat was so deleterious, you’d think that Hippocrates would have noticed it and warned against its consumption instead of recommending it for the prevention of disease.

Hippocrates was not alone. Avicenna recommended bread as a key staple of the diet. Paracelsus believed that wheat had mystical properties, and Aristotle thought foods made from wheat suits our bodies best. And, what we see over and over again in the historical literature is that wheat was once considered to be the most nutritious and most important edible plant in the entire vegetable kingdom. Bread was known as the Staff of life—it was the de facto superfood for agriculturalists.

Setting aside the appeal to authority, I’d ask the Sowell question again: compared to what? If Hippocrates was getting good results with his patients by having them substitute wheat for pork and green vegetables, then I’d say he was onto something. But we don’t seem to have that information. Maybe the wheat replaced swill.

Much of Duck’s post quotes doctors from previous centuries who recommended wheat as a health food. Okay, fair enough. That’s interesting at the very least.  But given how often established medical opinion has turned out to be wrong over the centuries, I wouldn’t consider it solid evidence that ancient wheat was a superfood and didn’t cause health problems.

In both of his Wheat Belly books, Dr. William Davis blames the gliadin portion of gluten for causing many, if not most, of what he considers to be wheat’s deleterious effects. The ability of gliadin to increase gut permeability has been well established in recent years and is not, as far as I know, controversial. (If you Google “gliadin intestinal permeability,” you can read from now until you retire.)

Duck’s main point in his post is that milling and refining wheat turned it into health-sapping garbage. I agree wholeheartedly. But unless ancient wheat didn’t contain gliadin or we were somehow protected against the effects on gut permeability, I suspect wheat has always had the ability to induce auto-immune reactions. Perhaps those reactions weren’t linked to wheat because everyone ate the stuff.

I’m reminded of something I read in The Emperor of All Maladies, a hefty book about the history of cancer: when a doctor first floated the idea that smoking causes lung cancer, the vast majority of other doctors and researchers scoffed. They continued scoffing for years.  As the author (an oncologist) explains, it’s been historically difficult for doctors to accept that something causes a disease if 1) nearly everyone is exposed to it, and 2) most of them never develop the disease.

At one time, nearly everyone smoked. Doctors smoked. The banker smoked.  Your neighbor smoked.  Your in-laws smoked.  It was considered normal behavior. Heck, everyone does it, and few of them develop cancer, so it can’t be the smoking. Move along, let’s find the real cause.

When reading that passage, I thought, Hmm, just like with wheat. Everyone eats wheat, so it can’t be bad for us.

At a dinner some years ago, a friend I hadn’t seen in ages asked why I was skipping the bread and pasta. When I told him, he was incredulous. What?! How can wheat possibly be bad for us? Almost everyone eats wheat! People have been eating wheat since biblical times!

Well, yes. But from what I remember of the Bible, healing the sick was one of the real crowd-pleasing portions of the Jesus show.

True, we’ve been eating wheat for as long as we’ve been civilized. We’ve also had diabetes, cancer, heart disease, psoriasis, asthma, arthritis and schizophrenia for as long as we’ve been civilized. Wheat may have caused or contributed to all of them – even if, as with smoking and lung cancer, no single one of those diseases afflicted most people.

Back to Duck:

In his book, Wheat Belly: Lose the Wheat, Lose the Weight, and Find Your Path Back to Health, Dr. William Davis claimed that modern hybrids of wheat are to blame for all modern health issues. However, this is not supported by the scientific literature—nor is it supported by France’s lower levels of chronic diseases despite considerably higher wheat intakes.

Ahh, those wacky French. Truth is, I’m not sure what to make of them. They’re twice as likely to smoke as Americans, but have lower rates of heart disease … yet I wouldn’t cite them as proof that smoking doesn’t cause heart disease. I suspect that the American diet of HFCS, refined flour and industrial seed oils creates a perfect storm for inducing disease, which the French avoid by shunning the HFCS and seed oils and embracing natural animal fats.  They might still be better off without the wheat.

Or perhaps someday we’ll learn that the French are healthier than us because spending an hour with your mistress before heading home for dinner with the wife and kids prevents nearly all chronic diseases. Chareva disagrees with that hypothesis and offered evidence that anyone who tests it will end up sleeping in a chicken coop.

Duck’s hypothesis is more interesting, despite not involving mistresses:

By 1953, Newfoundland had enacted mandatory fortification of white flour. By 1954, Canada and a number of US states had enacted the Newfoundland Law. Southern states in particular were eager to enact the law, to reduce pellagra, that had become prevalent during the Great Depression. These states typically mandated fortification of flour, bread, pasta, rice and corn grits.

In 1983, the FDA significantly increased the mandated fortification levels—coinciding with the beginning of the obesity epidemic. 1994 was the first year that obesity and diabetes statistics were available for all 50 states. Notice a pattern?

Fortifying flour may have ended the deficiencies of the Great Depression, but it appears to have significantly worsened chronic diseases.

Furthermore, wheat flour fortification may explain the popularity of non-celiac gluten sensitivity we see today in fortified countries (it was extremely rare prior to fortification). As it turns out, iron fortificants have been shown to promote significant gastric distress, even at low doses and pathogenic gut profiles in developing countries. Non-celiac gluten sensitivity is virtually unheard of in unfortified countries, like France, which consume 40% more wheat than Americans.

That’s the most eye-opening section of the post as far as I’m concerned. Before reading the brief history that Duck cites here, it never occurred to me that fortifying grain could make it worse. If gliadin didn’t cause gut permeability back in the day (still a big IF in my book), that could be the explanation.

As far as modern wheat goes, I’ve said this before, and I’ll say it again: Norman Borlaug, who was awarded the Nobel Prize for his part in developing semi-dwarf wheat, was a good man.  He set out to prevent mass starvation, and he succeeded. Given a choice between semi-dwarf wheat or watching my kids die of starvation, I’ll take the wheat every damned time.

That being said, I still believe semi-dwarf wheat is something those of us who aren’t starving should avoid. Duck makes a good case that milling, refining and fortifying wheat turned it into a health hazard. But the changes in semi-dwarf wheat likely threw gasoline on that fire. Here’s a quote from Wheat Belly Total Health:

One important change that has emerged over the past 50 years, for example, is increased expression of a gene called Glia-α9, which yields a gliadin protein that is the most potent trigger for celiac disease. While the Glia-α9 gene was absent from most strains of wheat from the early 20th century, it is now present in nearly all modern varieties.

Now let’s mill it, refine it, and fortify it. Awesome.

Dr. Davis believes the change in the gliadin gene is the reason celiac disease has increased by 400% in the past 50 years — and that’s a genuine increase, by the way, not a case of better diagnosis.  Researchers realized as much when they compared blood samples from 50 years ago to recent blood samples.  The modern samples were four times as likely to contain antibodies triggered by celiac disease.

Duck, on the other hand, believes fortification is the likely culprit.  It’s an interesting possibility.

Back to Duck:

Nor does Dr. David Perlmutter’s book, Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar–Your Brain’s Silent Killers, explain how humanity enjoyed its highest levels of intellectual achievement while largely eating wheat and other grains as staple foods—enjoying unprecedented population growth and longevity as well.

I can explain that one. In a previous post, I mentioned Conquests and Cultures, by Thomas Sowell. One of the book’s main points is that economic specialization is required for cultures to advance. If pretty much everyone has to hunt and gather food, there will be no pianos, printing presses, telescopes or steam engines. There’s no doubt that agriculture led to economic specialization, and thus civilization and intellectual achievement.

But that doesn’t prove eating grains had a positive or even a neutral effect on our brains. It simply means that in a civilization where farming allows most people to do something else, Mozart becomes a composer and Voltaire becomes a writer. In a paleo society, Mozart is the hunter who sings those amazing songs around the campfire, and Voltaire is the hunter whose clever stories amuse his pals during the long walks home from a hunt. They may have had genius IQs, but we’ll never know. We do know that human brains have, in fact, been shrinking since their peak size roughly 20,000 years ago.

Another point Sowell makes in Conquests and Cultures is that civilizations advance through cross-pollination of ideas, technologies and resources. Throughout history, cross-pollination was often the result of large-scale conquest. (Sowell doesn’t ignore or excuse the brutality of conquest, by the way.)  Conquering an inhabited territory requires a large army (another example of economic specialization), which requires a large population, which requires agriculture.

In Europe and the Middle East, the “crop of conquest” was wheat. In the Western Hemisphere, it was maize that enabled the Aztecs and Mayans to build cities and raise armies large enough to establish empires. But again, that doesn’t prove the conquerors were healthier or smarter than the tribes they subjugated. It only proves that farming enabled them to raise and feed large armies.

Okay, time to wrap up. This is already a long post about a long post. To summarize:

Duck believes ancient wheat was a nutritious food, not a health hazard. Maybe, but I remain skeptical. Maybe ancient wheat was good, maybe it was neutral, maybe it was bad but not nearly as bad as the stuff sold today.  I still think it’s likely wheat has been provoking auto-immune reactions in susceptible people since the dawn of civilization.

But whether wheat was good or bad back in ancient times, the refined and fortified garbage sold today is a health hazard. On that we totally agree.  So unless you want to go out and find some ancient wheat (which Duck explains how to do in his post) and give it a try, my advice remains the same:

Don’t Eat Wheat.

Share

Comments 132 Comments »