Hiya, FatHeads!

Thought I’d post a followup report and let Tom focus on the book over the holiday weekend.

As I reported here, after reading “Born to Run” last year, I got interested in the idea of people being designed to run. Even old, fat people. So, with the encouragement of a couple of my coworkers,

(these ladies:)

I signed up for this year’s Abe’s Army training program, which consists of weekly organized small group runs with experienced runners, along with some personal miles logged, culminating 13 weeks later in participation in the 10k Abe’s Amble, which is run the last Sunday of the Illinois State Fair.  The race starts inside the fairgrounds at 7:30 am, heads out of the fair, through  nearby Lincoln Park, out the back of the park, through the (hilly) cemetery where Lincoln is buried, then back. BTW, for us non-metric types, 10k is 6.2 miles.

I missed a couple of the long group runs over the last couple of weeks, but kept up some running on my own, and I’d also started pedaling the 2 miles to the office every day, so I felt like I was ready.

As an added bonus, it turned out that whoever organized the race this year must have some MAJOR contacts somewhere, because Saturday night Central Illinois broke out of a weeks-long string of 90-100 degree weather and we ended up with 65 degrees and overcast for the start of the race.

Here’s most of our group (Blue 2) just before the race started.

I’m the bright yellow one in the back with the funny “running shoes.”

About those — I’d been doing my personal runs in the Huaraches all along, but had been doing the group runs in a pair of Lems shoes, which are zero-drop, barefoot shoes that look like regular running shoes, just to blend in a bit (the ones in the pic at the top of this post). I wore the Huaraches to the last practice run (3 miles), and when I walked up, the trainers looked at my feet, then up at me, and said “so you’re not running tonight?” I explained that the Huaraches were structurally no different from the ones I’d been wearing to the group runs. They were interested, asked about injuries, etc., but were very cool about it.

When we got to one of the water stops set up along the training course, someone from another group who’d seen my footgear came up and asked, “how are your feet feeling in those?” I said “great – I’ve been running this way all along.” She said she hadn’t thought people could run like that. I replied that “really, we spent thousands of years being designed to run like this.” She said yes, that made sense, but “I see too many people with ankle and knee problems” (I believe she’s in the medical arena); to which I replied, “and I bet they all wear running shoes, right?” She smiled a bit at that.

Anyway, my goal all along had been to run the race in the Huaraches, and the last practice run showed me that it wouldn’t be a problem.

So off we all went — the Abe’s Army program had around 150 people, but there were nearly 650 participants for the Amble. I ran with a buddy from my group (the guy on the left in the group pic), and we decided to keep using our training protocol of 5:1 intervals for the race — run 5 minutes, walk 1 minute, repeat, until you cross the finish line.

We moved towards the back third of the pack at the starting line so we wouldn’t be in the way of the real competitors, but would still be ahead of the walkers and dedicated slowpokes. Here’s me as I get past the starting gate…

(I don’t really have to go to the bathroom — that’s just the way my shorts bunched up!)

At any rate, I was able to maintain a blistering 13:10 min/mile pace (1:21:41.4 final time). I even had a bit of gas left in the tank for the finish and sprinted the last 100 yards. Of course, many people would mistake my sprinting for a “strenuous jog,” but I still felt really good about it — way better than it looks like I felt:


In the final standings, I whipped 84 of the other folks’ butts (including most, but not all of the Olympic walkers and almost everyone over 70), and had the other 559 in front of me looking over their shoulders.

Well, maybe not all of them. The mutant who won (this guy, Bryan Glass:)


(5:21 min/mile; 33:09.4 MINUTES) blasted past my buddy and me going the other way when we were approaching the 2 mile mark, so he’d already covered over 4 miles. He didn’t have to look over his shoulder — he could’ve seen me coming from two miles away!

Actually, calling Mr. Glass a mutant is a disservice. I’m sure he’s got a good set of genes for running, but nobody can do that without training and focus beyond my imagination. He probably could catch his dinner à la “Born to Run.”

Me, I’m not selling my guns yet.

Four minutes behind him (and 44 minutes in front of me) was the first woman over the line. One of the interesting points in “Born to Run” was that the longer the distance, the closer women are to matching men.


It was a great experience, and it’s fired up my motivation to keep my activity level elevated. My running buddy and I are going to keep doing weekly runs; we’ve signed up for a 2 mile moonlight fun run/4 mile trail bike race in a couple of weeks; I ran 5k last weekend on vacation in Apple Canyon, IL (ALL hills!); I’m back doing resistance training once a week for the first time since my knee surgery last year; I’m biking to work; and I’m thinking of trying some swimming in the mornings at the local public indoor pool.

And besides all that, I got one of those “thanks for taking part” ribbons like Tom mentioned in his last Farm Report!


Icing on the cake, baby. Icing on the cake.

Cheers!

The Older Brother


Duck Dodgers (who posts comments here now and then) wrote a long post on the Free the Animal blog titled How Wheat Went From Superfood To Liability.

Don’t worry; he’s not encouraging you to toddle down to the Olive Garden for a bowl of pasta and stop for some (ahem) “whole-wheat” bread on the way home. His point, as briefly as I can state it, is that ancient wheat was a nourishing food — which we turned into garbage through modern milling and refining.

I enjoy Duck’s Free the Animal guest posts because he fires arrows at the sacred cows of paleo and low-carb.

What?! You enjoy that?!

Yes, I do. We don’t learn in an echo chamber. We learn by being challenged, and by being willing to change our minds. At one time, I believed all the horsehocky about saturated fat clogging our arteries, red meat causing cancer, etc. I changed my mind because people challenged my beliefs. Thank goodness they did.

I encourage you to read the entire post. Go ahead, I’ll wait …

Okay, with that out of the way (and in case you didn’t read the post), I’ll pluck some quotes and add my own comments. As you’ll see, I think Duck makes some excellent points, but I’m still not persuaded ancient wheat was a superfood.

So, how did cultures regard wheat and whole grains before the industrial revolution? According to the historical literature, wheat was not some kind of sub-par caloric filler or cheap energy. Every culture had its superfood and wheat was, hands down, the superfood of Western civilization. Whole wheat is not just calories and nutrients. It contains all sorts of phenolics, carotenoids, sterols, β-glucan, resistant starch, inulin, oligosaccharides, lignans, and other phytonutrients. Much of the health benefits of wheat are believed to come from these phytonutrients.

Economist Thomas Sowell once said that when his students declared this or that to be good or bad, his next question was: compared to what?

Duck makes a convincing case that ancient wheat was far better than the refined garbage people eat today. But was a wheat-based diet healthy compared to a hunter-gatherer diet?

Anthropologist Jared Diamond famously called the switch to agriculture the worst mistake in the history of the human race, based largely on observations of human remains.  Some quotes from his article in Discover:

In some lucky situations, the paleopathologist has almost as much material to study as a pathologist today. For example, archaeologists in the Chilean deserts found well preserved mummies whose medical conditions at time of death could be determined by autopsy. And feces of long-dead Indians who lived in dry caves in Nevada remain sufficiently well preserved to be examined for hookworm and other parasites.

Usually the only human remains available for study are skeletons, but they permit a surprising number of deductions. To begin with, a skeleton reveals its owner’s sex, weight, and approximate age. In the few cases where there are many skeletons, one can construct mortality tables like the ones life insurance companies use to calculate expected life span and risk of death at any given age. Paleopathologists can also calculate growth rates by measuring bones of people of different ages, examine teeth for enamel defects (signs of childhood malnutrition), and recognize scars left on bones by anemia, tuberculosis, leprosy, and other diseases.

One straightforward example of what paleopathologists have learned from skeletons concerns historical changes in height. Skeletons from Greece and Turkey show that the average height of hunter-gatherers toward the end of the ice ages was a generous 5′ 9” for men, 5′ 5” for women. With the adoption of agriculture, height crashed, and by 3000 B. C. had reached a low of only 5′ 3” for men, 5′ for women.

At Dickson Mounds, located near the confluence of the Spoon and Illinois rivers, archaeologists have excavated some 800 skeletons that paint a picture of the health changes that occurred when a hunter-gatherer culture gave way to intensive maize farming around A. D. 1150. Studies by George Armelagos and his colleagues then at the University of Massachusetts show these early farmers paid a price for their new-found livelihood. Compared to the hunter-gatherers who preceded them, the farmers had a nearly 50 per cent increase in enamel defects indicative of malnutrition, a fourfold increase in iron-deficiency anemia (evidenced by a bone condition called porotic hyperostosis), a threefold rise in bone lesions reflecting infectious disease in general, and an increase in degenerative conditions of the spine, probably reflecting a lot of hard physical labor.

A six-inch crash in height, with a rise in dental defects and infectious diseases (bearing in mind that the Dickson Mounds natives were growing maize, not wheat).  Other anthropologists have made similar observations.  When we took up farming, our health declined.

To be clear, Diamond doesn’t argue that grains induced those problems directly. He writes that perhaps when humans became farmers, the crops squeezed out a more varied and nutrient-dense hunter-gatherer diet, leading to malnutrition.  But it’s clear that switching from a hunter-gatherer diet to a grain-based agricultural diet didn’t make us taller or healthier. Quite the opposite.

Back to Duck’s post:

Hippocrates, the father of Western medicine, not only recommended bread as a health-promoting staple, but he was keenly interested in experimenting with different preparations of wheat.

If wheat was so deleterious, you’d think that Hippocrates would have noticed it and warned against its consumption instead of recommending it for the prevention of disease.

Hippocrates was not alone. Avicenna recommended bread as a key staple of the diet. Paracelsus believed that wheat had mystical properties, and Aristotle thought foods made from wheat suits our bodies best. And, what we see over and over again in the historical literature is that wheat was once considered to be the most nutritious and most important edible plant in the entire vegetable kingdom. Bread was known as the Staff of life—it was the de facto superfood for agriculturalists.

Setting aside the appeal to authority, I’d ask the Sowell question again: compared to what? If Hippocrates was getting good results with his patients by having them substitute wheat for pork and green vegetables, then I’d say he was onto something. But we don’t seem to have that information. Maybe the wheat replaced swill.

Much of Duck’s post quotes doctors from previous centuries who recommended wheat as a health food. Okay, fair enough. That’s interesting at the very least.  But given how often established medical opinion has turned out to be wrong over the centuries, I wouldn’t consider it solid evidence that ancient wheat was a superfood and didn’t cause health problems.

In both of his Wheat Belly books, Dr. William Davis blames the gliadin portion of gluten for causing many, if not most, of what he considers to be wheat’s deleterious effects. The ability of gliadin to increase gut permeability has been well established in recent years and is not, as far as I know, controversial. (If you Google “gliadin intestinal permeability,” you can read from now until you retire.)

Duck’s main point in his post is that milling and refining wheat turned it into health-sapping garbage. I agree wholeheartedly. But unless ancient wheat didn’t contain gliadin or we were somehow protected against the effects on gut permeability, I suspect wheat has always had the ability to induce auto-immune reactions. Perhaps those reactions weren’t linked to wheat because everyone ate the stuff.

I’m reminded of something I read in The Emperor of All Maladies, a hefty book about the history of cancer: when a doctor first floated the idea that smoking causes lung cancer, the vast majority of other doctors and researchers scoffed. They continued scoffing for years.  As the author (an oncologist) explains, it’s been historically difficult for doctors to accept that something causes a disease if 1) nearly everyone is exposed to it, and 2) most of them never develop the disease.

At one time, nearly everyone smoked. Doctors smoked. The banker smoked.  Your neighbor smoked.  Your in-laws smoked.  It was considered normal behavior. Heck, everyone does it, and few of them develop cancer, so it can’t be the smoking. Move along, let’s find the real cause.

When reading that passage, I thought, Hmm, just like with wheat. Everyone eats wheat, so it can’t be bad for us.

At a dinner some years ago, a friend I hadn’t seen in ages asked why I was skipping the bread and pasta. When I told him, he was incredulous. What?! How can wheat possibly be bad for us? Almost everyone eats wheat! People have been eating wheat since biblical times!

Well, yes. But from what I remember of the Bible, healing the sick was one of the real crowd-pleasing portions of the Jesus show.

True, we’ve been eating wheat for as long as we’ve been civilized. We’ve also had diabetes, cancer, heart disease, psoriasis, asthma, arthritis and schizophrenia for as long as we’ve been civilized. Wheat may have caused or contributed to all of them – even if, as with smoking and lung cancer, no single one of those diseases afflicted most people.

Back to Duck:

In his book, Wheat Belly: Lose the Wheat, Lose the Weight, and Find Your Path Back to Health, Dr. William Davis claimed that modern hybrids of wheat are to blame for all modern health issues. However, this is not supported by the scientific literature—nor is it supported by France’s lower levels of chronic diseases despite considerably higher wheat intakes.

Ahh, those wacky French. Truth is, I’m not sure what to make of them. They’re twice as likely to smoke as Americans, but have lower rates of heart disease … yet I wouldn’t cite them as proof that smoking doesn’t cause heart disease. I suspect that the American diet of HFCS, refined flour and industrial seed oils creates a perfect storm for inducing disease, which the French avoid by shunning the HFCS and seed oils and embracing natural animal fats.  They might still be better off without the wheat.

Or perhaps someday we’ll learn that the French are healthier than us because spending an hour with your mistress before heading home for dinner with the wife and kids prevents nearly all chronic diseases. Chareva disagrees with that hypothesis and offered evidence that anyone who tests it will end up sleeping in a chicken coop.

Duck’s hypothesis is more interesting, despite not involving mistresses:

By 1953, Newfoundland had enacted mandatory fortification of white flour. By 1954, Canada and a number of US states had enacted the Newfoundland Law. Southern states in particular were eager to enact the law, to reduce pellagra, that had become prevalent during the Great Depression. These states typically mandated fortification of flour, bread, pasta, rice and corn grits.

In 1983, the FDA significantly increased the mandated fortification levels—coinciding with the beginning of the obesity epidemic. 1994 was the first year that obesity and diabetes statistics were available for all 50 states. Notice a pattern?

Fortifying flour may have ended the deficiencies of the Great Depression, but it appears to have significantly worsened chronic diseases.

Furthermore, wheat flour fortification may explain the popularity of non-celiac gluten sensitivity we see today in fortified countries (it was extremely rare prior to fortification). As it turns out, iron fortificants have been shown to promote significant gastric distress, even at low doses and pathogenic gut profiles in developing countries. Non-celiac gluten sensitivity is virtually unheard of in unfortified countries, like France, which consume 40% more wheat than Americans.

That’s the most eye-opening section of the post as far as I’m concerned. Before reading the brief history that Duck cites here, it never occurred to me that fortifying grain could make it worse. If gliadin didn’t cause gut permeability back in the day (still a big IF in my book), that could be the explanation.

As far as modern wheat goes, I’ve said this before, and I’ll say it again: Norman Borlaug, who was awarded the Nobel Prize for his part in developing semi-dwarf wheat, was a good man.  He set out to prevent mass starvation, and he succeeded. Given a choice between semi-dwarf wheat or watching my kids die of starvation, I’ll take the wheat every damned time.

That being said, I still believe semi-dwarf wheat is something those of us who aren’t starving should avoid. Duck makes a good case that milling, refining and fortifying wheat turned it into a health hazard. But the changes in semi-dwarf wheat likely threw gasoline on that fire. Here’s a quote from Wheat Belly Total Health:

One important change that has emerged over the past 50 years, for example, is increased expression of a gene called Glia-α9, which yields a gliadin protein that is the most potent trigger for celiac disease. While the Glia-α9 gene was absent from most strains of wheat from the early 20th century, it is now present in nearly all modern varieties.

Now let’s mill it, refine it, and fortify it. Awesome.

Dr. Davis believes the change in the gliadin gene is the reason celiac disease has increased by 400% in the past 50 years — and that’s a genuine increase, by the way, not a case of better diagnosis.  Researchers realized as much when they compared blood samples from 50 years ago to recent blood samples.  The modern samples were four times as likely to contain antibodies triggered by celiac disease.

Duck, on the other hand, believes fortification is the likely culprit.  It’s an interesting possibility.

Back to Duck:

Nor does Dr. David Perlmutter’s book, Grain Brain: The Surprising Truth about Wheat, Carbs, and Sugar–Your Brain’s Silent Killers, explain how humanity enjoyed its highest levels of intellectual achievement while largely eating wheat and other grains as staple foods—enjoying unprecedented population growth and longevity as well.

I can explain that one. In a previous post, I mentioned Conquests and Cultures, by Thomas Sowell. One of the book’s main points is that economic specialization is required for cultures to advance. If pretty much everyone has to hunt and gather food, there will be no pianos, printing presses, telescopes or steam engines. There’s no doubt that agriculture led to economic specialization, and thus civilization and intellectual achievement.

But that doesn’t prove eating grains had a positive or even a neutral effect on our brains. It simply means that in a civilization where farming allows most people to do something else, Mozart becomes a composer and Voltaire becomes a writer. In a paleo society, Mozart is the hunter who sings those amazing songs around the campfire, and Voltaire is the hunter whose clever stories amuse his pals during the long walks home from a hunt. They may have had genius IQs, but we’ll never know. We do know that human brains have, in fact, been shrinking since their peak size roughly 20,000 years ago.

Another point Sowell makes in Conquests and Cultures is that civilizations advance through cross-pollination of ideas, technologies and resources. Throughout history, cross-pollination was often the result of large-scale conquest. (Sowell doesn’t ignore or excuse the brutality of conquest, by the way.)  Conquering an inhabited territory requires a large army (another example of economic specialization), which requires a large population, which requires agriculture.

In Europe and the Middle East, the “crop of conquest” was wheat. In the Western Hemisphere, it was maize that enabled the Aztecs and Mayans to build cities and raise armies large enough to establish empires. But again, that doesn’t prove the conquerors were healthier or smarter than the tribes they subjugated. It only proves that farming enabled them to raise and feed large armies.

Okay, time to wrap up. This is already a long post about a long post. To summarize:

Duck believes ancient wheat was a nutritious food, not a health hazard. Maybe, but I remain skeptical. Maybe ancient wheat was good, maybe it was neutral, maybe it was bad but not nearly as bad as the stuff sold today.  I still think it’s likely wheat has been provoking auto-immune reactions in susceptible people since the dawn of civilization.

But whether wheat was good or bad back in ancient times, the refined and fortified garbage sold today is a health hazard. On that we totally agree.  So unless you want to go out and find some ancient wheat (which Duck explains how to do in his post) and give it a try, my advice remains the same:

Don’t Eat Wheat.


Interesting items from my inbox and elsewhere …

The Many Uses For Hogs

I’m a big fan of the hog (when they’re not smacking me around in a chute, that is), but I had no idea they’re this useful:

When we tuck into a bacon sandwich, few of us wonder what has happened to the other parts of the pig whose life has been sacrificed so we can enjoy a juicy breakfast.

But one inquisitive writer set out to trace where all the body parts of one porker ended up.

Christein Meindertsma, 29, said: ‘Like most people, I had little idea of what happens to a pig after it leaves the abattoir so I decided to try to find out. I approached a pig farmer friend who agreed to let me follow one of his animals.’

Identified by its yellow ear tag number, 05049, her pig trail ended with her identifying an incredible 185 different uses to which it was put – from the manufacture of sweets and shampoo, to bread, body lotion, beer and bullets.

Virtually nothing in a pig goes to waste. The snout from Pig 05049 became a deep-fried dog snack, while pig ears are sometimes used for chemical weapon testing due to their similarity to human tissue.

Tattoo artists even buy sections of pig skin to practise their craft on due to its similarity to human skin, while it is occasionally used with burns patients for the same reason.

I’m starting to feel a bit chagrined that all we got from our hogs was 500 pounds of meat. I could have been practicing to become a tattoo artist while covering myself with body lotion, drinking a beer, and firing some bullets at a loaf of bread.

Bacon Laser?

We may need to add another use for hogs to the list:

A team of Harvard scientists has paved the way for a deadly laser pig weapon by demonstrating that, with a little encouragement, pig fat cells can be made to lase.

According to MIT Technology Review, Seok Hyun Yun and Matjaž Humar stimulated spheres of fat inside porcine cells with an optical fibre, causing them to emit laser light.

And here I thought my belly was glowing because I’m happy.

Handily, pig cells contain “nearly perfectly spherical” fat balls, which are conducive to lasing by resonance when supplied with a suitable light source. The team has also cheated the effect by injecting oil droplets into other cells.

Seok Hyun Yun, lead author of the report which appears in Nature Photonics, reckons an ultimate use of his work might be to deploy “intracellular microlasers as research tools, sensors, or perhaps as part of a drug treatment”.

Drug treatment, my foot. Let’s put all the research dollars into that deadly laser pig weapon. Imagine if we have troops overseas in some future war:

“Achmed, what’s that smell coming from the American lines? Is it …?”

“Yes! I believe it’s BACON! Run! Run before they turn the pig-laser on us!”

And if would-be intruders are scared away from my house by the aroma of sausage, that’s fine by me.

Eggs With My Pig Laser, Please

In my Science For Smart People speech, I mentioned that when some researchers find a correlation in an observational study, they assume they’re looking at cause and effect. I gave the example of a meta-analysis which prompted the lead researcher to announce to the media, “The studies showed a significant increase of new onset diabetes with regular egg consumption.”

Sure sounds like cause and effect, doesn’t it?  Based on other interviews, that’s indeed what the researcher believes.  But of course, if eggs actually caused diabetes, we wouldn’t see observational studies like this one:

Men who ate more than five eggs a week had a lower risk of developing type 2 diabetes than men who ate about one a week only, according to researchers in Finland.

In a study with an average follow-up of almost 20 years with 2,332 participants, researchers noticed that those in the highest quartile for egg intake had a lower risk of developing diabetes than those in the lowest quartile when cholesterol and other factors were controlled for.

Yunsheng Ma, MD, an associate professor of medicine at the University of Massachusetts Medical School, said in an email to MedPage Today that the study “provides welcome news to support the 2015 dietary guidelines, which are expected to drop the limit of egg consumption for blood cholesterol concerns.”

Ma said that he was aware of six studies that examined egg consumption and diabetes. One showed an increased risk, he said, and the other five showed no association. “So these results are not in line with other findings,” wrote Ma.

So here’s the official score in Observational Study Stadium: one study shows a higher risk of diabetes with higher egg consumption, one shows a lower risk, and five show no association at all. That means there’s no cause-and-effect relationship, period. Any good science teacher could tell you that.

Speaking of which …

A Science Teacher’s Opinion of Super Size Me

Several people besides me have demonstrated they could lose weight while eating nothing but fast food. The latest happens to be a science teacher:

Iowa high school science teacher John Cisna weighed 280 pounds and wore a size 51 pants.

Then he started eating at McDonalds.  Every meal. Every day. For 180 days.

By the end of his experiment, Cisna was down to a relatively svelte 220 and could slip into a size 36.

Unlike me, Cisna didn’t embrace a high-fat diet:

Cisna left it up to his students to plan his daily menus, with the stipulation that he could not eat more than 2,000 calories a day and had to stay within the FDA’s recommended daily allowances for fat, sugar, protein, carbohydrates and other nutrients.

I much prefer my “@#$% the government recommendations” diet. But I definitely enjoyed Cisna’s comments about Super Size Me:

“As a science teacher, I would never show ‘Super Size Me’ because when I watched that, I never saw the educational value in that,” Cisna said. “I mean, a guy eats uncontrollable amounts of food, stops exercising, and the whole world is surprised he puts on weight?”

“What I’m not proud about is probably 70 to 80 percent of my colleagues across the United States still show ‘Super Size Me’ in their health class or their biology class. I don’t get it.”

I get it. They like the anti-McDonald’s message, so they toss critical thinking out the window … assuming they had any critical-thinking skills to toss.

It’s 2015 … So Let’s See How the ‘90s Viewed the ‘60s

I never watched the TV show Quantum Leap, but a reader sent me a link to this YouTube clip. It’s part of an episode in which the main character visits his parents in 1969. Skip ahead to the 12-minute mark:

The episode aired in 1990. That’s right about when arterycloggingsaturatedfat! hysteria was in full swing. The main character goes back in time and is horrified by all the fat and cholesterol his father is eating. Now we can go back in time and be horrified by the fact that the main character is horrified.

Junior was right about one thing, though: Dear Old Dad needs to stop smoking.

If I could go back in time, I wouldn’t tell my dad to stop eating eggs and butter. I’d tell him to give up sugar and stop taking those @#$%ing statins. His 81st birthday would have been tomorrow, and man, I wish I could call him up, rib him about getting old, then wait for one of his witty comebacks.

It’s 2015 … So Everything Good Must Be Candy

This isn’t from a news item. It’s something I’ve noticed in a handful of TV commercials: vitamins and even fiber tablets now come in the form of gummies – for adults. I didn’t find a commercial online, but I did find this:

So apparently some people won’t take vitamins unless they taste like candy. If that’s not a sad comment on our dietary habits, I don’t know what is.


Autumn doesn’t officially begin until September 23rd, but to us, it feels like summer ends on the last day of the Williamson County Fair. The nine-day event takes place at the county agricultural center and includes the 4-H sponsored farm and livestock shows. There are the usual farm and ranch animals …

… and some not-so-usual farm and ranch animals:

If memory serves, that’s a llama in front and an alpaca behind it.

At last year’s show, Sara got to show her goats, walking them around as part of the competition. One of the goats won first place in its division, and Sara ended up with some nice prize money. This year Alana showed five of the chickens she raised (with a LOT of help from her mom). It was an easier job than showing goats; all she had to do was put them on display in a cage.

The chickens all receive a first-place, second-place, or participation ribbon. The participation ribbon, of course, means “thanks for taking part.” Alana’s chickens received a first-place ribbon, so they fetched a higher price at the auction: $23 each.

Alana also entered a country ham in the show. To be honest, I didn’t know what a country ham was. I figured if we buy a ham downtown and drive it out to our place, it’s now a country ham. Turns out a country ham is one that’s cured by being packed in salt and hung for several months. Alana’s ham received a “thanks for taking part” ribbon. Oh well. She didn’t seem bothered by it, and I expect we’ll enjoy the ham just as much either way.

On the way into the fair, I spotted some animal-rights protesters carrying signs:

I was snapping pictures while driving, so I didn’t plan the shot above, but it’s a nice accident.  Animals Are Not Entertainment, and right above that, Business Entrance.  No, goofball, these animals are not entertainment; they’re food, and for many of the people who raise them, a business.

The other signs featured tigers, so I’m guessing the protesters don’t know much about livestock shows. I’m not an animal-rights sort, but if one of my neighbors were raising tigers, I’d be a wee bit upset.

“Uh, Joe, about those tigers you’re keeping. Are you sure that’s safe? You know, there are kids and dogs and lots of farm animals around here, so—“

“Of course it’s safe. I keep them right over there in that … all right, which one of you kids left the cage door open?!”

We’re not cruel to our animals, of course. As for the idea that their only crime was being born … well, here’s the thing: when it comes to livestock animals, the only reason they’re born in the first place is that they’re useful to human beings. If you freed most farm animals, they’d become food for predators. Then they’d go extinct. If you want to raise your vegan-approved crops without a ton of chemicals and fossil-fuel fertilizers, you’d better hope people continue keeping livestock.

I just finished listening to Conquests and Cultures, by the always-awesome Thomas Sowell – the same professor of economics who wrote The Vision of the Anointed. In Conquests and Cultures, he explains that some North and South American Indians were farmers long before Europeans arrived, but there was a major difference. The Indians didn’t have horses or oxen to pull plows, which of course made farming more labor-intensive. But more importantly, without domesticated animals living on the land, there was no manure to enrich the soil. Consequently, each field could only produce maize or other crops for a few years before becoming depleted. Then the farming Indians had to move and find fresh fields.

I can attest to how well domesticated animals enrich the soil. When we moved our older chickens to the back of the property in the spring, their chicken-yard in the front pasture was bare. They’d pecked the grass and other vegetation down to the dirt.

I didn’t expect anything to grow there for a long time.  Boy, was I wrong. Mere months later, it looked like this:

The stuff was amazingly dense. Chicken poop must be excellent fertilizer indeed.

If you look carefully at the picture above, you can see Chareva trying not to get lost in the jungle. She’s facing the area that was completely bare.

You might also notice the net had sagged and weeds were growing up through it. I found that part of the overgrowth particularly annoying, because my errant disc-golf shots were getting trapped instead of sliding off the net. Walking into that bug-infested jungle to bounce a disc off the net wasn’t my idea of a good time.

So this weekend, we decided to tackle the chicken-yard. I tried running the brush mower we nicknamed The Beast in there, which worked for a while. Then the blade stopped turning and smoke poured out from the side. The overgrowth was so thick, it managed to jam up the blade and (I believe) either snap a belt or shove it off track. It takes some mean weeds to defeat The Beast.

I was more interested in tackling the chicken-yard than doing repair work, so I grabbed my Weed-Eater with the brush blade and hacked my way through the growth. Chareva stood behind me with a broom and tried to keep the net from grabbing my helmet and knocking it off. That almost worked now and then.

I plan to fix The Beast, then give it another go around the chicken-yard. Then I’ll let the weeds dry out for a week before taking a lawn mower in there to see if I can mulch the stuff. I sure as shootin’ don’t want a low-hanging net yanking at my head when I go in there again, so we spent part of Sunday raising it. This time, instead of spindly wood with a crossbeam on top, I went with the galvanized pipes and PVC junctions that worked so well in our chicken yards out back.

When we were done, Chareva surveyed the situation: enclosed space, a high net to discourage deer or birds from eating whatever is in there, and amazingly fertile soil. She hasn’t decided what, but she’ll definitely plant something there in the spring. I’m voting for tiger nuts.

Meanwhile, we have 63 chickens out back supplying us with eggs and fertilizer.


Must be a function of my age … as of a couple of days ago, I was still one of the few people on the planet who had never heard the song “All About That Bass.” What can I say? I’m old and I rarely listen to pop music. When I create our family end-of-the-year DVDs, I have to ask my daughters to suggest songs for the music videos. Otherwise, every DVD would be like a tribute to the golden oldies.

I was familiar with the song’s melody because Sara took a liking to a YouTube parody about ancient Greece, which was amusing because … well, heck, rather than describe it, I’ll embed the video:

The tune got stuck in my head, probably because Sara has been creating instant parodies of her own to comment on various situations.  For example, after the hundredth or so time Chareva mentioned elderberry bushes, Sara began singing, “Because she’s all about that bush, ’bout that bush — elderberry.  She’s all about that bush, ’bout that bush — elderberry.”

So a couple of days ago, I decided to look for the original song on iTunes. I liked what I heard, so I listened to samples of Meghan Trainor’s other songs. I liked them too, so I bought the iTunes album. Then later in the day, while taking a work break, it finally occurred to me to check if Trainor had produced a music video of “All About That Bass.”

Wow, she sure did — and it’s racked up nearly a billion views. That’s billion … with a b. Here it is, in case (like me) you’ve been living under an age-induced rock and haven’t seen it:

I freakin’ love this thing. The lyrics, the melody, the beat, the harmony, the instrumentation, all of it. And I love the body-acceptance message the video puts out there. That’s a message Chareva and I want to include in the book we’re producing for kids – perhaps the closing message.

Yes, you can improve your body composition with a good diet and the right kind of exercise. But most of us will never look like jocks or models, no matter what we do. I spent much of my early life feeling ashamed of the skinny, weak arms and legs extending from my fat-bellied body, complete with boy-boobs. That shame was a waste. A complete and utter waste.

I sometimes wish I could go back in time and have a long talk with that kid. I’d let him cry on my shoulder for awhile, then explain to him what actually matters in life – and it’s not the shape of his body. I can’t do that, so I’ll have a talk with kids who read the book. I want them to know they can’t compare themselves to people who were born to look like athletes or models. I want them to understand what a mistake it is to think, “Someday, with enough work and sacrifice, I’ll look like that – and then I’ll be happy.”

Anyway, I’ve listened to the song and chuckled my way through the video several times, enjoying Trainor’s message of “Every inch of you is perfect from the bottom to the top.”

Then today, I read some of the YouTube comments. I’m sorry to say there’s a sizable contingent of morons and ignoramuses among the millions of people who’ve seen the video. Here are some samples, in all their grammatically-challenged glory:

why cant you make people feel bad about the body, if it isnt any permanant disability where they cant do anything about it? fatties are fat because they cant spend enough time to exercise and cant refrain themselves from eating chunk loads of junk food its their fault.

Nice to finally hear from people who have carefully studied the issue … although I could swear I recently read a journal article that concluded: The commonly held belief that obese individuals can ameliorate their condition by simply deciding to eat less and exercise more is at odds with compelling scientific evidence.

This over-autotuned,IQ-lowering,grotesque shit made by someone who looks like a chubby 40-yo tranny – almost a BILLION??? Gotta be shittin me…

If this is an IQ-lowering video, then my guess is that you’ve already watched it at least a hundred times.  Best stop now … before you can’t tell the difference between, say, Tom Brady and a 40-year-old tranny.

This is just wrong. Self-criticism is crucial for surviving. Now she promotes being fat. What comes afterwards ? Will she be responsible for rising levels of deaths because of diabetes and heart strokes? Don’t think so.

Yes, that’s what prevents fat people from being thin: they don’t criticize themselves enough.

F***ing fatass ugly bitch and what a f***ing stupid song.

Something tells me Trainor will manage to have a stellar career despite your opinion.  Perhaps the “something” is the almost-billion views.

So where’s the song for short men? Oh wait, this fat bitch probably has a 6 foot minimum requirement like all other women and doesn’t see the hypocrisy. You can’t grow taller, but you can burn off that fat ass bitch.

That’s the first thought that occurred to me as well:  What, no lyrics praising short men?!! Clearly, Trainor only dates tall men, like all other women!  What a hypocrite!  I’m not six feet tall either, so maybe you and I should meet for drinks in a pub someday and share our feelings about women who don’t write songs for short men and other height bigots.  I’ll bring a booster seat for you.

Once a year in the US, UK, and Canada they discover a fat chick that is able to sing and she’s like the hottest thing for about 2 months telling everyone how great it is to be fat and you should be proud of being fat. Enjoy your diabetes and heart surgery. No worries, Megan says it’s OK.

Dang, maybe I need to get stronger glasses.  I’ve watched the video several times, and I haven’t spotted the fat chick yet.  Is she hiding behind those dancers with the big leg muscles?

Nobody likes fat bitches stop trying to convince yourself otherwise.

No worries, sir.  I’m sure there are still plenty of women in the world who meet your standards — and with that attitude of yours, you’ll no doubt make one of them a fine husband.

Okay, you get the idea. There were also laughable complaints that Trainor is insulting thin girls and saying they’re not sexy, which will make them feel bad about themselves.  Oh yeah, huge risk there.  Given the current culture, I’m reasonably sure thin women won’t be suffering from size-related self-esteem issues any time in the next decade, no matter what lyrics Trainor writes for her songs.

If you’ve listened to a few podcasts where I was the interview guest, you probably know that what eventually became Fat Head started out in my mind as a short piece on how we treat fat people in America. They’re perhaps the only remaining group you can make a target of nasty, bigoted remarks without being run out of town. It’s their fault, they’re disgusting, they did it to themselves, they could be thin if they wanted to, blah-blah-blah. Some of the YouTube comments prove the point rather clearly.

But a music video doesn’t draw nearly a billion views unless it strikes a chord with millions and millions of people.  I believe this one struck a chord for a good reason.

You go, Meghan … and thanks for the great song.


I’d have to dig through my Outlook archives to say for sure (and I won’t), but this one may have set the new record for the number of Did you see this?! emails I received.

If you follow the health news (and if you haven’t been on a retreat in the wilderness or otherwise deprived of the internet for the past week), you already know a new study declared that low-fat beats low-carb for weight loss … once and for all, end of story, final word, move along folks, there’s nothing else to see. Let’s look at some media treatments of the news.

From a BBC article titled Low-fat diets ‘better than cutting carbs’ for weight loss:

Cutting fat from your diet leads to more fat loss than reducing carbohydrates, a US health study shows.

Scientists intensely analysed people on controlled diets by inspecting every morsel of food, minute of exercise and breath taken. Both diets, analysed by the National Institutes of Health, led to fat loss when calories were cut, but people lost more when they reduced fat intake.

From a Washington Post article titled Scientists (sort of) settle debate on low-carb vs. low-fat diets:

Seeking to settle the debate, scientists from the National Institutes of Health set up a very detailed and somewhat unusual experiment.

They checked 19 obese adults (who were roughly the same weight and had the same body-mass index) into an inpatient unit at the NIH clinical center, for two-week increments.

For the first five days of each visit, the volunteers were given a baseline diet of 2,740 calories that was 50 percent carbohydrate, 35 percent fat and 15 percent protein. This wasn’t very different from what they were eating before. But for the following six days, they were given either a low-fat diet or a low-carb diet, each having 30 percent fewer calories. Each participant was also asked to exercise one hour a day on the treadmill.

After analyzing everything from how much carbon dioxide and nitrogen they were releasing to their hormone and metabolite levels, the researchers concluded that the calorie-per-calorie, low-fat diets beat out low-carb diets.

My favorite headline was from the Los Angeles Times: For fat loss, low-fat diets beat low-carb diets handily, new research finds.

Low-fat won handily? Must’ve been a real butt-whippin’ demonstrated in those results.

It is a central dogma of the low-carb lifestyle: that while avoiding carbohydrates will force the human body into fat-burning mode, any diet that fails to suppress insulin will trap body fat in place and thwart a dieter’s hope of shifting to a leaner, healthier body type.

But researchers from the National Institutes of Health have found that the hallowed creed of Atkins acolytes doesn’t hold up in the metabolic lab, where dieters can’t cheat and respiratory quotients don’t lie.

So it was the Atkins diet that got a butt-whippin’ by low-fat. I repeat: The Atkins Diet. I don’t know about you, but I would take that to mean the diet prescribed by Dr. Atkins.

And how long did the diets last? Let’s check the LA Times again:

As the 19 subjects recruited for the current study dieted their way through four weeks of low-carb and low-fat regimens, Hall and his colleagues conducted brain scans and other tests to glean how diets with differing nutrient compositions affected their mood, motivation and sense of satisfaction.

My goodness … they dieted their way through four weeks of low-carb and low-fat regimens, according to the LA Times. I take it that means the subjects were on diets lasting four weeks.  That ought to be long enough for real differences to emerge.

But wait a second … I seem to recall the Washington Post describing the diets a bit differently …

But for the following six days, they were given either a low-fat diet or a low-carb diet, each having 30 percent fewer calories.

Hmmm, we seem to have conflicting stories here.  Four weeks vs. six days on each diet.  Perhaps we should check the study itself – which I did. After reading it, I suspect we have a case of “let’s design a study to produce the results we want.” In fact, I can’t help but imagine the conversation:

“Okay, Jenkins, grab your laptop and step into my office. We need to design a good, solid, scientific study to settle this low-fat versus low-carb issue once and for all.”

“Excellent, sir. You mean in a metabolic ward and everything?”

“Exactly. Let’s start with the low-fat portion.”

“Well, sir, the usual definition of a low-fat diet is less than 30 percent of total calories, so I suppose we should—”

“Don’t be ridiculous, Jenkins. If we’re going low-fat, let’s really go low-fat!”

“Ahh, I see. Something like the very-low-fat diet Dr. Ornish pushes. Okay, 10 percent of total calories, then.”

“Damnit, Jenkins, you’re not listening! I said really low-fat! Let’s go down to, say, 7.7 percent of total calories from fat.”

“So a diet nobody would ever follow voluntarily in real life for any length of time, then?”

“Correct. Now, for the low-carb side of things …”

“That’s easy, sir. The Atkins books recommend starting at 30 grams of carbohydrate per day, so—”

“Good grief, man, we can’t put human beings on such an extreme diet!”

“But—”

“So we’ll go with 140 grams per day, including, say, 37 grams of sugar. That should be a fair comparison for our purposes.”

“But that’s twice as many carbohydrates per day as the Atkins diet recommends even in the maintenance stage, much less when starting a—”

“Well, it’s complicated, Jenkins, so let me explain it this way: shut up.”

Actually, the explanation isn’t particularly complicated. Here’s a quote from the full study:

Given the composition of the baseline diet, it was not possible to design an isocaloric very low-carbohydrate diet without also adding fat or protein. We decided against such an approach due to the difficulty in attributing any observed effects of the diet to the reduction in carbohydrate as opposed to the addition of fat or protein.

In other words, they didn’t want to add or remove protein from either diet, and they didn’t want to add fat to the low-carb diet or carbohydrates to the low-fat diet. They wanted to compare restricting carbs to restricting fat with no other changes, period.

Okay, fine. But in that case, the “low-carb” diet is nothing like the low-carb diet recommended by the Atkins diet books, or by any doctors who promote low-carb diets. So the accurate conclusion and/or headline should be something like Extreme low-fat diet produces more fat loss than a sort-of, kind-of, almost-low-carb diet … at least when the diets last six days.
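
Just to put my own back-of-the-envelope numbers on that, using the figures quoted in the write-ups plus the standard 4 calories per gram of carbohydrate and 9 per gram of fat (the arithmetic below is mine, not the paper’s):

```python
# Rough arithmetic from the figures in the write-ups: a 2,740-calorie
# baseline diet, with both experimental diets cut by 30 percent.
baseline_kcal = 2740
diet_kcal = baseline_kcal * 0.70                  # about 1,918 kcal per day

# The low-fat arm: 7.7 percent of calories from fat.
fat_kcal = diet_kcal * 0.077
print(f"Low-fat arm: {fat_kcal:.0f} kcal from fat, about {fat_kcal / 9:.0f} g per day")

# The "low-carb" arm: 140 g of carbohydrate (37 g of it sugar).
carb_kcal = 140 * 4
print(f"'Low-carb' arm: {carb_kcal} kcal from carbs, "
      f"about {carb_kcal / diet_kcal:.0%} of a {diet_kcal:.0f}-kcal diet")
```

So the low-fat arm was down to roughly 16 grams of fat a day, while the “low-carb” arm was still getting nearly 30 percent of its calories from carbohydrate. Call that whatever you like, but don’t call it Atkins.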

Yup, six days. It was the Washington Post’s description of the duration that was accurate. The L.A. Times got it wrong. Based on those six days, the researchers then describe in their paper how computer models predict substantially more fat loss for the low-fat group if both diets lasted six months.

Uh-huh. I rank that up there with Al Gore claiming his computer models can accurately predict the climate in 2050 … even though those models didn’t accurately predict the previous 10 years. I’m a programmer, so trust me on this: computer-simulation models tell you what you tell them to tell you. The only way we’ll actually know how these diets perform over six months is to keep people on them for six months.

And we’d also want more than 19 people involved. I just wrote a post last week demonstrating how random chance alone can create “significant” differences in small study groups. Nineteen people, diets that lasted a whopping six days … I wouldn’t bet on those results being reproduced with large groups over a long time.

But about those results … the Los Angeles Times assured us the low-fat diet beat the (ahem) “low-carb” diet handily. So what were the big differences in outcomes?

Well, people on the low-fat diet lost (on average) 1.296 pounds of body fat. People on the (ahem) “low-carb” diet lost (again, on average) 1.166 pounds of body fat. The difference was therefore just a shade over one-tenth of one pound. If you don’t believe random chance can produce that trivial of a difference in a study group of 19 people put on diets lasting six days, I suggest you take a class in statistics.
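
For the doubters, here’s a quick back-of-the-envelope simulation. It’s a sketch of my own with made-up numbers (I’m assuming each person loses about 1.2 pounds with a spread of 0.6 pounds, and I’m ignoring the crossover design), not anything pulled from the study. It just asks how often two groups of 17 people on identical diets would end up at least 0.13 pounds apart by dumb luck:

```python
import random

# A crude sketch, not the study's data: pretend six-day fat loss averages
# about 1.2 lb per person with an assumed spread of 0.6 lb, and that the
# two "diets" being compared are actually identical.
random.seed(1)

def pretend_trial(n=17, true_mean=1.2, spread=0.6):
    """Average fat-loss gap between two groups on identical diets."""
    diet_a = [random.gauss(true_mean, spread) for _ in range(n)]
    diet_b = [random.gauss(true_mean, spread) for _ in range(n)]
    return abs(sum(diet_a) / n - sum(diet_b) / n)

# How often does pure chance produce a gap of 0.13 lb or more?
trials = 10_000
big_gaps = sum(pretend_trial() >= 0.13 for _ in range(trials))
print(f"Chance alone produced a 0.13+ lb gap in {big_gaps / trials:.0%} of trials")
```

On my made-up assumptions, that happens roughly half the time. Change the assumed spread and the number moves, which is exactly the point: with groups this small and diets this short, a tenth of a pound is noise.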

But wait … did I say 19 people? Well, that’s not quite true. According to the paper, 19 people were enrolled in the study – 10 men and nine women. The study had a crossover design, meaning everyone goes on one diet, then goes back to normal eating for a couple of weeks, then goes on the other diet. They’re randomly assigned to do one diet or the other first.

But the results table shows n=19 for the (ahem) “low-carb” diet and n=17 for the low-fat diet. That means two of the subjects didn’t complete the low-fat diet. So I can’t help but wonder why the researchers didn’t simply toss the results for those two people from the study altogether. Why calculate their results on the (ahem) “low-carb” diet into the average if they didn’t finish the other diet? I thought the goal here was a head-to-head comparison of the same people on different diets.

I also can’t help but wonder why, given the small group, the researchers didn’t just show us the full results for everyone. In studies with hundreds of subjects, sure, you pretty much have to present group-average results to make sense of the numbers. But for the 17 people who completed both diets, heck, just show us everyone’s results and stick the averages at the bottom of the table.  If some individuals lost a lot more weight on low-fat vs. low-carb or vice versa, that would be worth knowing.  It would also be worth knowing if one or two outliers skewed the averages for the groups.

Well, apparently that did happen.  I found this in the paper:

The data were analyzed using a repeated-measures mixed model controlling for sex and order effects and are presented as least-squares mean ± SEM. The p values refer to the diet effects and were not corrected for multiple comparisons. One female subject had changes in DXA % body fat data that were not physiological and were clear outliers, so these data were excluded from the analyses.

Uh … okay.  I’d sure like to see those individual results, though.

All those complaints aside, there were some interesting results in the study tables (again, keeping in mind the small groups and short durations). During the six-day diets, triglycerides dropped by 17.5 points in the (ahem) “low-carb” group, and by 4.3 points in the low-fat group. For total cholesterol, the drop was 8.47 points in the (ahem) “low-carb” group and 19.1 points in the low-fat group. HDL dropped by 2.67 points in the (ahem) “low-carb” group and by 7.27 points in the low-fat group.

So if low triglycerides and high HDL are indicators of heart health (and if these results are actually meaningful), I’m sticking with a lower-carb diet … but one with more fat, thank you very much, because I want my HDL to go up, not down.

The results I found most interesting were for glucose and insulin. In the (ahem) “low-carb” group, glucose dropped by an average of 2.69 points … but in the low-fat group, glucose dropped by 7.1 points. So it’s clearly possible to reduce glucose levels with a very low-fat diet, despite the high carb intake, if calories are restricted enough.

This study has been hyped by the anti-Taubes brigades as a refutation of the insulin hypothesis, but the tables show very little difference in insulin levels. The (ahem) “low-carb” group showed a drop in fasting insulin of 2.76 points, while the low-fat group showed a drop of 2.04 points. Nonetheless, here’s how the researchers described the difference:

The experimental reduced-energy diets resulted in substantial differences in insulin secretion despite being isocaloric.

Hmmm … I’m thinking there’s a reason they chose the word “substantial” instead of “significant.” Let’s check the tables again …. Yup, the p value (RC versus RF) for the change in fasting insulin is .48. The threshold for “statistically significant” is .05 or below. So the difference here wasn’t even close to significant — in a study some people are waving around as proof that insulin levels aren’t a factor in the ability to lose body fat … perhaps because they read what the researchers wrote in their conclusions instead of checking the study tables.

And by the way, the p value (RC versus RF) for change in body fat was .78 — which, properly interpreted, means that if the two diets actually had identical effects on body fat, we’d expect random chance alone to produce a difference at least that big about 78 percent of the time. In other words: noise.
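
And if the p-value talk sounds like jargon, here’s a toy illustration of the idea (again, my own made-up numbers, not the study’s). In a crossover design, each person has a result on both diets, so you can look at each person’s within-subject difference. If the diets were truly interchangeable, each of those differences could just as easily have gone the other way; flip the signs at random a bunch of times and count how often the shuffled average gap is at least as big as the one you observed. That fraction is, roughly speaking, the p value:

```python
import random

# Toy paired data (made-up, NOT the study's numbers): six-day fat loss in
# pounds for 17 hypothetical subjects on each of two diets in a crossover.
random.seed(2)
diet_a = [round(random.gauss(1.2, 0.6), 2) for _ in range(17)]
diet_b = [round(random.gauss(1.2, 0.6), 2) for _ in range(17)]
diffs = [a - b for a, b in zip(diet_a, diet_b)]
observed = abs(sum(diffs) / len(diffs))

# Sign-flip (permutation) test: under "the diets are interchangeable,"
# each person's difference is equally likely to point either way.
flips = 20_000
as_big = sum(
    abs(sum(d * random.choice((1, -1)) for d in diffs)) / len(diffs) >= observed
    for _ in range(flips)
)
print(f"Observed gap: {observed:.2f} lb, permutation p value: {as_big / flips:.2f}")
```

With made-up data you’ll get whatever number the dice hand you; the point is what the number means. A p value of .78 just says a gap that size (or bigger) would show up about 78 percent of the time even if the diets were interchangeable.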

Within the obvious limitations, the study does show that restricting calories can produce a drop in insulin even when the overall carb count stays the same. So it’s not as simple as carb intake = fasting insulin level. Total energy intake figures into it as well.

That being said, it would be very, very interesting to see what the differences in insulin levels (among other results) were if 1) the study ran much longer, 2) there were more than 17 people who completed both diets, and 3) the “low-carb” diet was actually low-carb and didn’t include 37 grams per day of sugar.

I believe the less-hype, more-substance reporting on the study was in an article that appeared in Forbes magazine online:

But a well-controlled new study finds that – at least in the lab – low-fat might be slightly better for weight loss over the long term. That does not mean that we should all revert to the low-fat insanity of the ’80s and ’90s. Rather, the more valuable take-home message might be that rejecting carbs may not be so necessary for long-term weight loss as many of us believe, and that a nutrient-balanced diet is probably the smarter strategy in the long term.

And frankly, whatever kind of diet is most doable for an individual is probably the one to be on. If it’s easier to stick to low-carb than low-fat, then by all means do it. But a balanced diet is still king.

Bingo. It’s certainly possible to lose weight on a high-carb, very-low-fat diet. It’s possible to lose weight on any diet if you restrict calories enough. I tried a Pritikin-style diet (10% of calories from fat) twice, and lost a bit of weight both times – and then I had to quit both times because I was miserable, hungry all the time, and eventually felt too lethargic and depressed to continue. Meals were an exercise in monkish discipline, choking down tasteless food and trying to convince myself I was fine with it.

Now I’m not miserable, not hungry all the time, and never depressed … which means when people hype a study like this as “proof” that a low-fat diet is better than an (ahem) “Atkins” diet, I can enjoy a hearty laugh.
