Wednesday, October 29, 2014

Walkabout Wednesday: jetlagged mice, Calculus, and population models.

Mice with jet lag, the effect of adding Calculus to a F&W curriculum, and population models in adaptive management. Also, I think I've been blogging in my PJs, oops.

Tuesday, October 28, 2014

Evolving one's diet

I recently heard an interview with Dr. Steven Gundry over at Jimmy Moore's Livin' La Vida Low Carb site. Gundry said a few things that intrigued me, so I felt I should try to follow up by reading his book, "Dr. Gundry's Diet Evolution" [1].

Gundry's basic premise is that our diet affects which genes are turned on, and which off. That's not earth-shattering; the notion that environment shapes gene expression is pretty old hat. And most of what Gundry introduces in the first chapter is well-supported stuff from the literature. But there's one thing that doesn't have any citations, and that's his notion that there is a genetic program designed to kill us off once we've served our purpose: transmitting our genes to the next generation.

This "killer gene" program got my attention because there's a new hypothesis for the evolution of aging in the evolutionary literature, called the "programmed death" hypothesis. The basic idea is that our genes articulate a "use by" date, and once we're past it all sorts of things come unglued. In models at least, the advantage appears to be an increase in a lineage's ability to evolve [2]. There are species that do not age, and cell lineages that also don't age.

Gundry's assertion is that once we've reproduced, our genes prefer we get out of our offspring's way. He brings up a few interesting observations, like why do clots in our arteries form in the heart, brain and legs, but not, say, in our noses and ears? It's almost as if they're targeting essential survival systems ... Gundry's idea is that our modern western diet, full of refined sugars and starches, is so good at getting us to reproduce quickly that our genes then quickly kill us off to reduce competition for our offspring.

I'm willing to grant the programmed death idea as a hypothesis, but Gundry takes it to extremes. For example, he asserts that overeating until you are overweight also triggers the killer genes because "your genes perceive [this] as threatening the food supply of others." [page 16] That's just one example of how he has bought into a group selection argument for the existence of these killer genes. And convincing your genes that you're not a threat is the basis for phases 2 and 3 of his diet plan.

Phase 1 of Gundry's plan is pretty standard -- drop the refined carbohydrates and starchy vegetables. Where he differs from a LCHF plan is by focusing on lean protein as the replacement for carbs, rather than fat. The logic is that you're trying to convince your body that "Winter's here", and animals would be lean in winter. There are a few little inconsistencies though. The key "Gundryism" here is "If it's white, out of sight" -- like sugar, flour and, wait, MAYONNAISE! There's no rationale for that, but in an earlier chapter he mentions that soybean oil (the main ingredient in commercial mayonnaise) should be avoided because of a high Omega-6 : Omega-3 ratio. OK, but then why is tofu listed as an acceptable protein? It's actually white too, now that I think about it. This "teardown" phase is supposed to last until your weight loss plateaus and you're not showing signs of illness.

In Phase 2, "Rebuilding", the main emphasis is on reducing the protein and increasing the green stuff. This is meant to imitate the diet of our pre-agricultural ancestors. He draws on the diet of a silverback gorilla as inspiration -- lots of calorie-sparse leaves*. The biological hypothesis in this section seems to be that eating protein raises metabolism, and a high metabolism is bad. So eat less protein, lower the heat, and you'll be better off for it. He also cites the China Study as support for this perspective. If you're still thinking that's a good idea, you should read the critiques by Denise Minger or Chris Masterjohn. Sorry Gundry, you're losing me here.

In Phase 3, "Longevity", the goal is to keep those genes from thinking you're a threat while providing a little bit of stress via plant toxins. Eat raw food, because that preserves those phytochemical toxins. Eat less, because that way you convince your genes you're not competing with your offspring. Calorie restriction is associated with longevity, at least in animal models, according to the citations he provides. One thing that is slightly alarming here is that calorie restriction increases cortisol, and that's one mechanism that can increase fasting blood glucose levels. Which might be fine, if you're not insulin resistant.

To wrap up, I'm convinced that our genes matter and that our environment (including what we eat) influences which genes are on or off. But I was convinced of that before. I agree that programmed death is a valid hypothesis, but I don't see any evidence for "competing with my offspring" as a trigger for killer genes. I agree that lots of low-calorie veggies are good, but I'm not sure I could get the majority of my protein from them without growing a gorilla-sized colon. Eating less, check, as long as I'm not hungry. Back to the library with you, Gundry.

[1] Steven R. Gundry. 2008. Dr. Gundry's diet evolution: turn off the genes that are killing you -- and your waistline -- and drop the weight for good. Crown Publishers, New York.

[2] Joshua Mitteldorf and Andre Martins. 2014. Programmed Life Span in the Context of Evolvability. The American Naturalist 184:289-302.

*But, but, gorillas have colons more than twice the size of a human's, which is why they can survive on calorie-sparse food!

Monday, October 27, 2014

Blood sugar targets

or how low should I go? How high is too high? Is having targets even a good idea for Adaptive Management?

Friday, October 24, 2014

Testing hypotheses for high morning blood glucose

I've focused more on measurements of Hemoglobin A1c than blood sugar, as that gives a time-averaged indication. Even so, my A1c numbers, while good for a diabetic at between 5% and 5.4%, are not great by Steve's targets. A value of 5.4% corresponds to an average BG of 108 mg/dL. Going the other way, having an average BG less than 100 mg/dL means an A1c of 5.1%. The fact that my averages are much higher than my fasting levels during the day suggests that my post-meal and early morning levels are very much higher than I'd like. I've got a series of experiments underway looking at BG after eating, but what about overnight highs in blood glucose?
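For anyone wanting to redo that arithmetic, the conversions above match the standard ADAG formula relating A1c to estimated average glucose, eAG (mg/dL) = 28.7 × A1c − 46.7. A minimal sketch (my own helper names, nothing official):

```python
def a1c_to_avg_bg(a1c_percent):
    """Estimated average glucose (mg/dL) from A1c (%), via the ADAG formula."""
    return 28.7 * a1c_percent - 46.7

def avg_bg_to_a1c(avg_bg_mgdl):
    """Inverse: the A1c (%) implied by a given average glucose (mg/dL)."""
    return (avg_bg_mgdl + 46.7) / 28.7

print(round(a1c_to_avg_bg(5.4)))     # 5.4% A1c -> ~108 mg/dL average
print(round(avg_bg_to_a1c(100), 1))  # 100 mg/dL average -> ~5.1% A1c
```

Note the formula is a population-level regression, so individual A1c/average-BG pairs can deviate from it.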

So I think this is a perfect setup for adaptive management.

There are two hypotheses for high morning blood glucose in a Type 2 diabetic. The "hormone hypothesis" posits that an increase in growth hormone production in the middle of the night triggers a cascade of events leading to the release of glucose from the liver. Presumably this is to set one up for a vigorous start to the day. In a healthy person this extra glucose triggers an insulin response, and blood glucose remains steady. In a person with insulin resistance (like me), the insulin response is not effective, and blood glucose rises. The second hypothesis is the "Somogyi effect", where low blood glucose values overnight trigger the release of glucose from the liver to keep blood sugar from dropping too low. That also involves hormones, but they are activated for different reasons.

Which of these hypotheses is true does matter: under the second hypothesis I should be able to avoid high morning sugar by making sure my overnight sugar levels don't drop too low. I could eat a snack before going to bed, for example. Under the hormone hypothesis there's not much I can do except try harder to ameliorate the insulin resistance. I could start taking Metformin, which acts in two ways: first by suppressing liver production of glucose, and second by reducing insulin resistance. Both of those actions should act directly on the dawn phenomenon.

I can also learn which of these hypotheses is true by monitoring my blood sugar. Ideally, I'd measure my blood sugar every hour or so all night ... hmm. That sounds like a sucky experiment. According to this article, I can get away with just measuring at bedtime, 3 a.m., and in the morning.

So I tried it for 2 nights, and I think I've discovered a 3rd hypothesis, the "Drew Tyre Effect"! On the first night: 9 pm, 101; 3 am, 116; and 6 am, 122 mg/dL! OK, that's consistent with the hormone hypothesis, but maybe I missed a drop in BG between 9 pm and 3 am. So last night I took the middle measurement earlier: 9:20 pm, 112; 12:25 am, 110; and 6:40 am, 116. My BG is high all night.
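Just to make the decision rule explicit: a Somogyi night should show a dip in the middle reading relative to bedtime, while a hormone-hypothesis night should show a steady climb. A toy sketch of that logic (the 10 mg/dL dip threshold is my own arbitrary cutoff, not from any clinical source):

```python
def classify_night(bedtime, overnight, morning, dip_threshold=10):
    """Crudely classify one night of BG readings (all in mg/dL)."""
    if overnight <= bedtime - dip_threshold:
        return "Somogyi-like dip"    # low overnight, rebound by morning
    if morning > bedtime and overnight >= bedtime:
        return "hormone-like rise"   # steady climb, no dip
    return "flat/high all night"     # the 'Drew Tyre Effect'?

print(classify_night(101, 116, 122))  # night 1
print(classify_night(112, 110, 116))  # night 2
```

Three spot readings can obviously miss a dip between measurements, which is exactly why I moved the middle measurement on night 2.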

So, a bit of reading on what stimulates cortisol (another glucose-raising hormone) is warranted. As it happens, sleep deprivation stimulates cortisol production, so this could be an instance in which the observation of a process affects the process itself. I need one of those continuous glucose monitors!

Here's another hypothesis: I've been taking a fish oil supplement for the past 3 months after learning that I was low in tissue levels of EPA and DHA. However, it turns out that fish oil supplements can blunt insulin response and increase resting glucose levels. So it looks like I have to wait 18 weeks and then try this experiment again.

And another hypothesis! I'm swimming in the damn things ... It turns out that taking a statin can interfere with blood glucose control. I started taking 10 mg of Crestor every evening about the same time I started taking fish oil. Well, the results of my latest bloodwork will be very interesting.

It's harder to be healthy when you're poor

I'm not poor. But this week I came across a very illustrative example of how eating real food is more expensive.

Wednesday, October 22, 2014

Walkabout Wednesday: peak meat, what math you should know, and Tiny Data.

The USA hit peak meat in 2004, Brian McGill laid out a perfect math ecology curriculum, and Tiny Data should be all the rage in the near future.

Monday, October 20, 2014

Famine & the evolution of diet

One cornerstone assumption of a lot of diet theories based on emulating hunter-gatherer diets is that famine was a constant companion. Predictable seasonal famines occur in all environments: the dry season in the tropics and winter at high latitudes. The just-so story goes like this: in a predictable feast-famine environment, "thrifty genes" that take advantage of feast times to store fat in preparation for the lean times would be selected for. And when you take a population that has these thrifty genes and put them in a constant "feast" environment, they get fat and develop diabetes. Like the Pima Indians in North America* and the Tokelau people in the South Pacific. To avoid triggering these "thrifty genes", paleo diet proponents suggest avoiding the products of agriculture, both modern and historical.

Colette Berbesque and co-authors [1] decided to test the frequent famine assumption by looking at anthropological evidence of famine in a fairly sophisticated way. In particular, they controlled for the effects of habitat richness by only comparing hunter gatherers from warm climates (Effective Temperature > 13C) with agriculturalists, and by using a metric of habitat productivity (a linear combination of Net Primary Productivity and Effective Temperature) as a covariate. From their abstract, they found:
... if we control for habitat quality, hunter–gatherers actually had significantly less—not more—famine than other subsistence modes.
Digging into the details though, we find that this is true for some measures of famine but not others. For instance, on the variables Occurrence, Severity, Persistence, Recurrence, and Contingency of famine, warm-climate hunter-gatherers do better (that is, less famine). In terms of short-term or seasonal famine, however, there is no difference between warm-climate hunter-gatherers and agricultural peoples. And that's important, because it is seasonal famine that would lead to the evolution of thrifty genes. And there was plenty of time for such genes to arise in paleolithic people before the advent of agriculture, so the fact that agricultural peoples are no better at avoiding seasonal famine simply means selection for these genes wouldn't have relaxed after the shift to agriculture.

Although Berbesque et al. paint their research as a critique of the "paleo diet" approach, it seems to me that it reinforces a key tenet: paleolithic peoples had it better than their agricultural neighbors. Hunter-gatherers do significantly better on "ordinary nutritional conditions and endemic starvation", have no difference in seasonal famine, and less long-term and unpredictable famine. Huh.

[1] J. Colette Berbesque, Frank W. Marlowe, Peter Shaw and Peter Thompson. 2014. Hunter-gatherers have less famine than agriculturalists. Biology Letters 10: 20130853.

*The Pima aren't necessarily a good example, as they were agricultural before becoming "modern western". It is interesting to me that the National Institute of Diabetes and Digestive and Kidney Diseases article I linked to focuses on the increase in fat in the Pima diet, rather than the exchange of whole grains for refined carbohydrate.