Monday, March 7, 2011

Flu Season is Here

I've noticed everyone around me getting sick lately (I seem to have become mostly immune to colds and the flu in the last couple of years), so I took a look at Google Flu Trends. Lo and behold, the United States is currently near peak flu incidence for the 2010-2011 season. Here's a graph from Flu Trends. This year's trend is in dark blue:


Flu Trends also has data for individual US states and a number of other countries.

It's time to tighten up your diet and lifestyle if you want to avoid the flu this year. Personally, I feel that eating well, managing stress effectively, and taking 2,000 IU of vitamin D3 per day in winter have helped me avoid colds and the flu.

Thursday, March 3, 2011

Gluten-Free January Raffle Winners Selected!

Raffle winners have been selected and shirts are on their way. You know who you are. Thanks to everyone who participated and filled out the survey! For those who didn't, there's always next year.

Janine Jagger, Matt Lentzner and I are busy crunching the mountain of data we collected from the GFJ survey. We got 279 responses, which is remarkable for a survey of this nature.

Stay tuned for data!

Tuesday, March 1, 2011

Oltipraz

Oltipraz is a drug that was originally used to treat intestinal worms. It was later found to prevent a broad variety of cancers (1). This was attributed to its ability to upregulate cellular detoxification and repair mechanisms.

Researchers eventually discovered that oltipraz acts by activating Nrf2, the same transcription factor activated by ionizing radiation and polyphenols (2, 3, 4). Nrf2 activation mounts a broad cellular protective response that appears to reduce the risk of multiple health problems.

A recent paper in Diabetologia illustrates this (5). Investigators put mice on a long-term refined high-fat diet, with or without oltipraz. These carefully crafted diets are very unhealthy indeed, and when fed to rodents they rapidly induce fat gain and something that looks similar to human metabolic syndrome (insulin resistance, abdominal adiposity, blood lipid disturbances). Adding oltipraz to the diet prevented the fat gain, insulin resistance and inflammatory changes that occurred in the refined high-fat diet group.

The difference in fasting insulin was remarkable. The mice taking oltipraz had 1/7 the fasting insulin of the refined high-fat diet comparison group, and 1/3 the fasting insulin of the low-fat comparison group! Yet their glucose tolerance was normal, indicating that they were not low on insulin due to pancreatic damage. The low-fat diet they used in this study was also refined, which is why the two control groups (high-fat and low-fat) didn't diverge more in body fatness and other parameters. If they had used a group fed unrefined rodent chow as the comparator, the differences between groups would have been larger.

This shows that in addition to preventing cancer, Nrf2 activation can attenuate the metabolic damage caused by an unhealthy diet in rodents. Oltipraz illustrates the power of the cellular hormesis response. We can exploit this pathway naturally using polyphenols and other chemicals found in whole plant foods.

Thursday, February 24, 2011

Polyphenols, Hormesis and Disease: Part II

In the last post, I explained that the body treats polyphenols as potentially harmful foreign chemicals, or "xenobiotics". How can we reconcile this with the growing evidence that at least a subset of polyphenols have health benefits?

Clues from Ionizing Radiation

One of the more curious things that has been reported in the scientific literature is that although high-dose ionizing radiation (such as X-rays) is clearly harmful, leading to cancer, premature aging and other problems, under some conditions low-dose ionizing radiation can actually decrease cancer risk and increase resistance to other stressors (1, 2, 3, 4, 5). It does so by triggering a protective cellular response, increasing cellular defenses out of proportion to the minor threat posed by the radiation itself. The ability of mild stressors to increase stress resistance is called "hormesis." Exercise is a common example. I've written about this phenomenon in the past (6).

The Case of Resveratrol

Resveratrol is perhaps the most widely known polyphenol, available in supplement stores nationwide. It's seen a lot of hype, being hailed as a "calorie restriction mimetic" and the reason for the "French paradox."* But there is quite a large body of evidence suggesting that resveratrol functions in the same manner as low-dose ionizing radiation and other bioactive polyphenols: by acting as a mild toxin that triggers a hormetic response (7). Just as in the case of radiation, high doses of resveratrol are harmful rather than helpful. This has obvious implications for the supplementation of resveratrol and other polyphenols. A recent review article on polyphenols stated that while dietary polyphenols may be protective, "high-dose fortified foods or dietary supplements are of unproven efficacy and possibly harmful" (8).

The Cellular Response to Oxidants

Although it may not be obvious, radiation and polyphenols trigger cellular responses that are similar in many ways. Both activate the transcription factor Nrf2, which switches on genes involved in chemical detoxification and antioxidant defense** (9, 10, 11, 12). This is thought to be because polyphenols, just like radiation, may temporarily increase the level of oxidative stress inside cells. Here's a quote from the polyphenol review article quoted above (13):
We have found that [polyphenols] are potentially far more than 'just antioxidants', but that they are probably insignificant players as 'conventional' antioxidants. They appear, under most circumstances, to be just the opposite, i.e. prooxidants, that nevertheless appear to contribute strongly to protection from oxidative stress by inducing cellular endogenous enzymic protective mechanisms. They appear to be able to regulate not only antioxidant gene transcription but also numerous aspects of intracellular signaling cascades involved in the regulation of cell growth, inflammation and many other processes.
It's worth noting that this is essentially the opposite of what you'll hear on the evening news, that polyphenols are direct antioxidants. The scientific cutting edge has largely discarded that hypothesis, but the mainstream has not yet caught on.

Nrf2 is one of the main pathways by which polyphenols increase stress resistance and antioxidant defenses, including the key cellular antioxidant glutathione (14). Nrf2 activity is correlated with longevity across species (15). Inducing Nrf2 activity via polyphenols or by other means substantially reduces the risk of common lifestyle disorders in animal models, including cardiovascular disease, diabetes and cancer (16, 17, 18), although Nrf2 isn't necessarily the only mechanism. The human evidence is broadly consistent with the studies in animals, although not as well developed.

One of the most interesting effects of hormesis is that exposure to one stressor can increase resistance to other stressors. For example, long-term consumption of high-polyphenol chocolate increases sunburn resistance in humans, implying that it induces a hormetic response in skin (19). Polyphenol-rich foods such as green tea reduce sunburn and skin cancer development in animals (20, 21).

Chris Masterjohn first introduced me to Nrf2 and the idea that polyphenols act through hormesis. Chris studies the effects of green tea on health, which seem to be mediated by polyphenols.

A Second Mechanism

There is a place in the body where polyphenols are concentrated enough to be direct antioxidants: in the digestive tract after consuming polyphenol-rich foods. Digestion is a chemically harsh process that readily oxidizes ingested substances such as polyunsaturated fats (22). Oxidized fat is unhealthy whether it forms in the deep fryer or in the digestive tract (23, 24). Eating polyphenol-rich foods effectively prevents these fats from being oxidized during digestion (25). One consequence of this appears to be better absorption and assimilation of the exceptionally fragile omega-3 polyunsaturated fatty acids (26).

What does it all Mean?

I think that overall, the evidence suggests that polyphenol-rich foods are healthy in moderation, and eating them on a regular basis is generally a good idea. Certain other plant chemicals, such as sulforaphane (found in cruciferous vegetables) and allicin (found in garlic), exhibit similar effects and may also act by hormesis (27). Some of the best-studied polyphenol-rich foods are tea (particularly green tea), blueberries, extra-virgin olive oil, red wine, citrus fruits, hibiscus tea, soy, dark chocolate, coffee, turmeric and other herbs and spices, and a number of traditional medicinal herbs. A good rule of thumb is to "eat the rainbow", choosing foods with a variety of colors.

Supplementing with polyphenols and other plant chemicals in amounts that would not be achievable by eating food is probably not a good idea.


* The "paradox" whereby the French eat a diet rich in saturated fat, yet have a low heart attack risk compared to other affluent Western nations.

** Genes containing an antioxidant response element (ARE) in the promoter region. ARE is also sometimes called the electrophile response element (EpRE).

Sunday, February 13, 2011

Polyphenols, Hormesis and Disease: Part I

What are Polyphenols?

Polyphenols are a diverse class of molecules containing multiple phenol rings. They are synthesized in large amounts by plants, certain fungi and a few animals, and serve many purposes, including defense against predators/infections, defense against sunlight damage and chemical oxidation, and coloration. The color of many fruits and vegetables, such as blueberries, eggplants, red potatoes and apples, comes from polyphenols. Some familiar classes of polyphenols in the diet-health literature are flavonoids, isoflavonoids, anthocyanidins, and lignans.

The Case Against Polyphenols


Mainstream diet-health authorities seem pretty well convinced that dietary polyphenols are an important part of good health, due to their supposed antioxidant properties. In the past, I've been critical of that hypothesis, for several reasons:
  1. Polyphenols are often, but not always, defensive compounds that interfere with digestive processes, which is why they often taste bitter and/or astringent. Plant-eating animals including humans have evolved defensive strategies against polyphenol-rich foods, such as polyphenol-binding proteins in saliva (1).
  2. Ingested polyphenols are poorly absorbed (2). The concentration in blood is low, and the concentration inside cells is probably considerably lower*. In contrast, essential antioxidant nutrients such as vitamins E and C are efficiently absorbed rather than excluded from the circulation.
  3. Polyphenols that manage to cross the gut barrier are rapidly degraded by the liver, just like a variety of other foreign molecules, again suggesting that the body doesn't want them hanging around (2).
  4. The most visible hypothesis of how polyphenols influence health is the idea that they are antioxidants, protecting against the ravages of reactive oxygen species. While many polyphenols are effective antioxidants at high concentrations in a test tube, I don't find it very plausible that the low and transient blood concentrations achieved by eating polyphenol-rich foods make a meaningful contribution to the body's overall antioxidant status, when compared to the relatively high concentrations of other antioxidants in blood (uric acid; vitamins C, E; ubiquinone) and particularly inside cells (SOD1/2, catalase, glutathione reductase, thioredoxin reductase, paraoxonase 1, etc.).
  5. There are a number of studies showing that the antioxidant capacity of the blood increases after eating polyphenol-rich foods. These are often confounded by the fact that fructose (in fruit and some vegetables) and caffeine (in tea and coffee) can increase the blood level of uric acid, the blood's main water-soluble antioxidant. Drinking sugar water has the same effect (2).
  6. Rodent studies showing that polyphenols improve health typically use massive doses that exceed what a person could consume eating food, and do not account for the possibility that the rodents may have been calorie restricted because their food tastes horrible.
The main point is that the body does not seem to "want" polyphenols in the circulation at any appreciable level, and therefore it gets rid of them pronto. Why? I think it's because the diversity and chemical structure of polyphenols make them potentially bioactive-- they have a high probability of altering signaling pathways and enzyme activity, in the same manner as pharmaceutical drugs. It would not be a very smart evolutionary strategy to let plants (which often don't want you eating them) take the reins on your enzyme activity and signaling pathways. Also, at high enough concentrations polyphenols can be pro-oxidants, promoting excess production of free radicals, although the biological relevance of this may be questionable given the concentrations required.

A Reappraisal

After reading more about polyphenols, and coming to understand that the prevailing hypothesis of why they work makes no sense, I decided that the whole thing is probably bunk: at best, specific polyphenols are protective in rodents at unnaturally high doses due to some drug-like effect. But-- I kept my finger on the pulse of the field just in case, and I began to notice that more sophisticated studies were emerging almost weekly that seemed to confirm that realistic amounts of certain polyphenol-rich foods (not just massive quantities of polyphenol extract) have protective effects against a variety of health problems. There are many such studies, and I won't attempt to review them comprehensively, but here are a few I've come across:
  • Dr. David Grassi and colleagues showed that polyphenol-rich chocolate lowers blood pressure, improves insulin sensitivity and lowers LDL cholesterol in hypertensive and insulin resistant volunteers when compared with white chocolate (3). Although dark chocolate is also probably richer in magnesium, copper and other nutrients than white chocolate, the study is still intriguing.
  • Dr. Christine Morand and colleagues showed that drinking orange juice every day lowers blood pressure and increases vascular reactivity in overweight volunteers, an effect that they were able to specifically attribute to the polyphenol hesperidin (4).
  • Dr. F. Natella and colleagues showed that red wine prevents the increase in oxidized blood lipids (fats) that occurs after consuming a meal high in oxidized and potentially oxidizable fats (5).
  • Several studies have shown that hibiscus tea lowers blood pressure in people with hypertension when consumed regularly (6, 7, 8). It also happens to be delicious.
  • Dr. Arpita Basu and colleagues showed that blueberries lower blood pressure and oxidized LDL in men and women with metabolic syndrome (9).
  • Animal studies have generally shown similar results. Dr. Xianli Wu and colleagues showed that blueberries potently inhibit atherosclerosis (hardening and thickening of the arteries that can lead to a heart attack) in a susceptible strain of mice (10). This effect was associated with a higher expression level of antioxidant enzymes in the vessel walls and other tissues.
Wait a minute... let's rewind. Eating blueberries caused mice to increase the expression level of their own antioxidant enzymes?? Why would that happen if blueberry polyphenols were themselves having a direct antioxidant effect? One would expect the opposite reaction if they were. What's going on here?

In the face of this accumulating evidence, I've had to reconsider my position on polyphenols. In the process, and through conversations with knowledgeable researchers in the polyphenol field, I encountered a different hypothesis that puts the puzzle pieces together nicely.


* Serum levels briefly enter the mid nM to low uM range, depending on the food (2). Compare that with the main serum antioxidants: ~200 uM for uric acid, ~100 uM for vitamin C, ~30 uM for vitamin E.

Thursday, February 10, 2011

My Gluten-Free January

I've been avoiding most gluten, particularly wheat, for over a year now. I never had obvious symptoms that I could clearly link to eating wheat, although I had my suspicions. I've made many changes to my diet over the last decade, and I feel much better than I did ten years ago, but it's hard to disentangle all the factors. I don't think I ever went an entire month without eating any gluten at all before this January. After posting Matt Lentzner's challenge to go gluten-free this January, I felt obligated to do it myself, so I signed up!

I succeeded in avoiding all gluten for the month of January, even though it was a pain at times. I felt good before January, and didn't start with any health or body weight problems, so there wasn't much to improve. I also felt good while strictly avoiding gluten this January, perhaps a little better than usual but it's hard to say.

At the end of the month, I did a blinded wheat challenge using the method I described in a previous post, which uses gluten-free bread as the placebo (1). I measured my blood sugar at 30-minute intervals after eating the bread, and recorded how I felt physically and emotionally for three days after each challenge.

The result? I think the bread gave me gas, but that's about it. I'm not even positive that was due to the wheat. My energy level was good, and I didn't experience any digestive pain or changes in transit time. There was no significant difference in my blood glucose response to the wheat bread versus the gluten-free bread.
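If you want to put a number on that kind of comparison rather than eyeballing it, one simple option is to summarize each glucose curve as incremental area under the curve. Here's a minimal Python sketch using the trapezoidal rule; the readings are placeholder values for illustration, not my actual measurements.

  def incremental_auc(readings_mg_dl, interval_min=30):
      # Incremental area under the glucose curve (mg/dL x min) relative to the
      # baseline (first) reading, using the trapezoidal rule. Dips below
      # baseline are clipped to zero, a common convention for iAUC.
      baseline = readings_mg_dl[0]
      auc = 0.0
      for a, b in zip(readings_mg_dl, readings_mg_dl[1:]):
          rise_a = max(a - baseline, 0)
          rise_b = max(b - baseline, 0)
          auc += (rise_a + rise_b) / 2 * interval_min
      return auc

  # Placeholder readings at 0, 30, 60, 90 and 120 minutes -- not real data
  wheat_readings = [85, 120, 135, 110, 92]
  gluten_free_readings = [88, 118, 130, 112, 95]
  print("wheat bread iAUC:", incremental_auc(wheat_readings))
  print("gluten-free bread iAUC:", incremental_auc(gluten_free_readings))

With only one challenge per bread there are no meaningful statistics to run on this; it's just a cleaner way to compare the two curves than reading numbers off a meter log.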

I decided that I didn't have any symptoms, so I celebrated by having a porter (1) with friends a few nights later. I slept poorly and woke up with mild digestive discomfort and gas. Then I ate wheat later in the week and slept poorly and got gas again. Hmmm...

Some people might say that the body adapts to any food, and wheat is no different. Go without it for a while, and the body has a tough time digesting it. But I can go for weeks without eating a potato, a chicken thigh or broccoli, and all will digest just fine when I eat them again.

I'm pretty sure I don't have a severe reaction to gluten. I think I'm going to stick with my mostly gluten-free habits, and eat it occasionally when I'm offered food in social situations.

Did anyone else do a blinded wheat challenge? Describe it in the comments!

Saturday, February 5, 2011

Assorted Thoughts About the 2010 Dietary Guidelines

In the past week, I've been rooting through the USDA's 2010 Dietary Guidelines (1). Here are a few of my thoughts.

Positive

One of the things I've been enjoying recently is watching health authorities shift away from a nutrient-oriented philosophy in favor of a more food-oriented philosophy. For example, I recently read a nice editorial by Drs. Dariush Mozaffarian and David S. Ludwig (not associated with the USDA) that encapsulates this (2). Here's a quote:
Nutritional science has advanced rapidly, and the evidence now demonstrates the major limitations of nutrient-based metrics for prevention of chronic disease. The proportion of total energy from fat appears largely unrelated to risk of cardiovascular disease, cancer, diabetes, or obesity. Saturated fat—targeted by nearly all nutrition-related professional organizations and governmental agencies—has little relation to heart disease within most prevailing dietary patterns. Typical recommendations to consume at least half of total energy as carbohydrate, a nutrient for which humans have no absolute requirement, conflate foods with widely divergent physiologic effects (eg, brown rice, white bread, apples). Foods are grouped based on protein content (chicken, fish, beans, nuts) despite demonstrably different health effects. With few exceptions (eg, omega-3 fats, trans fat, salt), individual compounds in isolation have small effects on chronic diseases. Thus, little of the information found on food labels’ “nutrition facts” panels provides useful guidance for selecting healthier foods to prevent chronic disease.

In contrast with discrete nutrients, specific foods and dietary patterns substantially affect chronic disease risk, as shown by controlled trials of risk factors and prospective cohorts of disease end points

Although this approach may seem radical, it actually represents a return to more traditional, time-tested ways of eating. Healthier food-based dietary patterns have existed for generations among some populations.
Tell it! Although he doesn't use the word nutritionism, that's basically what he's arguing against. Dr. Mozaffarian seems to represent the less reductionist school of nutrition, which is a more informed version of what nutrition pioneers such as Sir Edward Mellanby, Dr. May Mellanby, Dr. Weston Price and Sir Robert McCarrison advocated.

Although the 2010 guidelines are too focused on nutrients for my taste, they do spend some time talking about food groups and eating patterns, for example, recommending an increase in the consumption of vegetables, fruit, whole grains and seafood. They also recommend Mediterranean and plant-focused eating patterns. Although I don't think their recommendations quite hit the mark, they do reflect a shift in thinking.

Another thing I enjoyed about the Guidelines is the table on page 12 of chapter 2, which shows just how messed up the average American diet is. The number one source of calories in all age groups is "grain-based desserts". The next five in adults are yeast breads, chicken dishes, soda/sports drinks, alcohol and pizza. To see typical American food habits presented like this just blows me away. They call this the "obesogenic environment": the idea that we're surrounded by tasty but unhealthy food and situations that favor its consumption. I agree.

The Guidelines also contain a surprisingly accurate one-sentence review of the glycemic index literature:
Strong evidence shows that glycemic index and/or glycemic load are not associated with body weight; thus, it is not necessary to consider these measures when selecting carbohydrate foods and beverages for weight management.
Negative

The first problem is the creation of the category "solid fats and added sugars", abbreviated SoFAS. With the creation of this term, they lump pastured butter together with Crisco and Red Hots. If they've been hiding the evidence that pastured butter, virgin coconut oil or red palm oil contribute to heart disease, I'd like to see it so I can stop eating them!

Another problem is their list of recommendations to curb the obesity epidemic. They say:
The current high rates of overweight and obesity among virtually all subgroups of the population in the United States demonstrate that many Americans are in calorie imbalance—that is, they consume more calories than they expend. To curb the obesity epidemic and improve their health, Americans need to make significant efforts to decrease the total number of calories they consume from foods and beverages and increase calorie expenditure through physical activity.
Looks like we have Sherlock Holmes on the case. Now that we have this information, all we have to do is tell overweight people to eat less and they'll be lean again! What's that, they already know and it's not working?? Someone should tell the USDA.

Jokes aside, I do think energy balance is a huge issue, perhaps even the central issue in chronic disease risk in affluent nations. The basic problem is that Americans are eating more calories than is optimal, and they have a very hard time stopping. It's not because they have less willpower than their stoic ancestors; it's because their bodies have decided that overweight/obesity is the new lean, and they defend that higher level of fat mass against changes. Simply telling an overweight person to eat fewer calories, without changing the dietary context, is not very effective in the long term, due to compensatory mechanisms including hunger and increased metabolic efficiency (fewer calories burned for the same muscular exertion).

What does the USDA recommend to lose fat or maintain leanness?
  • Count calories. Doesn't work for most people, although I acknowledge that it is physically possible to lose fat (and lean mass) by restricting calories.
  • Reduce sweetened beverages. Thumbs up.
  • Serve smaller portions. As far as I know, this rests exclusively on very short-term studies that showed that food consumed at a single meal or three is reduced if portion size is smaller. I guess it can't hurt to try it, but I'm not convinced it will have any effect on long-term body fatness. I think restaurant portion sizes have probably increased because people eat more, rather than the other way around, although both could be true.
  • Eat foods that are less calorie dense. I think vegetables are healthy, but is it because they're less calorie-dense? Why is dietary fat intake generally not associated with obesity if it's the most calorie-dense substance? Why do many people lose body fat eating energy-dense low-carbohydrate diets? Not convinced, but I'm feeling open minded about this one.
  • Exercise more and watch less TV. Exercise is good. But don't let it make you hungry, because then you'll eat more!
Overall, I think their recommendations for fat loss are not very satisfying because they don't address the core reasons Americans aren't in energy balance. Eliminating sweetened beverages and exercising are the most solid advice they offered in my opinion. The rest strikes me as wishy-washy advice that's offered because they have to say something.

At one point, they talk about changes in the US diet that have corresponded with the obesity epidemic:
Average daily calories available per person in the marketplace increased approximately 600 calories, with the greatest increases in the availability of added fats and oils, grains, milk and milk products, and caloric sweeteners.
Let me edit that so it's more complete:
Average daily calories available per person in the marketplace increased approximately 600 calories per day, 250 calories of which were actually consumed (USDA and NHANES). Added fats increased, due to a large increase in seed oil intake, but total fat intake remained approximately the same because of a roughly equal decrease in fatty meat and whole milk consumption (USDA and NHANES). Grain intake, predominantly wheat, increased, as did the consumption of refined sweeteners, predominantly high-fructose corn syrup (USDA).
It reads a bit differently once you have a little more information, doesn't it? Animal fat intake declined considerably, and was replaced by seed oils, in parallel with the obesity and diabetes epidemics. Maybe it contributed, maybe it didn't, but why not just be forthright about it? People appreciate honesty.

Conclusion

Although the 2010 USDA Dietary Guidelines show some promising trends, and contain some good information, I hope you can find a better source than the USDA for your nutrition advice.

Monday, January 31, 2011

Gluten-free January Participants: Take the Survey!

Matt Lentzner, Janine Jagger and I have designed a survey for participants of Gluten-free January, using the online application StatCrunch. Janine is an epidemiologist who studies healthcare worker safety at the University of Virginia; she has experience designing surveys for data collection so we're glad to have her on board. The survey will allow us to systematically gather and analyze data on the results of Gluten-free January. It will be 100 percent anonymous-- none of your answers will be connected to your identity in any way.

This survey has the potential to be really informative, but it will only work if you respond! The more people who take the survey, the more informative it will be, even if you didn't avoid gluten for a single day. If only a small fraction of people respond, the results will be highly susceptible to "selection bias": perhaps the only people who respond are the ones who improved the most, which would skew the results.
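To make the selection-bias concern concrete, here's a toy Python simulation. All of the numbers are assumptions for illustration (a 30 percent true improvement rate, and people three times more likely to respond if they improved); it is not a model of our actual participants.

  import random

  random.seed(0)

  TRUE_IMPROVE_RATE = 0.30    # assumed fraction who truly improved
  P_RESPOND_IMPROVED = 0.60   # assumed response rate among those who improved
  P_RESPOND_NOT = 0.20        # assumed response rate among those who didn't

  responders = []
  for _ in range(500):  # 500 hypothetical participants
      improved = random.random() < TRUE_IMPROVE_RATE
      p_respond = P_RESPOND_IMPROVED if improved else P_RESPOND_NOT
      if random.random() < p_respond:
          responders.append(improved)

  observed_rate = sum(responders) / len(responders)
  print(f"true improvement rate:                    {TRUE_IMPROVE_RATE:.0%}")
  print(f"improvement rate among survey responders: {observed_rate:.0%}")

In runs like this, the improvement rate among responders comes out far above the true 30 percent. The higher the overall response rate, the less room there is for this kind of distortion, which is why every completed survey helps.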

Matt will be sending the survey out to everyone on his mailing list. Please complete it, even if you didn't end up avoiding gluten at all! There's no shame in it. The survey has responses built in for people who didn't avoid gluten. Your survey will still be useful!

We have potential data from over 500 people. After we crunch the numbers, I'll share them on the blog.

Thursday, January 27, 2011

The Diabetes Epidemic

The CDC just released its latest estimate of diabetes prevalence in the US (1):
Diabetes affects 8.3 percent of Americans of all ages, and 11.3 percent of adults aged 20 and older, according to the National Diabetes Fact Sheet for 2011. About 27 percent of those with diabetes—7 million Americans—do not know they have the disease. Prediabetes affects 35 percent of adults aged 20 and older.
Wow-- this is a massive problem. The prevalence of diabetes has been increasing over time, due to more people developing the disorder, improvements in diabetes care leading to longer survival time, and changes in the way diabetes is diagnosed. Here's a graph I put together based on CDC data, showing the trend of diabetes prevalence (percent) from 1980 to 2008 in different age categories (2):


These data are self-reported, and do not correct for differences in diagnosis methods, so they should be viewed with caution-- but they still serve to illustrate the trend. The increase in diabetes incidence began in the early 1990s. More than 90 percent of cases are type 2 diabetes. Disturbingly, the trend does not show any signs of slowing.
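For anyone who wants to build a similar graph themselves, here's a rough Python sketch using pandas and matplotlib. The file name and column names are assumptions standing in for whatever export of the CDC surveillance data you download; adjust them to match the actual file.

  import pandas as pd
  import matplotlib.pyplot as plt

  # Hypothetical CSV with columns: year, age_group, prevalence_percent
  df = pd.read_csv("cdc_diagnosed_diabetes_prevalence.csv")

  for age_group, sub in df.groupby("age_group"):
      sub = sub.sort_values("year")
      plt.plot(sub["year"], sub["prevalence_percent"], label=age_group)

  plt.xlabel("Year")
  plt.ylabel("Diagnosed diabetes prevalence (%)")
  plt.title("Self-reported diagnosed diabetes by age group (CDC data)")
  plt.legend()
  plt.show()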

The diabetes epidemic has followed on the heels of the obesity epidemic with 10-20 years of lag time. Excess body fat is the number one risk factor for diabetes*. As far as I can tell, type 2 diabetes begins with insulin resistance, which is probably due to energy intake exceeding energy needs (overnutrition); cells become insulin resistant as a defense mechanism to protect themselves against the damaging effects of too much glucose and fatty acids (3). In addition, type 2 diabetes requires a predisposition that prevents the pancreatic beta cells from keeping up with the greatly increased insulin needs of an insulin-resistant person**. Both factors are required; not all insulin-resistant people develop diabetes, because some people's beta cells are able to compensate by hypersecreting insulin.

Why does energy intake exceed energy needs in modern America and in most affluent countries? Why has the typical person's calorie intake increased by 250 calories per day since 1970 (4)? I believe it's because the fat mass "setpoint" has been increased, typically but not always by industrial food. I've been developing some new thoughts on this lately, and potentially new solutions, which I'll reveal when they're ready.


* In other words, it's the best predictor of future diabetes risk.

** Most of the common gene variants (of known function) linked with type 2 diabetes are thought to impact beta cell function (5).

Two Wheat Challenge Ideas from Commenters

Some people have remarked that the blinded challenge method I posted is cumbersome.

Reader "Me" suggested:
You can buy wheat gluten in a grocery store. Why not simply have your friend add some wheat gluten to your normal protein shake.
Reader David suggested:
They sell empty gelatin capsules with carob content to opacify them. Why not fill a few capsules with whole wheat flour, and then a whole bunch with rice starch or other placebo. For two weeks take a set of, say, three capsules every day, with the set of wheat capsules in line to be taken on a random day selected by your friend. This would further reduce the chances that you would see through the blind, and it prevent the risk of not being able to choke the "smoothie" down. It would also keep it to wheat and nothing but wheat (except for the placebo starch).
The reason I chose the method in the last post is that it directly tests wheat in a form that a person would be likely to eat: bread. The limitation of the gluten shake method is that it would miss a sensitivity to components in wheat other than gluten. The limitation of the pill method is that raw flour is difficult to digest, so it would be difficult to extrapolate a sensitivity to cooked flour foods. You might be able to get around that by filling the pills with powdered bread crumbs. Those are two alternative ideas to consider if the one I posted seems too involved.

Monday, January 24, 2011

Blinded Wheat Challenge

Self-experimentation can be an effective way to improve one's health*. One of the problems with diet self-experimentation is that it's difficult to know which changes are the direct result of eating a food, and which are the result of preconceived ideas about a food. For example, are you more likely to notice the fact that you're grumpy after drinking milk if you think milk makes people grumpy? Maybe you're grumpy every other day regardless of diet? Placebo effects and conscious/unconscious bias can lead us to erroneous conclusions.

The beauty of the scientific method is that it offers us effective tools to minimize this kind of bias. This is probably its main advantage over more subjective forms of inquiry**. One of the most effective tools in the scientific method's toolbox is a control. This is a measurement that's used to establish a baseline for comparison with the intervention, which is what you're interested in. Without a control measurement, the intervention measurement is typically meaningless. For example, if we give 100 people a pill that's supposed to cure belly button lint, we also have to give a different group placebo (sugar) pills. Only the comparison between drug and placebo groups can tell us whether the drug worked, because maybe the changing seasons, regular doctor's visits, or having your belly button examined once a week affects the likelihood of lint.

Another tool is called blinding. This is where the patient, and often the doctor and investigators, don't know which pills are placebo and which are drug. This minimizes bias on the part of the patient, and sometimes the doctor and investigators. If the patient knew he were receiving drug rather than placebo, that could influence the outcome. Likewise, investigators who aren't blinded while they're collecting data can unconsciously (or consciously) influence it.

Back to diet. I want to know if I react to wheat. I've been gluten-free for about a month. But if I eat a slice of bread, how can I be sure I'm not experiencing symptoms because I think I should? How about blinding and a non-gluten control?

Procedure for a Blinded Wheat Challenge

1. Find a friend who can help you.

2. Buy a loaf of wheat bread and a loaf of gluten-free bread.

3. Have your friend choose one of the loaves without telling you which he/she chose.

4. Have your friend take 1-3 slices and blend them with water in a blender until smooth. This is to eliminate differences in consistency that could allow you to determine what you're eating. Don't watch your friend do this-- you might recognize the loaf.

5. Pinch your nose and drink the "bread smoothie" (yum!). This is so that you can't identify the bread by taste. Rinse your mouth with water before releasing your nose. Record how you feel in the next few hours and days.

6. Wait a week. This is called a "washout period". Repeat the experiment with the second loaf, attempting to keep everything else about the experiment as similar as possible.

7. Compare how you felt each time. Have your friend "unblind" you by telling you which bread you ate on each day. If you experienced symptoms during the wheat challenge but not the control challenge, you may be sensitive to wheat.

If you want to take this to the next level of scientific rigor, repeat the procedure several times to see if the result is consistent. The larger the effect, the fewer times you need to repeat it to be confident in the result.
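If you do repeat it, a short script can handle the bookkeeping. Below is a Python sketch of one way to do it: your friend generates a hidden random order for each round, and after unblinding you count how many rounds were worse on wheat and run a crude sign test. The symptom scores in the example are made up for illustration; nothing about this is a validated protocol.

  import random
  from math import comb

  def make_schedule(n_rounds, seed=None):
      # The friend runs this and keeps the output hidden until all rounds are done.
      rng = random.Random(seed)
      return [rng.sample(["wheat", "gluten-free"], 2) for _ in range(n_rounds)]

  def sign_test_p(n_worse_on_wheat, n_rounds):
      # One-sided sign test: chance of at least this many "worse on wheat" rounds
      # if wheat and the gluten-free control were truly equivalent (ties excluded).
      return sum(comb(n_rounds, k) for k in range(n_worse_on_wheat, n_rounds + 1)) / 2 ** n_rounds

  print("schedule (friend's eyes only):", make_schedule(5, seed=42))

  # After unblinding: made-up symptom scores, one (wheat, gluten-free) pair per round.
  scores = [(3, 1), (2, 0), (1, 1), (4, 2), (2, 1)]
  worse_on_wheat = sum(1 for w, gf in scores if w > gf)
  non_tied = sum(1 for w, gf in scores if w != gf)
  print(f"worse on wheat in {worse_on_wheat} of {non_tied} non-tied rounds")
  print("one-sided sign test p-value:", round(sign_test_p(worse_on_wheat, non_tied), 3))

The sign test is about the crudest analysis available, but for a self-experiment it matches the point above: a large, consistent effect shows up convincingly after only a handful of rounds.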


* Although it can also be disastrous. People who get into the most trouble are "extreme thinkers" who have a tendency to take an idea too far, e.g., avoid all animal foods, avoid all carbohydrate, avoid all fat, run two marathons a week, etc.

** More subjective forms of inquiry have their own advantages.

Thursday, January 20, 2011

Eating Wheat Gluten Causes Symptoms in Some People Who Don't Have Celiac Disease

Irritable bowel syndrome (IBS) is a condition characterized by the frequent occurrence of abdominal pain, diarrhea, constipation, bloating and/or gas. If that sounds like an extremely broad description, that's because it is. The word "syndrome" is medicalese for "we don't know what causes it." IBS seems to be a catch-all for various persistent digestive problems that aren't defined as separate disorders, and it has a very high prevalence: as high as 14 percent of people in the US, although the estimates depend on what diagnostic criteria are used (1). It can be brought on or exacerbated by several different types of stressors, including emotional stress and infection.

Maelán Fontes Villalba at Lund University recently forwarded me an interesting new paper in the American Journal of Gastroenterology (2). Dr. Jessica R. Biesiekierski and colleagues recruited 34 IBS patients who did not have celiac disease, but who felt they had benefited from going gluten-free in their daily lives*. All participants continued on their pre-study gluten-free diet; in addition, each was provided with two slices of gluten-free bread and one gluten-free muffin per day. The investigators added isolated wheat gluten to the bread and muffins of half the study group.

During the six weeks of the intervention, patients receiving the gluten-free food fared considerably better on nearly every symptom of IBS measured. The most striking difference was in tiredness-- the gluten-free group was much less tired on average than the gluten group. Interestingly, they found that a negative reaction to gluten was not necessarily accompanied by the presence of anti-gluten antibodies in the blood, which is a test often used to diagnose gluten sensitivity.

Here's what I take away from this study:
  1. Wheat gluten can cause symptoms in susceptible people who do not have celiac disease.
  2. A lack of circulating antibodies against gluten does not necessarily indicate a lack of gluten sensitivity.
  3. People with mysterious digestive problems may want to try avoiding gluten for a while to see if it improves their symptoms**.
  4. People with mysterious fatigue may want to try avoiding gluten.
A previous study, published in 1981, showed that feeding volunteers a large dose of gluten every day for 6 weeks caused adverse gastrointestinal effects, including inflammatory changes, in relatives of people with celiac disease who did not themselves have the condition (3). Together, these two studies are the most solid evidence that gluten can be damaging in people without celiac disease, a topic that has not received much attention in the biomedical research community.

I don't expect everyone to benefit from avoiding gluten. But for those who are really sensitive, it can make a huge difference. Digestive, autoimmune and neurological disorders associate most strongly with gluten sensitivity. Avoiding gluten can be a fruitful thing to try in cases of mysterious chronic illness. We're two-thirds of the way through Gluten-Free January. I've been fastidiously avoiding gluten, as annoying as it's been at times***. Has anyone noticed a change in their health?


* 56% of volunteers carried HLA-DQ2 or DQ8 alleles, which is slightly higher than the general population. Nearly all people with celiac disease carry one of these two alleles. 28% of volunteers were positive for anti-gliadin IgA, which is higher than the general population.

** Some people feel they are reacting to the fructans in wheat, rather than the gluten. If a modest amount of onion causes the same symptoms as eating wheat, then that may be true. If not, then it's probably the gluten.

*** I'm usually about 95% gluten-free anyway. But when I want a real beer, I want one brewed with barley. And when I want Thai food or sushi, I don't worry about a little bit of wheat in the soy sauce. If a friend makes me food with gluten in it, I'll eat it and enjoy it. This month I'm 100% gluten-free though, because I can't in good conscience encourage my blog readership to try it if I'm not doing it myself. At the end of the month, I'm going to do a blinded gluten challenge (with a gluten-free control challenge) to see once and for all if I react to it. Stay tuned for more on that.

Thursday, January 13, 2011

Does Dietary Saturated Fat Increase Blood Cholesterol? An Informal Review of Observational Studies

The diet-heart hypothesis states three things:
  1. Dietary saturated fat increases blood cholesterol
  2. Elevated blood cholesterol increases the risk of having a heart attack
  3. Therefore, dietary saturated fat increases the risk of having a heart attack
To evaluate the second contention, investigators have examined the relationship between blood cholesterol and heart attack risk. Many studies including MRFIT have shown that the two are related (1):

The relationship becomes much more complex when you consider lipoprotein subtypes, density and oxidation level, among other factors, but at the very least there is an association between habitual blood cholesterol level and heart attack risk. This is what you would want to see if your hypothesis states that high blood cholesterol causes heart attacks.

Now let's turn to the first contention, the hypothesis that dietary saturated fat increases serum cholesterol. This idea is so deeply ingrained in the scientific literature that many authors don't even bother providing references for it anymore. When references are provided, they nearly always point to the same type of study: short-term controlled diet trials, in which volunteers are fed different fats for 2-13 weeks and their blood cholesterol measured (2)*. These are the studies on which the diet-heart hypothesis was built.

But now we have a problem. Nearly every high-quality (prospective) observational study ever conducted found that saturated fat intake is not associated with heart attack risk (3). So if saturated fat increases blood cholesterol, and higher blood cholesterol is associated with an increased risk of having a heart attack, then why don't people who eat more saturated fat have more heart attacks?

I'll begin to answer that question with another question: why do researchers almost never cite observational studies to support the idea that dietary saturated fat increases blood cholesterol? Surely if the hypothesis is correct, then people who habitually eat a lot of saturated fat should have high cholesterol, right? One reason may be that in most instances, when researchers have looked for a relationship between saturated fat intake and blood cholesterol, they haven't found one. Those findings have essentially been ignored, but let's have a look...

The Studies

It's difficult to do a complete accounting of these studies, but I've done my best to round them up. I can't claim this post is comprehensive, but I doubt I missed very many, and I certainly didn't exclude any that I came across. If you know of any I missed, please add them to the comments.

The earliest and perhaps most interesting study I found was published in the British Medical Journal in 1963 and is titled "Diet and Plasma Cholesterol in 99 Bank Men" (4). Investigators asked volunteers to weigh all food consumed at home for 1-2 weeks, and describe in detail all food consumed away from home. Compliance was good. This dietary accounting method was much more thorough than in most observational studies today**. Animal fat intake ranged from 55 to 173 grams per day, and blood cholesterol ranged from 154 to 324 mg/dL, yet there was no relationship whatsoever between the two. I'm looking at a graph of animal fat intake vs. blood cholesterol as I write this, and it looks like someone shot it with a shotgun at 50 yards. They twisted the data every which way, but were never able to squeeze even a hint of an association out of it:
Making the most out of the data in other ways- for example, by analysis of the men very stable in their diets, or in whom weighing of food intake was maximal, or where blood was taken close to the diet [measurement]- did not increase the correlation. Because the correlation coefficient is almost as often negative as positive, moreover, what is being discussed mostly is the absence of association, not merely association that is unexpectedly small.
The next study to discuss is the 1976 Tecumseh study (5). This was a large cardiovascular observational study conducted in Tecumseh, Michigan, which is often used as the basis for comparison for other cardiovascular studies in the literature. Using the 24 hour dietary recall method, including an analysis of saturated fat, the investigators found that:
Cholesterol and triglyceride levels were unrelated to quality, quantity, or proportions of fat, carbohydrate or protein consumed in the 24-hr recall period.
They also noted that the result was consistent with what had been reported in previously published studies, including the Evans County study (6), the massive Israel Ischemic Heart Disease Study (7) and the Framingham study. One of the longest-running, most comprehensive and most highly cited observational studies, the Framingham study was organized by Harvard investigators and continues to this day. When investigators analyzed the relationship between saturated fat intake, serum cholesterol and heart attack risk, they were so disappointed that they never formally published the results. We know from multiple sources that they found no significant relationship between saturated fat intake and blood cholesterol or heart attack risk***.

The next study is the Bogalusa Heart Study, published in 1978, which studied the diet and health of 10 year old American children (8). This study found an association by one statistical method, and none by a second method****. They found that the dietary factors they analyzed explained no more than 4% of the variation in blood cholesterol. Overall, I think this study lends little or no support to the hypothesis.

Next is the Western Electric study, published in 1981 (9). This study found an association between saturated fat intake and blood cholesterol in middle-aged men in Chicago. However, the correlation was small, and there was no association between saturated fat intake and heart attack deaths. They cited two other studies that found an association between dietary saturated fat and blood cholesterol (and did not cite any of the numerous studies that found no association). One was a very small study conducted in young men doing research in Antarctica, which did not measure saturated fat but found an association between total fat intake and blood cholesterol (10). The other studied Japanese people in Japan (Nagasaki and Hiroshima), and Japanese Americans in Hawai'i and California (11).

This study requires some discussion. Published in 1973, it found a correlation between saturated fat intake and blood cholesterol in Japan and Hawai'i, but not in California. The strongest association was in Japan, where going from 5 to 75 g/day of saturated fat (a 15-fold change!) was associated with an increase in blood cholesterol from about 175 to 200 mg/dL. However, I don't think this study offers much support to the hypothesis upon closer examination. Food intake in Japan was collected by 24-hour recall in 1965-1967, when the diet was mostly white rice in some areas. The lower limit of saturated fat intake in Japan was 5g/day, 1/12th of what was typically eaten in Hawai'i and California, and the Japanese average was 16g, with most people falling below 10g. That is an extraordinarily low saturated fat intake. I think a significant portion of the Japanese in this study, living in the war-ravaged cities of Nagasaki and Hiroshima, were over-reliant on white rice and perhaps bordering on malnourishment.

In Japanese-Americans living in Hawai'i, over a range of saturated fat intakes between 5 and 110 g/day, cholesterol went from 210 to 220 mg/dL. That was statistically significant but it's not exactly knocking my socks off, considering it's a 22-fold change in saturated fat intake. In California, going from 15 to 110 g/day of saturated fat (7.3-fold change) was not associated with a change in blood cholesterol. Blood cholesterol was 20-30 mg/dL lower in Japan than in Hawai'i or California at any given level of saturated fat intake (e.g., Japanese eating 30g per day vs. Hawai'ians eating 30g per day). I think it's probable that saturated fat is not the relevant factor here, or at least it's being trumped by other factors. An equally plausible explanation is that people in the very low range of saturated fat intake are the rural poor who eat an impoverished diet that differs in many ways from the diets at the upper end of the range.

The most recent study was the Health Professionals Follow-up Study, published in 1996 (12). This was a massive, well-funded study that found no hint of a relationship between saturated fat intake and blood cholesterol.

Conclusion

Of all the studies I came across, only the Western Electric study found a clear association between habitual saturated fat intake and blood cholesterol, and even that association was weak. The Bogalusa Heart study and the Japanese study provided inconsistent evidence for a weak association. The other studies I cited, including the bank workers' study, the Tecumseh study, the Evans County study, the Israel Ischemic Heart study, the Framingham study and the Health Professionals Follow-up Study, found no association between the two factors.

Overall, the literature does not offer much support for the idea that long term saturated fat intake has a significant effect on the concentration of blood cholesterol. If it's a factor at all, it must be rather weak, which is consistent with what has been observed in multiple non-human species (13). I think it's likely that the diet-heart hypothesis rests in part on an over-interpretation of short-term controlled feeding studies. I'd like to see a more open discussion of this in the scientific literature. In any case, these controlled studies have typically shown that saturated fat increases both LDL and HDL, so even if saturated fat did have a small long-term effect on blood cholesterol, as hinted at by some of the observational studies, its effect on heart attack risk would still be difficult to predict.

The Diet-heart Hypothesis: Stuck at the Starting Gate
Animal Models of Atherosclerosis: LDL


* As a side note, many of these studies were of poor quality, and were designed in ways that artificially inflated the effects of saturated fat on blood lipids. For example, using a run-in period high in linoleic acid, or comparing a saturated fat-rich diet to a linoleic acid-rich diet, and attributing the differences in blood cholesterol to the saturated fat. Some of them used hydrogenated seed oils as the saturated fat. Although not always consistent, I do think that overall these studies support the idea that saturated fat does have a modest ability to increase blood cholesterol in the short term.

** Although I would love to hear comments from anyone who has done controlled diet trials. I'm sure this method had flaws, as it was applied in the 1960s.

*** Reference cited in the Tecumseh paper: Kannel, W et al. The Framingham Study. An epidemiological Investigation of Cardiovascular Diseases. Section 24: The Framingham Diet Study: Diet and the Regulation of Serum Cholesterol. US Government Printing Office, 1970.

**** Table 5 shows that the Pearson correlation coefficient for saturated fat intake vs. blood cholesterol is not significant; table 6 shows that children in the two highest tertiles of blood cholesterol have a significantly higher intake of saturated fat, unsaturated fat, total fat and sodium than the lowest tertile. The relationship between saturated fat and blood cholesterol shows no evidence of dose-dependence (saturated fat intake by cholesterol tertile = 15.6 g, 18.4 g, 18.5 g). The investigators made no effort to adjust for confounding variables.

Tuesday, January 11, 2011

Dr. Fat

A blog reader recently made me a Wordle from Whole Health Source. A Wordle is a graphical representation of a text, where the size of each word represents how often it appears. Click on the image for a larger version.

Apparently, the two most common words on this blog are "Dr" and "fat." It occurred to me that Dr. Fat would be a great nom de plume.