May News Round-Up

· Potatoes and hypertension
· Antibiotics, depression and phages
· Gluten debate
· Low-carb diets good for diabetes
· Salt does not raise BP (yet again)
· Medical errors – high death toll
· Vitamin D & sunshine
· 50y of changing UK food habits

Potato consumption linked to raised blood pressure

The Guardian (17th May) covered this story as well as any: a study found that those who ate potatoes four or more times per week had a small but significant increased risk of hypertension (high blood pressure) compared with those eating them less than once per week. The link applied to boiled, mashed or baked potatoes and chips (aka French fries) but, weirdly, not to crisps (aka potato chips in the USA). The study authors suggest the effect is caused by the high carbohydrate content raising blood sugar. Interestingly, they point to trials showing that high-protein and high-fat diets lower blood pressure. (See the BMJ paper here.)

Grass-Fed Nation: Book Review

The Telegraph (26th May) reviews a new book by Graham Harvey, scriptwriter of The Archers’ agricultural storylines and one of the excellent speakers at our Grass Fed Meat Revolution in 2014.

Unfortunately, British dairy farming is moving in the opposite direction with the creeping introduction of US style mega-dairies (now numbering 100+), where cows are raised permanently indoors. The Telegraph (1st June) reports on this disturbing trend.

Antibiotics, depression and resistance – Phages to the rescue?

The Mail (24th May) reports on Israeli research showing that just one course of antibiotics is linked to an increased incidence of depression, probably due to changes in gut microbes.

Even more depressing is the news that a woman in the US was found to have a bacterial infection that is resistant to colistin – the antibiotic of last resort (BBC News, 27th May).

The belated fightback by British doctors, however, is starting to bite, with The Telegraph (25th May) reporting that GPs have slashed their use of antibiotics in the last 12 months. Was this due to their growing awareness of over-prescription and a public-spirited determination to tackle the problem? Or was it because the government brought in financial incentives to encourage them? Oh… the latter. Well I never.

Phages attacking a bacterium (image: Wikimedia)

With few new antibiotics on the horizon, research is turning to alternative means of treating infections, including bacteriophages – viruses that target and kill specific bacteria. The Independent (26th May) reports one such advance, with a phage found in a pond that attacks a type of multi-drug-resistant bacterium. Interestingly, phage therapy was widely developed in the former USSR during the Cold War, as they did not have access to western antibiotics. Phage therapy is still widely used in Russia, Georgia and Poland. You can read more in this 2014 Nature article.

Gluten controversy

The gluten-free ‘fad’ comes in for criticism with headlines such as “Gluten-Free Diets Are Not Necessarily Healthier, Doctors Warn” (Live Science, 25th May). Yes indeed: gluten-free bread, biscuits, cakes and other simulacra are often chock-full of additives in an attempt to recreate gluten’s unique glutinousness. Additionally, gluten-free flours (like rice and corn) can be high in heavy metals such as arsenic, which has resulted in at least one recorded case of arsenic poisoning. So, yes, we concur: avoid all grains and don’t go shopping down the gluten-free aisle! Eat more fish, meat, fruit, nuts and vegetables – real food, as opposed to ‘products’ or, as I like to call them, ‘food-like substances’.

The Mail (16th May) reports that supermarket gluten free bread is high in fat (shock horror), suggesting that this is a problem. To my mind, it’s not the fat you should worry about (although I wouldn’t reckon on the quality of their industrial oils), it’s the grain and chemical concoctions that are dodgy. My coconut keto-bread recipe is mega-high fat and grain free. Alternatively, my almond bread is versatile, delicious and can be toasted and made into sandwiches. Both are low GI, nutrient dense alternatives, not fake food.

In the same Mail Online article is a video reporting on links between gluten and depression. Worth a click:

[Video: gluten and depression]

High-fat, low-carb diet takes on the mainstream – round two, ding ding!

The National Obesity Forum came out fighting this month with “Official advice on low-fat diet and cholesterol is wrong, says health charity” (the Guardian, 23rd May). They argue (as do I) that type 2 diabetes can be better managed on a low-carb diet rather than the recommended low-fat approach. However, this has led to a string of pugilistic condemnations from the nutritional orthodoxy: Public Health England weighed in, calling the report “irresponsible”, while the British Dietetic Association warned that advising people to eat more saturated fat “could be extremely dangerous”. (The Observer, 28th May)

However, we think The Telegraph (31st May) gets in the final knock-out punch with “Low-carb diet helps control diabetes, new study suggests”.

That study was conducted after an online revolt by patients in which 120,000 people signed up to the “low-carb” diet plan launched by diabetes.co.uk in a backlash against official advice.

By rejecting guidelines and eating a diet low in starchy foods but high in protein and “good” saturated fats, such as olive oil and nuts, more than 80 percent of the patients said that they had lost weight, with 10 percent shedding 9kg or more.

More than 70 per cent of participants experienced improvements of blood glucose, and a fifth said they no longer needed drugs to regulate blood glucose by the end of the ten-week plan. (my emphasis)

KERPOW! Take that, British Dietetic Association. WHAM! Stick that in your low-fat pipe, Public Health England.

U turn on salt recommendations? Probably not…

Further challenges to the orthodoxy were found in Mail Online (20th May), reporting on a study published in The Lancet in which “a global study found that, contrary to past belief, low-salt diets may not be beneficial. Rather, they can increase the risk of cardiovascular disease and death, compared with average salt consumption.”

Of course this led to the usual condemnatory remarks from WHO representatives who labelled the study as ‘bad science’.

My view is that lowering salt may be beneficial for some individuals with hypertension, especially those with genetic SNPs affecting salt metabolism, but for most of the population there is little evidence of benefit. You can see the numerous conflicting studies linked to salt here, and read our post on salt here.

Iatrogenic deaths

Medical errors have been identified as the third leading cause of death in the US – behind only heart disease and cancer – causing over 251,000 deaths annually, according to researchers at Johns Hopkins University. (Care2, 5th May; BMJ, 3rd May)

According to the study, “Medical error has been defined as an unintended act (either of omission or commission) or one that does not achieve its intended outcome, the failure of a planned action to be completed as intended (an error of execution), the use of a wrong plan to achieve an aim (an error of planning), or a deviation from the process of care that may or may not cause harm to the patient.” Amazingly, no form of medical error ever appears as a cause of death on a death certificate.

The situation is no rosier on this side of the pond, with the Mail Online (10th May) reporting: “Thousands of heart victims killed by poor care: More than 33,000 people died needlessly over the past few years because of shocking flaws in NHS treatment”. I don’t need telling about the hundreds of patients who have come to me over the years after being so poorly served by an incompetent NHS; indeed, my own mother died from heart surgery that ‘went wrong’. Her surgeon humbly admitted to me personally that if he hadn’t done the operation she would still be alive. For all that, he still absconded from the hospital – presumably back to Egypt – and I have not pursued that story further!

Vitamin D and Sunshine

Well, we had a handful of sunny days in May, so I suppose we can’t complain…

Our related post: Human photosynthesis – Beyond vitamin-D

Info-graphic of the month: Changes in British food shopping, 1974-2014

[Graph: changes in UK food shopping by category, 1974-2014 (The Mail)]

The above graph, courtesy of The Mail (4th May), shows changing UK food habits over the past four decades (1974-2014). Interesting! What do you think?

Tweet of the month

[Embedded tweet]

Did cooked tubers drive human evolution?

Following last week’s post on the misrepresentation of the Paleo diet in the press, it’s time to examine in more detail the claims of the scientists that sent the newspapers into such a feeding frenzy.

[Graph: hominid brain size over evolutionary time]

The recent paper entitled “The Importance of Dietary Carbohydrate in Human Evolution” by Hardy et al argues that the rapid expansion of the human brain 800,000 years ago was fuelled by the consumption of cooked starchy tubers.

The paper’s hypothesis is based around four key arguments:

  1. Brains need glucose, and starchy foods are an abundant source of glucose
  2. Starchy tubers are a reliable food resource available year round in Africa
  3. Cooking transforms starches in tubers making them far more digestible, and
  4. Humans are adapted to eating starches. Unlike other primates, they carry multiple copies of the salivary amylase genes.

I’ll look at each of these four points in turn.

1. Do brains ‘need glucose’?

That ‘brains need glucose’ is of course physiologically true: the brain is indeed dependent on a steady supply of glucose. The problem with this oft-quoted factoid is that it is used to support a seemingly logical but false conclusion: ‘…so we need to eat carbohydrates’. As Hardy’s paper acknowledges, the human body is perfectly capable of manufacturing all of the glucose it needs from fats and proteins by the process of gluconeogenesis (literally, making new glucose). This process is ramped up whenever carbohydrate intake is limited. How else would it be possible for humans to survive famine for weeks on end if their brains were dependent on a constant supply of external carbohydrates as a glucose source? To get round this inconvenient truth, Hardy et al argue that gluconeogenesis is inefficient, so humans would have preferred concentrated sources of glucose from starchy tubers. This may be a valid argument, but to my mind it suffers from several weaknesses, especially as an explanation for brain evolution, in part because a very low-carb (‘ketogenic’) diet has been shown time and again to protect the brain.

A ketogenic diet is one in which carbohydrate intake is severely restricted (e.g. to less than 30g of carbs per day). In such circumstances, which mimic starvation, the body not only manufactures its own glucose in precisely the appropriate amount, but also produces ketones by breaking down dietary or body fat. This process kicks in during starvation, fasting, or when an individual eats a very low-carb/high-fat ketogenic diet.

The brain is perfectly happy to run on a 50/50 mix of ketones and glucose during such times. Indeed, this reduced-glucose state appears to provide multifactorial neurological protection: a ketogenic diet is one of the primary treatments for epilepsy and is currently under trial as an adjunct to cancer therapy (especially for glioblastoma).
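To put some very rough numbers on this, here is a minimal back-of-envelope sketch in Python. It is my own illustration, not taken from any of the papers cited; the 120 g/day figure for brain glucose use on a mixed diet is a commonly cited approximation and is an assumption here, while the 50/50 fuel mix and the sub-30g carb limit come from the text above.

```python
# Back-of-envelope sketch (my own illustration, not from Hardy et al):
# roughly how much glucose the brain still needs on a ketogenic diet.

BRAIN_GLUCOSE_G_PER_DAY = 120   # assumption: commonly cited adult figure on a mixed diet
KETONE_SHARE = 0.5              # ~50/50 ketone/glucose mix in deep ketosis (see text)
DIETARY_CARBS_G = 30            # upper bound for a ketogenic diet (see text)

glucose_needed = BRAIN_GLUCOSE_G_PER_DAY * (1 - KETONE_SHARE)
from_gluconeogenesis = max(0.0, glucose_needed - DIETARY_CARBS_G)

print(f"Brain glucose requirement in ketosis: ~{glucose_needed:.0f} g/day")
print(f"Of which gluconeogenesis (from protein and glycerol) must supply: ~{from_gluconeogenesis:.0f} g/day")
```

On these assumed figures the shortfall that gluconeogenesis must cover is modest – the sort of quantity the body routinely makes from protein and glycerol – which is the point of the argument above.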

There are many papers on the subject if you want to look into it in more depth.

With a low-carb, low-glucose, ketogenic state proving itself so spectacularly protective in neurological problems and for recovery from brain injury, trauma and stroke, is it likely that a high-carb diet was a key driver of brain evolution?

Another point against Hardy’s hypothesis is the unique nature of human infants at birth – our large-brained babies have a layer of subcutaneous fat, unique among primates, that provides ketones for brain fuel before, during and after birth. (Cunnane & Crawford, 2014)

2. What are African wild tubers really like?

One of Hardy’s central arguments is that cooking significantly increased the digestibility of starch-rich tubers, releasing more glucose for brain evolution. By way of example she states that cooking potatoes increases the digestibility of the starch ‘by up to twenty fold’. That’s impressive, but potatoes and other modern root vegetables have undergone artificial selection to create the easily digestible varieties we know today. Potatoes are indigenous to the Americas, and are toxic if eaten raw, so are probably not a good model of the tubers available during hominid evolution in Africa.

Are potatoes and other modern root vegetables anything like the tubers that would have been available to humans during the period of brain evolution in Africa?

A key question, then, is whether Hardy’s assertions hold true for the wild tuber species typical of African ecosystems. Luckily, we have some new data with which to test her hypothesis.

In a paper published earlier this year in the American Journal of Physical Anthropology, Schnorr et al assessed the digestibility of wild tubers exploited by the Hadza foragers of Tanzania. After observing the Hadza methods of cooking and eating wild tubers, they took samples to the lab, where they simulated Hadza cooking, mastication and digestion in the mouth, stomach and small intestine. Their findings are interesting, challenge several of Hardy’s assertions, and show just how different such tubers are from modern cultivated root vegetables.

Consider how the Hadza eat these wild tubers:

First they roast them in open fires, one end at a time, turning them occasionally, for a total of 5 to 15 minutes of cooking. Next they peel them, then bite off a piece and chew it for anywhere from half a minute to three minutes (just pause a moment to consider that!). Then they spit out a wad of fibres called a quid. Depending on the particular tuber, the edible fraction varies from 20% to 80%, the remainder being inedible fibre or peel. It is clear from this that African wild tubers are not remotely like any modern root vegetable – especially potatoes.


Hadza man roasting ekwa (Vigna frutescens) tubers – typical of those available during hominid evolution. Recent analysis by Schnorr et al found that only a small fraction of the peeled tubers’ glucose content (26 ± 8%) was available for digestion. Image courtesy of Gary Aitken, 2014.

For the purpose of comparison, I’ve cobbled together some approximate available glucose values for the Hadza tubers and common supermarket tubers (in both cases fructose content has been excluded as both Hardy’s and Schnorr’s papers focus solely on glucose). The quantities for the Hadza tubers are based on 100g of edible tuber after peeling and discarding the fibrous quid, meaning that the glucose would actually be even more dilute than these figures suggest if they were for the whole tuber. You can see that these Hadza tubers do not come close to the carbohydrate density of modern starchy vegetables like potatoes, being closer to carrots.

Table 1. Available glucose and fibre content of Hadza vs modern tubers (g/100g)

| Hadza tubers (peeled) | Glucose | Fibre* | Modern tubers (unpeeled) | Glucose | Fibre |
|---|---|---|---|---|---|
| Mak’alitako (E. entennulifa) | 2 | 10 | Carrot | 4 | 3 |
| Shumuko (V. pseudolablab) | 2 | 15 | Parsnip | 9 | 5 |
| Ekwa (V. frutescens) | 3 | 20 | Potato | 17 | 2 |
| Panjuko (I. transvaalensis) | 8 | 1 | Sweet potato | 18 | 3 |

* Approximate fibre content of the edible portion of Hadza tubers

What is clear from these data is that wild Hadza tubers are significantly lower in glucose and higher in fibre than modern root vegetables. The highest level of glucose in the Hadza tubers is comparable with parsnips – which are hardly ‘starchy’. Potatoes – which definitely are starchy – are the least similar to wild African tubers.
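For anyone who likes to poke at the numbers, here is a minimal Python sketch – my own, built from the approximate Table 1 values plus the ~26% availability figure quoted above for ekwa – comparing what the two groups of tubers actually deliver:

```python
# Minimal sketch (my own, using the approximate Table 1 values) comparing
# Hadza and modern tubers, plus the ~26% availability figure quoted for ekwa.

hadza = {  # peeled edible portion, g per 100 g
    "mak'alitako": {"glucose": 2, "fibre": 10},
    "shumuko":     {"glucose": 2, "fibre": 15},
    "ekwa":        {"glucose": 3, "fibre": 20},
    "panjuko":     {"glucose": 8, "fibre": 1},
}
modern = {  # unpeeled, g per 100 g
    "carrot":       {"glucose": 4,  "fibre": 3},
    "parsnip":      {"glucose": 9,  "fibre": 5},
    "potato":       {"glucose": 17, "fibre": 2},
    "sweet potato": {"glucose": 18, "fibre": 3},
}

def summarise(name, group):
    glucose = [v["glucose"] for v in group.values()]
    fibre = [v["fibre"] for v in group.values()]
    print(f"{name}: glucose {min(glucose)}-{max(glucose)} g, "
          f"fibre {min(fibre)}-{max(fibre)} g per 100 g")

summarise("Hadza tubers (peeled)", hadza)
summarise("Modern tubers (unpeeled)", modern)

# Glucose that actually reaches digestion: ~26% for ekwa (Schnorr et al)
# versus close to 100% for a modern baked potato.
print(f"ekwa:   ~{hadza['ekwa']['glucose'] * 0.26:.1f} g available per 100 g")
print(f"potato: ~{modern['potato']['glucose'] * 1.0:.0f} g available per 100 g")
```

Even before digestibility is considered, the Hadza tubers sit at the bottom of the glucose range; once the availability figures are applied, a modern potato delivers many times more glucose per 100g.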

3. Does cooking significantly affect African tuber digestibility?

So what about Hardy’s hypothesis that cooking tubers increased the bio-availability of glucose, giving humans a boost in the brain-development game? Schnorr et al found that although the Hadza roasted their tubers, this did little to improve digestibility; the main benefit of roasting seemed to be to assist with peeling. Of the four tuber species eaten by the Hadza, two showed no change in glucose availability with cooking, in a third the available glucose increased from 38% to 48%, and in the fourth it actually decreased, from 44% to 34%. This makes Hardy’s hypothesis look quite shaky.

Furthermore, the authors estimated that cooking and chewing liberated only 1/3 to 2/3 of the glucose content of the tubers, and classified them as “very resistant to digestion”. By contrast, a modern baked potato requires only seconds of chewing and will be almost 100% digested, spiking blood glucose levels dramatically within minutes of ingestion. Yet potatoes were used extensively in the newspaper articles reporting on Hardy’s hypothesis.

It is worth considering what happens to the remaining 1/3 to 2/3 of undigested starch in the Hadza tubers. It is not necessarily lost: on reaching the colon, bacteria get to work on it, converting it into easily absorbed short-chain fatty acids – fats, that is. However, Hardy’s hypothesis emphasises tuber eating as a plentiful supply of glucose, not fatty acids, to fuel hominid brain expansion. What the Hadza study shows is that their tubers are a low-glycaemic food, in no way comparable to modern starchy tubers like potatoes. There is actually no contemporary food with which to compare them; we just don’t eat anything so fibrous.

A further interesting finding in the Hadza tuber study is summarised in this graph:

[Graph: Hadza tuber glucose content vs resistance to digestion (Schnorr et al)]

Schnorr et al found that larger tubers – those with the highest levels of starch (bottom axis) – were the most resistant to digestion (vertical axis). They concluded that the human digestive system is simply unable to cope with large quantities of these wild tubers in one sitting.

This means that absorption of glucose may be inhibited by the higher glucose content, such as found in starch, since these glucose polymers can resist digestion or overwhelm the enzyme activity in the small intestine. Therefore, a “glucose”-rich tuber does not necessarily mean more glucose, proportionally, is directly available to the consumer.

If larger roots provide a lower percentage of available glucose, Hardy’s hypothesis appears less tenable still, relying as it does on the widespread availability of starchy roots to fuel an increasingly metabolically expensive brain. In contradistinction, Schnorr’s work suggests humans have a physiological limit on the quantity of wild tubers they can digest.

4. What is the significance of the human salivary amylase adaptations?

At some point in human evolution, mutations took place resulting in multiple copies of the genes for salivary amylase, the enzyme that breaks down starch. It had been assumed that this took place during the Neolithic switch to agriculture, as an adaptation to the new staple – starch-rich cereal grains. However, recent evidence indicates this adaptation goes back further in human evolution, although how far back remains a matter of debate.

Hardy et al suggest the multiple amylase gene copy number mutation arose in conjunction with human exploitation of cooked tubers.

We propose that after cooking became widespread, starch digestion became the rate-limiting step in starch utilization, and the coevolution of cooking and copy number variation (CNV) in the [amylase] gene(s) increased availability of preformed dietary glucose, permitting the acceleration in brain size increase observed from the Middle Pleistocene onward. – Hardy et al, 2015

This may or may not turn out to be correct. Even if it is, however, it does little to rescue Hardy’s hypothesis because, as we have seen, these tubers have to be roasted and then masticated for minutes to extract minimal amounts of glucose. Rather than being a driving force for brain expansion, exploitation of tubers appears more like a diversification strategy – a valuable source of low-glycaemic calories that would have helped meet daily caloric needs when no better sources of nutrition were available, not the massive brain-fuelling dose of glucose we associate with modern starchy vegetables.

Tubers are a reliable fallback food


Hadza food preferences, by sex. From Marlowe & Berbesque, “Tubers as fallback foods and their impact on Hadza hunter-gatherers”, American Journal of Physical Anthropology, 2009.

Hardy et al argue that tubers would have been a highly prized food for our ancestors; however, this does not appear to be the case among modern hunter-gatherers. For example, the Hadza rely on five main categories of food: tubers, berries, meat, baobab and honey. In a 2009 study in the American Journal of Physical Anthropology, Marlowe & Berbesque found that the Hadza rated tubers as the least preferred of these foods, ranking every other category above them.

As such, tubers are seen very much as a fallback food – something to eat when there is not enough of the good stuff, to top up calories or to avoid starving.

The use of tubers as a fallback food is also observed in primates, which will resort to collecting underground storage organs on land or from water plants when preferred foods are in short supply. (Savanna chimpanzees use tools to harvest the underground storage organs of plants, R. Adriana Hernandez-Aguilar et al, 2007)

Schnorr et al also found that, on average, tubers made up only about 20% by weight of the food brought back to camp each day. The caloric contribution of tubers is not well established, but is likely to be low. Indeed, Schnorr and his team found that the most commonly eaten Hadza tubers were 80-90% water by weight, indicating that they make only a small contribution to daily calories, perhaps as little as 5 or 10%. They suggest these watery tubers “may actually be more important for their moisture rather than caloric contribution”.
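A hedged back-of-envelope check makes the point concrete. The daily haul figure below is purely an illustrative assumption of mine; the 20% tuber share and 80-90% water content come from the text above.

```python
# Back-of-envelope check (my own numbers, not from Schnorr et al):
# why watery tubers contribute mostly moisture rather than calories.

FOOD_TO_CAMP_G = 2000        # assumption: illustrative daily haul per person
TUBER_SHARE = 0.20           # ~20% of food mass is tubers (see text)
WATER_FRACTION = 0.85        # tubers are 80-90% water (see text)

tuber_mass = FOOD_TO_CAMP_G * TUBER_SHARE
dry_matter = tuber_mass * (1 - WATER_FRACTION)
water_mass = tuber_mass - dry_matter

print(f"Of ~{tuber_mass:.0f} g of tubers brought to camp, only ~{dry_matter:.0f} g is dry matter;")
print(f"the remaining ~{water_mass:.0f} g is water.")
```

On these illustrative figures, the tuber portion of the haul amounts to only a few tens of grams of dry matter – consistent with a caloric contribution in the low single-digit percentages.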

So where does this leave the hypothesis of Hardy and her team? Taking into account all of the above factors, and realising that wild African tubers are nothing like the familiar spud, the idea that cooked starchy tubers drove human brain evolution starts to look far less plausible. Certainly, tubers have always been widely available, but apparently not preferred, at least if the Hadza are anything to go by.

Their primary role as a food of last resort makes sense once their low-glucose, high-fibre nature is understood and the image of the modern potato is banished from one’s consideration. The effects of cooking on starches, and the salivary amylase adaptations – assuming they did indeed arise early in human brain expansion – begin to look less like evidence of a key driving force in human evolution and more like adaptations for survival in extremis, when higher-quality foods were not available.

Rather than describing the driving force behind human brain expansion, Hardy et al‘s hypothesis might well come to be seen as addressing little more than a rather peripheral aspect of dietary adaptation.

An alternative hypothesis on the role of wild tubers in human brain evolution

As you can see from the graph at the top of this post, human brain expansion began two million years earlier than Hardy’s date of 800,000 years ago. Isotopic evidence from that period shows that early humans (Australopithecus) had ‘C4 plants’ in their diet, which primarily indicates grasses and sedges (Lee-Thorp J et al, 2012).
[Image: water chestnut plant]

This is distinct from other primates of the time, which were eating mainly C3 plants – indicative of leaves, nuts and fruits from forests. During this period a drier climate was creating more grassland, and early humans seem to have been taking advantage of this new savanna environment. However, there are two ways the C4 isotope signature could appear in australopiths – either they were eating grasses directly, or they were eating herbivores that ate those grasses. Quite possibly both took place.

What is not likely, however, is that early humans ate the foliage or seeds of grasses – these being virtually indigestible – but rather that they were eating the underground storage organs (bulbs and tubers) of various species of sedge (Laden G, Wrangham R, 2005). Sedges grow in and around water, and often have tubers which can easily be pulled from the soft mud at the bottom of ponds or riversides (see image, right).

Although small, they can be collected year round and, importantly, can be eaten raw. This makes tuber eating a possible factor in early human brain expansion, prior to the domestication of fire.

You may already have eaten such tubers yourself: the most common culinary varieties are water chestnuts (Eleocharis dulcis) – popular in oriental cooking – and, less well known, tiger nuts (Cyperus esculentus).

Water chestnuts and tiger nuts are quite high in digestible carbohydrates and fibre, and along with similar species they present a far more realistic food source for early humans. Indeed, it has been estimated that just 150-200g of tiger nuts contains sufficient essential fats to satisfy human needs (Nathaniel J. Dominy, ‘Hominins living on the sedge’, 2012). The fact that other primates use them as fallback foods suggests that on their own they are unlikely to account for the unique evolutionary force that drove human brain expansion over the subsequent 2.5 million years, but they may have contributed to getting the process started.


Namibian boy gathering frogs for cooking.

A much more exciting hypothesis is that early humans specialised in water resources of all kinds. This would have provided fish and shellfish, along with water tubers from sedges and Cyperus species.

Broader water resource utilisation may also explain why humans became bipedal – to wade (Kuliukas, 2002). Whilst sedge tubers may have provided glucose to fuel the early brain, we have seen that this is not necessarily a prerequisite for brain expansion, as gluconeogenesis and ketogenesis can provide all the fuel a human brain needs. More important for brain development is access to sources of brain-building fatty acids and minerals (‘brain-selective nutrients’) such as iodine, iron, zinc, selenium and omega-3 fatty acids (especially DHA) – dietary components that in all other land mammals place significant limits on the relative size of the brain.

A shore-based diet, i.e., fish, molluscs, crustaceans, frogs, bird’s eggs and aquatic plants, provides the richest known dietary sources of brain selective nutrients. Regular access to these foods by the early hominin lineage that evolved into humans would therefore have helped free the nutritional constraint on primate brain development and function.
– Cunnane & Crawford, 2014

Addendum. Why tubers are OK as part of an Ancestral or Paleo diet.

The arguments above are intended to show that the evolution of the human brain probably did not depend on access to cooked starchy tubers, as Hardy et al have claimed. However, I do not believe there is anything wrong with eating root vegetables, and I would recommend them as part of a healthy modern Paleo diet, especially with some tiger nuts and water chestnuts thrown in. Our supermarket root vegetables are still real, unprocessed foods – much closer to wild tubers than virtually any other modern carbohydrate source. Selective breeding has produced roots that are far more appetising and less fibrous than their wild counterparts, so we may see them as somewhat better than a fallback food. That said, I would have to be in dire straits before trying to survive on carrots alone, and although I like my meat and two veg, I suspect my preference chart would not look too dissimilar to the Hadza’s!


Villagers gather to share their crop of sweet potatoes, which have been grown for hundreds of years in the highlands of Papua New Guinea.

Tubers such as sweet potatoes and taro are staple foods of some very healthy primitive agriculturalist tribes such as the Kitavans, the Tukisenta and the highlanders of Papua New Guinea. These people eat high-carbohydrate diets based around starchy tubers, yet suffer none of the western diseases of civilisation when they stick to their traditional diets. However, these examples do nothing to rescue Hardy’s hypothesis, as the agricultural practices needed to grow such crops have been around for at most 10,000 years; without cultivation, such tubers could only have provided a minor part of the diet. That said, they do point to the ability of humans to thrive on a high-carbohydrate diet where the carbs come primarily from unrefined starchy vegetables. The key difference between such ancestral diets and modern western diets may well be the dominance of foods based on refined grains, which have far higher carbohydrate densities, as illustrated in the charts below.
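To make the carbohydrate-density comparison concrete, here is a small Python sketch using approximate typical composition values. These are my own illustrative figures, not numbers taken from the charts referred to above.

```python
# Illustrative comparison (approximate typical values, my own figures):
# grams of carbohydrate per 100 g of refined grain foods vs starchy tubers.

carb_density = {
    "white wheat flour":   73,
    "white bread":         49,
    "cooked white rice":   28,
    "boiled taro":         26,
    "boiled sweet potato": 20,
    "boiled potato":       17,
}

for food, grams in sorted(carb_density.items(), key=lambda kv: -kv[1]):
    print(f"{food:<20} ~{grams} g carbohydrate per 100 g")
```

Even the starchiest tubers come in at roughly a quarter of the carbohydrate density of refined flour, which is the contrast the paragraph above is drawing.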

The bottom line is, ancestral and paleo foods have always included tubers, root vegetables and bulbs. However, it seems likely that other dietary factors were responsible for the evolution of the unique human brain, and we would do well to pay attention to those foods first: fish, shellfish, eggs… and frogs!