Did cooked tubers drive human evolution?

Following last week’s post on the misrepresentation of the Paleo diet in the press, it’s time to examine in more detail the claims of the scientists that sent the newspapers into such a feeding frenzy.

Brain_size_evolution

The recent paper “The Importance of Dietary Carbohydrate in Human Evolution” by Hardy et al. argues that the rapid expansion of the human brain from around 800,000 years ago was fueled by the consumption of cooked starchy tubers.

The paper’s hypothesis is based around four key arguments:

  1. Brains need glucose, and starchy foods are an abundant source of glucose
  2. Starchy tubers are a reliable food resource available year round in Africa
  3. Cooking transforms starches in tubers making them far more digestible, and
  4. Humans are adapted to eating starches. Unlike other primates, they carry multiple copies of the salivary amylase genes.

I’ll look at each of these four points in turn.

1. Do brains ‘need glucose’?

That ‘brains need glucose’ is, of course, physiologically true. The brain is indeed dependent on a steady supply of glucose. The problem with this oft-quoted factoid is that it is used to support a seemingly logical but false conclusion: ‘…so we need to eat carbohydrates’. As Hardy’s paper acknowledges, the human body is perfectly capable of manufacturing all of the glucose it needs from fats and proteins by the process of gluconeogenesis (literally, making new glucose). This process is ramped up whenever carbohydrate intake is limited. How else would it be possible for humans to survive famine for weeks on end if their brains depended on a constant supply of dietary carbohydrates as a glucose source? To get around this inconvenient truth, Hardy et al. argue that gluconeogenesis is inefficient, so humans would have preferred concentrated sources of glucose from starchy tubers. This may be a valid argument, but to my mind it suffers from several weaknesses, especially as an explanation of brain evolution, in part because a very low-carb (‘ketogenic’) diet has been shown time and again to protect the brain.

A ketogenic diet is one in which carbohydrate intake is severely restricted (e.g. to less than 30g of carbs per day). In such circumstances, which mimic starvation, the body not only manufactures its own glucose in precisely the appropriate amount, but also produces ketones by breaking down dietary or body fat. This process kicks in during starvation, fasting, or when an individual eats a very low-carb/high-fat ketogenic diet.

The brain is perfectly happy to run on a 50/50 mix of ketones and glucose during such times. Indeed, this reduced-glucose state appears to provide multifactorial neurological protection: a ketogenic diet is one of the primary treatments for epilepsy and is currently under trial as an adjunct to cancer therapy (especially for glioblastoma).

Here are just a few of many papers on the subject if you want to look into it in more depth:

With a low-carb, low-glucose ketogenic state proving so spectacularly protective in neurological problems and in recovery from brain injury, trauma and stroke, how likely is it that a high-carb diet was a key driver of brain evolution?

Another point against Hardy’s hypothesis is the unique nature of human birth – our large-brained babies have a layer of subcutaneous fat, unique among primates, that provides ketones for brain fuel before, during and after birth (Cunnane & Crawford, 2014).

2. What are African wild tubers really like?

One of Hardy’s central arguments is that cooking significantly increased the digestibility of starch-rich tubers, releasing more glucose for brain evolution. By way of example she states that cooking potatoes increases the digestibility of the starch ‘by up to twenty fold’. That’s impressive, but potatoes and other modern root vegetables have undergone artificial selection to create the easily digestible varieties we know today. Potatoes are indigenous to the Americas, and are toxic if eaten raw, so they are probably not a good model of the tubers available during hominid evolution in Africa.

A key question, then, is whether Hardy’s assertions hold true for the wild tuber species typical of African ecosystems – are potatoes and other modern root vegetables anything like the tubers that would have been available to humans during the period of brain evolution in Africa? Luckily, we have some new data with which to test her hypothesis.

In a paper published earlier this year in the American Journal of Physical Anthropology, Schnorr et al. assessed the digestibility of wild tubers exploited by the Hadza foragers of Tanzania. After observing the Hadza methods of cooking and eating wild tubers, they took samples to the lab, where they simulated Hadza cooking, mastication and digestion in the mouth, stomach and small intestine. Their findings are interesting, challenge several of Hardy’s assertions, and show just how different such tubers are from modern cultivated root vegetables.

Consider how the Hadza eat these wild tubers:

First they roast them in open fires, one end at a time, turning them occasionally, for a total of 5 to 15 minutes of cooking. Next they peel them, then bite off a piece and chew it for between half a minute and three minutes (just pause a moment to consider that!). Then they spit out a wad of fibres called a quid. Depending on the particular tuber, the edible fraction varies from 20% to 80%, with the remainder being inedible fibre or peel. It is clear from this that African wild tubers are not remotely like any modern root vegetable – least of all potatoes.

Hadza_ekwa_root_roasting

Hadza man roasting ekwa (Vigna frutescens) tubers – typical of those available during hominid evolution. Recent analysis by Schnorr et al. found that only a small fraction of the peeled tubers’ glucose content (26 ± 8%) was available for digestion. Image courtesy Gary Aitken, 2014.

For the purpose of comparison, I’ve cobbled together some approximate available glucose values for the Hadza tubers and common supermarket tubers (in both cases fructose content has been excluded, as both Hardy’s and Schnorr’s papers focus solely on glucose). The quantities for the Hadza tubers are per 100g of edible tuber, after peeling and discarding the fibrous quid, meaning that the glucose would be even more dilute than these figures suggest if they were for the whole tuber (see the sketch below). You can see that these Hadza tubers do not come close to the carbohydrate density of modern starchy vegetables like potatoes, being closer to carrots.

Table 1. Available glucose and fibre content of Hadza vs modern tubers (g/100g)

| Hadza tubers (peeled) | Glucose | Fibre* | Modern tubers (unpeeled) | Glucose | Fibre |
|---|---|---|---|---|---|
| Mak’alitako (E. entennulifa) | 2 | 10 | Carrot | 4 | 3 |
| Shumuko (V. pseudolablab) | 2 | 15 | Parsnip | 9 | 5 |
| Ekwa (V. frutescens) | 3 | 20 | Potato | 17 | 2 |
| Panjuko (I. transvaalensis) | 8 | 1 | Sweet potato | 18 | 3 |

* Approximate fibre content of edible portion of Hadza tubers

What is clear from these data is that wild Hadza tubers are significantly lower in glucose and higher in fibre than modern root vegetables. The highest level of glucose in the Hadza tubers is comparable with parsnips – which are hardly ‘starchy’. Potatoes – which definitely are starchy – are the least similar to wild African tubers.
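
To make the dilution arithmetic concrete, here is a minimal back-of-envelope sketch in Python (my own illustration, not a calculation from either paper). The glucose values come from Table 1 and the 26% digestibility for ekwa from the figure caption above; the other edible and digestible fractions are assumptions I have picked from within the ranges reported in the text.

```python
# Back-of-envelope: glucose actually available per 100 g of WHOLE tuber,
# once the inedible fraction (peel + quid) and in-vitro digestibility are
# taken into account. Each tuple is (g glucose per 100 g edible portion,
# assumed edible fraction, assumed digestible fraction of that glucose).
TUBERS = {
    "ekwa (V. frutescens)":        (3,  0.5, 0.26),  # 26% from Schnorr et al.
    "panjuko (I. transvaalensis)": (8,  0.8, 0.50),  # assumed mid-range values
    "potato (baked, modern)":      (17, 1.0, 1.00),  # ~100% digested, no quid
}

for name, (glucose, edible, digestible) in TUBERS.items():
    available = glucose * edible * digestible
    print(f"{name}: {available:.1f} g available glucose per 100 g whole tuber")
```

On these assumptions a whole ekwa tuber yields well under 1g of available glucose per 100g, against 17g for a baked potato – a gap of more than an order of magnitude.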

3. Does cooking significantly affect African tuber digestibility?

So what about Hardy’s hypothesis that cooking tubers increased the bio-availability of glucose, giving humans a boost in the brain development game? Schnorr et al. found that although the Hadza roasted their tubers, this did little to improve digestibility. The main benefit of roasting seemed to be to assist with peeling. Of the four tuber species eaten by the Hadza, two showed no change in glucose availability with cooking; in a third, the available glucose increased from 38% to 48%; and in the fourth it actually decreased, from 44% to 34%. This makes Hardy’s hypothesis look quite shaky.

Furthermore, the authors estimated that cooking and chewing liberated only one third to two thirds of the glucose content of the tubers, and classified them as “very resistant to digestion”. By contrast, a modern baked potato requires only seconds of chewing and will be almost 100% digested, spiking blood glucose dramatically within minutes of ingestion. Yet potatoes were used extensively in the newspaper articles reporting on Hardy’s hypothesis.

It is worth considering what happens to the remaining one third to two thirds of undigested starch in the Hadza tubers. It is not necessarily lost: on reaching the colon, bacteria get to work on it, converting it into easily absorbed short-chain fatty acids – fats, that is. However, Hardy’s hypothesis emphasises tuber eating as a plentiful supply of glucose, not fatty acids, to fuel hominid brain expansion. What the Hadza study shows is that their tubers are a low-glycaemic food, in no way comparable to modern starchy tubers like potatoes. There is actually no contemporary food with which to compare them; we just don’t eat anything so fibrous.

A further interesting finding in the Hadza tuber study is summarised in this graph:

Hadza_tuber_glucose

Schnorr et al. found that larger tubers – those with the highest levels of starch (horizontal axis) – were the most resistant to digestion (vertical axis). They concluded that the human digestive system is simply unable to cope with large quantities of these wild tubers at one sitting.

This means that absorption of glucose may be inhibited by the higher glucose content, such as found in starch, since these glucose polymers can resist digestion or overwhelm the enzyme activity in the small intestine. Therefore, a “glucose”-rich tuber does not necessarily mean more glucose, proportionally, is directly available to the consumer.

If larger roots provide a lower percentage of available glucose, Hardy’s hypothesis appears less tenable still, relying as it does on the widespread availability of starchy roots to fuel an increasingly metabolically expensive brain. In contradistinction, Schnorr’s work suggests humans have a physiological limit on the quantity of wild tubers they can digest.

4. What is the significance of the human salivary amylase adaptations?

At some point in human evolution, mutations took place resulting in multiple copies of the genes for salivary amylase, the enzyme that breaks down starch. It had been assumed that this took place during the Neolithic switch to agriculture, as an adaptation to the new staple – starch-rich cereal grains. However, recent evidence indicates this adaptation goes back further in human evolution, although how far back remains a matter of debate.

Hardy et al. suggest the multiple amylase gene copy number mutation arose in conjunction with human exploitation of cooked tubers:

We propose that after cooking became widespread, starch digestion became the rate-limiting step in starch utilization, and the coevolution of cooking and copy number variation (CNV) in the [amylase] gene(s) increased availability of preformed dietary glucose, permitting the acceleration in brain size increase observed from the Middle Pleistocene onward. – Hardy et al., 2015

This may or may not turn out to be correct. Even if it is, however, it does little to rescue Hardy’s hypothesis because, as we have seen, these tubers have to be roasted and then masticated for minutes to extract minimal amounts of glucose. Rather than being a driving force for brain expansion, the exploitation of tubers looks more like a diversification strategy – a valuable source of low-glycaemic calories that would have helped meet daily caloric needs when no better sources of nutrition were available, not the massive brain-fuelling dose of glucose we associate with modern starchy vegetables.

Tubers are a reliable fallback food

Hadza_food_preferences

Hadza food preferences, by sex. From Marlowe & Berbesque, ‘Tubers as fallback foods and their impact on Hadza hunter-gatherers’, American Journal of Physical Anthropology, 2009.

Hardy et al. argue that tubers would have been a highly prized food for our ancestors; however, this does not appear to be the case among modern hunter-gatherers. The Hadza, for example, rely on five main categories of food: tubers, berries, meat, baobab, and honey. In a 2009 study in the American Journal of Physical Anthropology, Marlowe & Berbesque found that the Hadza rated tubers as the least preferred of these foods – every other category was rated more highly.

As such, tubers are seen very much as a fallback food – something to eat when there is not enough of the good stuff, to top up calories or to avoid starving.

The use of tubers as a fallback food is also observed in primates, which will resort to collecting underground storage organs, on land or from water plants, when preferred foods are in short supply (‘Savanna chimpanzees use tools to harvest the underground storage organs of plants’, R. Adriana Hernandez-Aguilar et al., 2007).

Schnorr et al. also found that, on average, tubers made up only about 20% by weight of the food brought back to camp each day. The caloric contribution of tubers is not well established, but is likely to be low. Indeed, Schnorr and his team found that the most commonly eaten Hadza tubers were 80–90% water by weight, indicating that they make only a small contribution to daily calories – perhaps as little as 5 or 10%. They suggest these watery tubers “may actually be more important for their moisture rather than caloric contribution”.
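
As a rough illustration of why the caloric share is likely to be small, here is a simple sketch (my own arithmetic and assumptions, not figures published by Schnorr et al.): the 20% weight share and ~85% water content come from the study, while both caloric densities are purely illustrative.

```python
# Rough estimate of tubers' share of daily calories brought to camp.
# The 20% weight share and ~85% water content are from the text above;
# the caloric densities are illustrative assumptions, not published values.
tuber_weight_share  = 0.20   # fraction of food by weight that is tubers
tuber_kcal_per_100g = 30     # assumed: ~85% water, much of the rest fibre
other_kcal_per_100g = 200    # assumed average for meat, honey, berries, baobab

tuber_kcal = tuber_weight_share * tuber_kcal_per_100g
other_kcal = (1 - tuber_weight_share) * other_kcal_per_100g
share = tuber_kcal / (tuber_kcal + other_kcal)
print(f"Tubers supply roughly {share:.0%} of calories")  # ~4% on these numbers
```

Even doubling the assumed caloric density of the tubers only lifts their share to around 7% – squarely in the ‘5 or 10%’ range suggested above.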

So where does this leave the hypothesis of Hardy and her team? Taking into account all of the above factors, and realising that wild African tubers are nothing like the familiar spud, the idea that cooked starchy tubers drove human brain evolution starts to look far less plausible. Certainly, tubers have always been widely available, but they were apparently not preferred – at least if the Hadza are anything to go by.

Their primary role as a food of last resort makes sense once their low-glucose, high-fibre nature is understood and the image of the modern potato is banished from one’s mind. Rather than providing evidence for a key driving force in human evolution, the effects of cooking on starches and the salivary amylase adaptations – assuming they did indeed arise early in human brain expansion – begin to look like adaptations for survival in extremis, when higher-quality foods were not available.

Instead of describing the driving force behind human brain expansion, Hardy et al.’s hypothesis may well come to be seen as addressing little more than a rather peripheral aspect of dietary adaptation.

An alternative hypothesis on the role of wild tubers in human brain evolution

As you can see from the graph at the top of this post, human brain expansion began two million years earlier than Hardy’s date of 800,000 years ago. Isotopic evidence from that period shows that early hominins (Australopithecus) had ‘C4 plants’ in their diet, which primarily indicates grasses and sedges (Lee-Thorp et al., 2012).
Water_chestnut_plant

This is distinct from other primates of the time, which were eating mainly C3 plants – indicative of leaves, nuts and fruits from forests. During this period a drier climate was creating more grassland, and early man seems to have been taking advantage of this new savanna environment. However, there are two ways the C4 isotope signature could appear in australopiths – either they were eating grasses directly, or they were eating herbivores that ate those grasses. Quite possibly both took place.

What is not likely, however, is that early humans ate the foliage or seeds of grasses – these being virtually indigestible – but rather the underground storage organs (bulbs and tubers) of various species of sedge (Laden & Wrangham, 2005). Sedges grow in and around water, and often have tubers which can easily be pulled from the soft mud at the bottom of ponds or riversides (see image, right).

Although small, they can be collected year round and, importantly, can be eaten raw. This makes tuber eating a possible factor in early human brain expansion, prior to the domestication of fire.

You may already have eaten such tubers yourself. The most common culinary varieties are water chestnuts (Eleocharis dulcis) – popular in oriental cooking – and, less well known, tiger nuts (Cyperus esculentus).

Water chestnuts and tiger nuts are quite high in digestible carbohydrates and fibre, and along with similar species present a far more realistic food source for early humans. Indeed, it has been estimated that just 150–200g of tiger nuts contains sufficient essential fats to satisfy human needs (Nathaniel J. Dominy, ‘Hominins living on the sedge’, 2012). The fact that other primates use them only as fallback foods suggests that on their own they are unlikely to account for the unique evolutionary force that drove human brain expansion over the subsequent 2.5 million years, but they may have contributed to getting the process started.

Namibian_frogs

Namibian boy gathering frogs for cooking.

A much more exciting hypothesis is that early humans specialised in water resources of all kinds. This would have provided fish and shellfish, along with water tubers from sedges such as Cyperus species.

Broader water resource utilisation may also explain why humans became bipedal – to wade (Kuliukas, 2002). Whilst sedge tubers may have provided glucose to fuel the early brain, we have seen that this is not a prerequisite for brain expansion, as gluconeogenesis and ketogenesis can provide all the fuel a human brain needs. More important for brain development is access to sources of brain-building fatty acids and minerals (‘brain-selective nutrients’) such as iodine, iron, zinc, selenium and omega-3 fatty acids (especially DHA) – dietary components that, in all other land mammals, place significant limits on the relative size of the brain.

A shore-based diet, i.e., fish, molluscs, crustaceans, frogs, bird’s eggs and aquatic plants, provides the richest known dietary sources of brain selective nutrients. Regular access to these foods by the early hominin lineage that evolved into humans would therefore have helped free the nutritional constraint on primate brain development and function.
– Cunnane & Crawford, 2014

Addendum. Why tubers are OK as part of an Ancestral or Paleo diet.

The arguments above are intended to illustrate that the evolution of the human brain probably did not depend on access to cooked starchy tubers, as Hardy et al. have claimed. However, I do not believe there is anything wrong with eating root vegetables, and I would recommend them as part of a healthy modern Paleo diet, especially with some tiger nuts and water chestnuts thrown in. Our supermarket root vegetables are still real, unprocessed foods – much closer to wild tubers than virtually any other modern carbohydrate source. Selective breeding has produced roots that are far more appetising and less fibrous than their wild counterparts, so we may see them as somewhat better than a fallback food. That said, I would have to be in dire straits before trying to survive on carrots alone, and although I like my meat and two veg, I suspect my preference chart would not look too dissimilar to the Hadza’s!


Villagers gather to share their crop of sweet potatoes, which have been grown for hundreds of years in the highlands of Papua New Guinea.

Tubers such as sweet potatoes and taro are staple foods of some very healthy primitive agriculturalist tribes such as the Kitavans, the Tukisenta and the highlanders of Papua New Guinea. These people eat high-carbohydrate diets based around starchy tubers, yet suffer none of the Western diseases of civilisation when they stick to their traditional diets. However, these examples do nothing to rescue Hardy’s hypothesis, as the agricultural practices necessary to grow such crops have been around for at most 10,000 years; without cultivation, such tubers could only ever have provided a minor part of the diet. That said, they do point to the ability of humans to thrive on a high-carbohydrate diet where the carbs come primarily from unrefined starchy vegetables. The key difference between such ancestral diets and modern Western diets may well be the dominance of foods based on refined grains, which have far higher carbohydrate densities, as illustrated in the charts below.

The bottom line is that ancestral and paleo diets have always included tubers, root vegetables and bulbs. However, it seems likely that other dietary factors were responsible for the evolution of the unique human brain, and we would do well to pay attention to those foods first: fish, shellfish, eggs… and frogs!

4 thoughts on “Did cooked tubers drive human evolution?”

  1. I don’t know why you spend so much time investigating tubers when honey was just as important a source of carbohydrates.

    The Hadza are well known for eating significant quantities of honey for 3/4 of the year—even consuming as much as 80% of their calories from honey during the rainy season.

    As evidence of honey’s importance in hominid evolution, the Greater Honeyguide (Indicator indicator) is a wild bird that evolved to “talk” to humans and guide them to significant caches of honey. Anthropologist Richard Wrangham called this remarkable interaction “the most developed, co-evolved, mutually-helpful relationship between any mammal and any bird.” The behaviour is so ingrained in the honeyguide that it persists to this day, despite the fact that the Masai do not even reward the bird.

    For this behavior to have evolved, hominids and honeyguides are believed to have coevolved over the past 3 to 5 million years.

    http://www.sciencedirect.com/science/article/pii/S1090513814000877

    Even chimpanzees make rudimentary tools to harvest honey.
    http://www.sciencedirect.com/science/article/pii/S0047248409000566

    • Thanks for the comment J.
      Yes, I am aware of the importance of honey among hunter-gatherer cultures and the Honeyguide which is just amazing. Thanks for the links.

      I’m not sure why you started by questioning why we ‘spend so much time investigating tubers’. It’s because the paper by Hardy et al. posited tubers as the driving force behind hominid brain expansion, a claim I think is flawed. I agree with you that, if we were looking for a source of carbs to explain increased encephalisation, honey would fit the bill just fine. However, as virtually all other primates have access to carbs from fruit, nuts, etc., it seems unlikely that either tubers or honey will provide the key to the hominid brain puzzle.

  2. Pingback: Cliodynamica » When Did Human Beings Start Using Fire? Wrangham versus Cordain

  3. Pingback: Peter Turchin » When Did Human Beings Start Using Fire? Wrangham versus Cordain
