Did cooked tubers drive human evolution?

Following last week's post on the misrepresentation of the Paleo diet in the press, it's time to examine in more detail the claims of the scientists who sent the newspapers into such a feeding frenzy.

The recent paper entitled "The Importance of Dietary Carbohydrate in Human Evolution" by Hardy et al argues that the rapid expansion of the human brain around 800,000 years ago was fueled by the consumption of cooked starchy tubers.

The paper’s hypothesis is based around four key arguments:

  1. Brains need glucose, and starchy foods are an abundant source of glucose
  2. Starchy tubers are a reliable food resource available year round in Africa
  3. Cooking transforms starches in tubers making them far more digestible, and
  4. Humans are adapted to eating starches. Unlike other primates, they carry multiple copies of the salivary amylase genes.

I’ll look at each of these four points in turn.

1. Do brains ‘need glucose’?

That 'brains need glucose' is, of course, physiologically true: the brain is indeed dependent on a steady supply of glucose. The problem with this oft-quoted factoid is that it is used to support a seemingly logical but false conclusion: '…so we need to eat carbohydrates'. As Hardy's paper acknowledges, the human body is perfectly capable of manufacturing all of the glucose it needs from fats and proteins by the process of gluconeogenesis (literally, making new glucose). This process is ramped up whenever carbohydrate intake is limited. How else would it be possible for humans to survive famine for weeks on end if their brains were dependent on a constant supply of external carbohydrates as a glucose source? To get round this inconvenient truth, Hardy et al argue that gluconeogenesis is inefficient, so humans would have preferred concentrated sources of glucose from starchy tubers. This may be a valid argument, but to my mind it suffers from several weaknesses, especially as an explanation of brain evolution, in part because a very low-carb ('ketogenic') diet has been shown time and again to protect the brain.

A ketogenic diet is one in which carbohydrate intake is severely restricted (e.g. to less than 30g of carbs per day). In such circumstances, which mimic starvation, the body not only manufactures its own glucose in precisely the appropriate amount, but also produces ketones by breaking down dietary or body fat. This process kicks in during starvation, fasting, or when an individual eats a very low-carb/high-fat ketogenic diet.

The brain is perfectly happy to run on a 50/50 mix of ketones and glucose during such times. Indeed, this reduced-glucose state appears to provide multifactorial neurological protection: a ketogenic diet is one of the primary treatments for epilepsy and is currently under trial as an adjunct to cancer therapy (especially for glioblastoma).

Here are just a few of many papers on the subject if you want to look into it in more depth:

With a low-carb, low-glucose, ketogenic state proving itself so spectacularly protective in neurological problems and for recovery from brain injury, trauma and stroke, is it likely that a high-carb diet was a key driver of brain evolution?

Another point against Hardy's hypothesis is the unique nature of human infant birth – our large-brained babies have a layer of subcutaneous fat, unique among primates, that provides ketones for brain fuel before, during and after birth. (Cunnane & Crawford, 2014)

2. What are African wild tubers really like?

One of Hardy’s central arguments is that cooking significantly increased the digestibility of starch-rich tubers, releasing more glucose for brain evolution. By way of example she states that cooking potatoes increases the digestibility of the starch ‘by up to twenty fold’. That’s impressive, but potatoes and other modern root vegetables have undergone artificial selection to create the easily digestible varieties we know today. Potatoes are indigenous to the Americas, and are toxic if eaten raw, so are probably not a good model of the tubers available during hominid evolution in Africa.

Are potatoes and other modern root vegetables anything like the tubers that would have been available to humans during the period of brain evolution in Africa?

A key question, then, is whether Hardy's assertions hold true for wild tuber species typical of the African ecosystem. Luckily we have some new data with which to test her hypothesis.

In a paper published earlier this year in the American Journal of Physical Anthropology, Schnorr et al assessed the digestibility of wild tubers exploited by the Hadza foragers of Tanzania. After observing the Hadza methods of cooking and eating wild tubers, they took samples to the lab, where they simulated Hadza cooking, mastication and digestion in the mouth, stomach and small intestine. Their findings are interesting, challenge several of Hardy's assertions, and show just how different such tubers are from modern cultivated root vegetables.

Consider how the Hadza eat these wild tubers:

First they roast them in open fires, one end at a time, turning them occasionally, for a total of 5 to 15 minutes' cooking. Next they peel them, then bite off a piece and chew it for anywhere from thirty seconds to three minutes (just pause a moment to consider that!). Then they spit out a wad of fibres called a quid. Depending on the particular tuber, the edible fraction varies from 20% to 80%, with the remainder being inedible fibre or peel. It is clear from this that African wild tubers are not remotely like modern root vegetables – least of all potatoes.


Hadza man roasting ekwa (Vigna frutescens) tubers – typical of those available during hominid evolution. Recent analysis by Schnorr et al found that only a small fraction of the peeled tubers' glucose content (26 ± 8%) was available to digestion. Image courtesy Gary Aitken 2014

For the purpose of comparison, I’ve cobbled together some approximate available glucose values for the Hadza tubers and common supermarket tubers (in both cases fructose content has been excluded as both Hardy’s and Schnorr’s papers focus solely on glucose). The quantities for the Hadza tubers are based on 100g of edible tuber after peeling and discarding the fibrous quid, meaning that the glucose would actually be even more dilute than these figures suggest if they were for the whole tuber. You can see that these Hadza tubers do not come close to the carbohydrate density of modern starchy vegetables like potatoes, being closer to carrots.

Table 1. Available glucose and fibre content of Hadza vs modern tubers (g/100g)

Hadza tuber           Glucose   Fibre*     Modern tuber    Glucose   Fibre
E. entennulifa            2       10       Carrot              4       3
V. pseudolablab           2       15       Parsnip             9       5
V. frutescens             3       20       Potato             17       2
I. transvaalensis         8        1       Sweet potato       18       3

* Approximate fibre content of edible portion of Hadza tubers

What is clear from these data is that wild Hadza tubers are significantly lower in glucose and higher in fibre than modern root vegetables. The highest level of glucose in the Hadza tubers is comparable with parsnips – which are hardly ‘starchy’. Potatoes – which definitely are starchy – are the least similar to wild African tubers.
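To make the comparison concrete, here is a back-of-envelope sketch combining the figures quoted in this post: glucose per 100g of edible portion, the edible fraction left after peeling and spitting out the quid, and the fraction of that glucose actually available to digestion. The specific fractions used are illustrative values taken from the text (50% edible is an assumed midpoint of the 20–80% range), not figures from Schnorr's raw data.

```python
# Rough sketch: effective glucose yield per 100 g of whole tuber,
# combining the edible fraction and the digestible fraction.

def effective_glucose(glucose_per_100g_edible, edible_fraction, digestible_fraction):
    """Grams of glucose actually available per 100 g of whole, unpeeled tuber."""
    return glucose_per_100g_edible * edible_fraction * digestible_fraction

# Ekwa (V. frutescens): ~3 g glucose per 100 g edible portion,
# assumed ~50% edible fraction, ~26% digestibility (figures from the text).
ekwa = effective_glucose(3, 0.50, 0.26)

# Baked potato: ~17 g glucose per 100 g, essentially all edible, ~100% digested.
potato = effective_glucose(17, 1.0, 1.0)

print(f"ekwa:   {ekwa:.2f} g available glucose per 100 g whole tuber")
print(f"potato: {potato:.2f} g available glucose per 100 g whole tuber")
```

On these (admittedly rough) assumptions a baked potato delivers on the order of forty times more usable glucose, weight for weight, than the wild tuber – which is the gulf the prose above is describing.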

3. Does cooking significantly affect African tuber digestibility?

So what about Hardy's hypothesis that cooking tubers increased the bio-availability of glucose, giving humans a boost in the brain development game? Schnorr et al found that although the Hadza roasted their tubers, this did little to improve digestibility. The main benefit of roasting seemed to be to assist with peeling. Of the four tuber species eaten by the Hadza, two showed no change in glucose availability; in a third, the available glucose increased from 38% to 48%; and in the fourth it actually decreased, from 44% to 34%. This makes Hardy's hypothesis look quite shaky.

Furthermore, the authors estimated that cooking and chewing liberated only 1/3 to 2/3 of the glucose content of the tubers, and classified them as "very resistant to digestion". By contrast, a modern baked potato requires only seconds of chewing and is almost 100% digested, spiking blood glucose dramatically within minutes of ingestion. Yet potatoes featured extensively in the newspaper articles reporting on Hardy's hypothesis.

It is worth considering what happens to the remaining, undigested 1/3 to 2/3 of the starches in the Hadza tubers. These are not necessarily lost: on reaching the colon, bacteria get to work on them, converting them into easily absorbed short-chain fatty acids – fats, that is. However, Hardy's hypothesis emphasises tuber eating as a plentiful supply of glucose, not fatty acids, to fuel hominid brain expansion. What the Hadza study shows is that their tubers are a low-glycaemic food, in no way comparable to modern starchy tubers like potatoes. There is actually no contemporary food with which to compare them; we simply don't eat anything so fibrous.
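The energy arithmetic here can be sketched as follows. Note the conversion factors are common textbook approximations, not figures from either paper: starch digested to glucose yields about 4 kcal/g, while starch fermented to short-chain fatty acids in the colon yields very roughly half that; the starch quantity is arbitrary.

```python
# Back-of-envelope: where the tuber's starch energy ends up, depending on
# how much is liberated in the small intestine versus fermented in the colon.

STARCH_KCAL_PER_G = 4.0   # glucose route (standard approximation)
SCFA_KCAL_PER_G = 2.0     # colonic fermentation route (rough approximation)

def energy_split(starch_g, digested_fraction):
    """Return (kcal delivered as glucose, kcal delivered as short-chain fatty acids)."""
    glucose_kcal = starch_g * digested_fraction * STARCH_KCAL_PER_G
    scfa_kcal = starch_g * (1 - digested_fraction) * SCFA_KCAL_PER_G
    return glucose_kcal, scfa_kcal

# 10 g of starch, with half liberated by cooking and chewing
# (the midpoint of the "1/3 to 2/3" range quoted above).
glucose_kcal, scfa_kcal = energy_split(10, 0.5)
print(f"as glucose: {glucose_kcal:.0f} kcal, as short-chain fatty acids: {scfa_kcal:.0f} kcal")
```

The point being that a substantial share of the tuber's calories arrives slowly, as fat, rather than as the rapid glucose supply Hardy's hypothesis calls for.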

A further interesting finding in the Hadza tuber study is summarised in this graph:


Schnorr et al found that larger tubers – those with the highest levels of starch (horizontal axis) – were the most resistant to digestion (vertical axis). They concluded that the human digestive system is simply unable to cope with large quantities of these wild tubers at one sitting.

This means that absorption of glucose may be inhibited by the higher glucose content, such as found in starch, since these glucose polymers can resist digestion or overwhelm the enzyme activity in the small intestine. Therefore, a “glucose”-rich tuber does not necessarily mean more glucose, proportionally, is directly available to the consumer.

If larger roots provide a lower percentage of available glucose Hardy’s hypothesis appears less tenable still, relying as it does on the widespread availability of starchy roots to fuel an increasingly metabolically-expensive brain. In contradistinction, Schnorr’s work suggests humans have a physiological limit on the quantity of wild tubers they can digest.

4. What is the significance of human salivary amylase adaptations?

At some point in human evolution, mutations took place resulting in multiple copies of the genes for salivary amylase, the enzyme that breaks down starch. It had been assumed that this took place during the Neolithic switch to agriculture, as an adaptation to the new staple: starch-rich cereal grains. However, recent evidence indicates this adaptation goes back further in human evolution, although how far back remains a matter of debate.

Hardy et al suggest the multiple amylase gene copy number mutation arose in conjunction with human exploitation of cooked tubers.

We propose that after cooking became widespread, starch digestion became the rate-limiting step in starch utilization, and the coevolution of cooking and copy number variation (CNV) in the [amylase] gene(s) increased availability of preformed dietary glucose, permitting the acceleration in brain size increase observed from the Middle Pleistocene onward. – Hardy et al, 2015

This may or may not turn out to be correct. Even if it does, however, it does little to rescue Hardy's hypothesis, because, as we have seen, these tubers have to be roasted and then masticated for minutes to extract minimal amounts of glucose. Rather than being a driving force for brain expansion, exploitation of tubers appears more like a diversification strategy – a valuable source of low-glycaemic calories that would have helped meet daily caloric needs when no better sources of nutrition were available, not the massive brain-fueling dose of glucose we associate with modern starchy vegetables.

Tubers are a reliable fallback food


Hadza food preferences, by sex. From Marlowe & Berbesque Tubers as fallback foods and their impact on Hadza hunter-gatherers. American Journal of Physical Anthropology, 2009

Hardy et al argue that tubers would have been a highly prized food for our ancestors; however, this does not appear to be the case among modern hunter-gatherers. For example, the Hadza rely on five main categories of food: tubers, berries, meat, baobab, and honey. In a 2009 study in the American Journal of Physical Anthropology, Marlowe & Berbesque found that the Hadza rated tubers as the least preferred of all these foods.

As such, tubers are seen very much as a fallback food – something to eat to top up calories, or to avoid starving, when there is not enough of the good stuff.

The use of tubers as a fallback food is also observed in primates, which will resort to collecting underground storage organs on land or from water plants when preferred foods are in short supply. (Savanna chimpanzees use tools to harvest the underground storage organs of plants, R. Adriana Hernandez-Aguilar et al, 2007)

Schnorr et al also found that, on average, tubers made up only about 20% by weight of the food brought back to camp each day. The caloric contribution of tubers is not well established, but is likely to be low. Indeed, Schnorr and his team found that the most commonly eaten Hadza tubers were 80–90% water by weight, indicating that they make only a small contribution to daily calories, perhaps as little as 5 or 10%. They suggest these watery tubers "may actually be more important for their moisture rather than caloric contribution".

So where does this leave the hypothesis of Hardy and her team? Taking into account all of the above factors, and realising that wild African tubers are nothing like the familiar spud, the idea that cooked starchy tubers drove human brain evolution starts to look far less plausible. Certainly, tubers have always been widely available, but apparently not preferred, at least if the Hadza are anything to go by.

Their primary role as a food of last resort makes sense once their low-glucose, high-fibre nature is understood and the image of the modern potato is banished from one's consideration. The effects of cooking on starches, and the salivary amylase adaptations – assuming they did indeed arise early in human brain expansion – begin to look less like evidence for a key driving force in human evolution and more like adaptations for survival in extremis, when higher quality foods were not available.

Rather than describing the driving force behind human brain expansion, Hardy et al's hypothesis might well come to be seen as describing little more than a rather peripheral aspect of dietary adaptation.

An alternative hypothesis on the role of wild tubers on human brain evolution

As you can see from the graph at the top of this post, human brain expansion began two million years earlier than Hardy's date of 800,000 years ago. Isotopic evidence from that period shows that early humans (Australopithecus) had 'C4 plants' in their diet, which primarily indicates grasses and sedges (Lee-Thorp J et al, 2012).
This is distinct from other primates of the time, which were eating mainly C3 plants – indicative of leaves, nuts and fruits from forests. During this period a drier climate was creating more grassland, and early man seems to have been taking advantage of this new savanna environment. However, there are two ways the C4 isotope signature could appear in Australopiths – either they were eating grasses directly, or they were eating herbivores that ate those grasses. Quite possibly both took place.

What is not likely, however, is that early humans ate the foliage or seeds of grasses – these being virtually indigestible. Rather, they were probably eating the underground storage organs (bulbs and tubers) of various species of sedge (Laden G, Wrangham R, 2005). Sedges grow in and around water, and often have tubers which can easily be pulled from the soft mud at the bottom of ponds or riversides (see image, right).

Although small, they can be collected year round and, importantly, can be eaten raw. This makes tuber eating a possible factor in early human brain expansion, prior to the domestication of fire.

You may already have eaten such tubers yourself. The most common culinary varieties are water chestnuts (Eleocharis dulcis) – popular in oriental cooking – and, less well known, tiger nuts (Cyperus esculentus).

Water chestnuts and tiger nuts are quite high in digestible carbohydrates and fibre, and along with similar species certainly present a more realistic food source for early humans. Indeed, it has been estimated that just 150–200g of tiger nuts contains sufficient essential fats to satisfy human needs (Nathaniel J. Dominy, 'Hominins living on the sedge', 2012). The fact that other primates use them as fallback foods suggests it is unlikely that on their own they could account for the unique evolutionary force that drove human brain expansion over the subsequent 2.5 million years, but they may have contributed to getting the process started.


Namibian boy gathering frogs for cooking.

A much more exciting hypothesis is that early humans specialised in water resources of all kinds. These would have provided fish and shellfish, along with water tubers from sedges (Cyperus and related species).

Broader water resource utilisation may also explain why humans became bipedal – to wade (Kuliukas, 2002). Whilst sedge tubers may have provided glucose to fuel the early brain, we have seen that this is not necessarily a prerequisite for brain expansion, as gluconeogenesis and ketogenesis can provide all the fuel a human brain needs. More important for brain development is access to sources of brain-building fatty acids and minerals ('brain selective nutrients') such as iodine, iron, zinc, selenium and omega-3 fatty acids (especially DHA) – dietary components that in all other land mammals place significant limits on the relative size of their brains.

A shore-based diet, i.e., fish, molluscs, crustaceans, frogs, bird’s eggs and aquatic plants, provides the richest known dietary sources of brain selective nutrients. Regular access to these foods by the early hominin lineage that evolved into humans would therefore have helped free the nutritional constraint on primate brain development and function.
– Cunnane & Crawford, 2014

Addendum. Why tubers are OK as part of an Ancestral or Paleo diet.

The arguments above are intended to illustrate that the evolution of the human brain probably did not depend on access to cooked starchy tubers, as Hardy et al have claimed. However, I do not believe there is anything wrong with eating root vegetables, and I would recommend them as part of a healthy modern Paleo diet, especially with some tiger nuts and water chestnuts thrown in. Our supermarket root vegetables are still real, unprocessed foods – much closer to wild tubers than virtually any other modern carbohydrate source. Selective breeding has produced roots that are far more appetising and less fibrous than their wild counterparts, so we may see them as somewhat better than a fallback food. That said, I would have to be in dire straits before trying to survive on carrots alone, and although I like my meat and two veg, I guess my preference chart would not look too dissimilar to the Hadzas'!


Villagers gather to share their crop of sweet potatoes which have been grown for hundreds of years in the highlands of Papua New Guinea

Tubers such as sweet potatoes and taro are staple foods of some very healthy primitive agriculturalist tribes such as the Kitavans, Tukisenta and the highlanders of Papua New Guinea. These people eat high carbohydrate diets, based around starchy tubers but suffer none of the western diseases of civilisation when they stick to their traditional diets. However, these examples do nothing to rescue Hardy’s hypothesis as the agricultural practices necessary to grow such crops have only been around for at most 10,000 years. Without cultivation, such tubers could only provide a minor part of the diet. That said, they do point to the ability of humans to thrive on a high carbohydrate diet where these carbs are primarily from unrefined starchy vegetables. The key difference between such ancestral diets and modern western diets may well be the dominance of foods based on refined grains which have far higher carbohydrate densities, as illustrated in the charts below.

The bottom line is, ancestral and paleo foods have always included tubers, root vegetables and bulbs. However, it seems likely that other dietary factors were responsible for the evolution of the unique human brain, and we would do well to pay attention to those foods first: fish, shellfish, eggs… and frogs!

August News Round-Up

SHOCK REVELATION: NHS hands out Gluten-free Junk Food on prescription

The biggest laugh of the month has to be the widely circulated story that the NHS is providing gluten-free junk food (cakes, donuts, pizza) on prescription. According to the Telegraph (August 17th), "One GP called the measures "irresponsible", claiming some patients were providing a "shopping list" to feed their whole families."

Inevitably, backlash from irate coeliac sufferers followed almost instantly. “Patient groups defend NHS spending on gluten-free food for sufferers”, The Independent declared (August 17th).

As I’ve always argued, so called gluten-free products are part of the problem, not part of the solution. Although it is essential that coeliacs have access to gluten-free food they would be far better off adopting a truly grain-free diet, based around real foods such as meat, fish, nuts, vegetables and fruit, rather than the highly processed gluten-free simulacra. The NHS deserves ridicule simply for its failure to promote real food.

Call to switch focus from calories to nutrition to cut CVD

On the theme of real food, the Nursing Times (August 27th) reports on a paper by doctors Aseem Malhotra & Simon Capewell, who say evidence shows that poor diet is consistently responsible for more disease and death than physical inactivity, smoking and alcohol put together. I like that!

In an editorial piece in BMJ’s Open Heart they argue that a move away from sugary drinks and towards regular consumption of fish, nuts and olive oil produces cardiovascular benefit in months. In fact they receive the accolade of…

Quote of the month

“Shifting the focus away from calories and emphasising a dietary pattern that focuses on food quality rather than quantity will help to rapidly reduce obesity, related diseases, and cardiovascular risk,”
Malhotra & Capewell, BMJ Open Heart


Giving the real-food movement a shot of celebrity TV drama, Jamie Oliver is on the warpath as he attempts to argue the case for a sugar tax. According to The Independent (August 27th), in his upcoming TV series Jamie will meet surgeons removing children's sugar-rotted teeth and performing amputations on diabetes sufferers. The surgeons warn that the NHS will "crumble" under the accumulating cost of treating sugar-related outcomes.

Sugar as a public health concern is a cause whose time has come. It's easy to understand, sufficient scientists have rallied against it, the public are largely on board, and it's got the ear of politicians. Whilst, of course, I applaud the general direction of this debate, I am always wary when an issue becomes a mass media movement. The media want a simple story, without the subtleties inherent in the science.

An example of this is a study reported in Diabetes in Control (August 20th) that found vegetable oils (corn and soya oil) caused greater obesity and more diabetic symptoms in mice than fructose (sugar), whilst highly saturated fats (coconut oil) caused the fewest symptoms. This didn't make the main news outlets as it's off-message. You can see the problem, can't you? Who's going to break the news to Jamie?

Saturated Fat

The Telegraph (August 11th), in a highly cited article, headlines:

Butter unlikely to harm health, but margarine could be deadly

Although traditionally dieticians have advised people to cut down on animal fats, the biggest ever study has shown that it does not increase the risk of stroke, heart disease or diabetes.

Yes, we know that. But it is good to see the message gradually getting out. The 'margarine could be deadly' bit refers to trans-fats, which were indeed in the original margarines of the 1960s and 70s. What people casually refer to as margarines now are 'vegetable spreads', which don't contain trans-fats but instead those vegetable oils we just heard may cause more obesity than sugar does. But hey ho – 'deadly margarine' makes a good headline!

A more scientific review of this study can be found on Medpage Today and includes some interesting comments from researchers.

Best butter recipes

These recipes also c/o the Telegraph make good use of butter, and are grain-free:

Fish oils and mental health

The Guardian (August 11th) reported on a recent small study which showed that Omega 3 fish oils could prevent schizophrenia among at-risk young people. There was a marked reduction in incidence after 7 years among the group that was supplemented for only 3 months. Larger trials are called for to confirm the results.

The authors speculate that the timing of the intervention may be critical ― during adolescence and before conversion to psychosis, when the neurodevelopment in brain regions relevant to schizophrenia occurs.
MedScape Today (August 20th)

Meanwhile, The Telegraph (August 25th) reports on a large study that found that fish oils failed to slow cognitive decline in the elderly.

These results make some sense considering – as explained in our fish talks – that omega-3 oils are critical during brain development. Schizophrenia tends to emerge during late teen years as the brain matures, so more omega-3s at this critical point makes sense. For the elderly, the supplementation started 70 years too late!

Iodine in pregnancy

Linked to fish is the issue of iodine. Based on earlier studies that identified the UK population as mildly to moderately iodine deficient, a new study has modelled the benefits of iodine supplementation during pregnancy, suggesting it could be cost-effective simply through the national increase in child IQ. BBC News (August 10th) covered it well, whilst NHS Choices gave an in-depth analysis and, to their credit, recommended good dietary sources of iodine (fish, milk, seaweed) rather than pill-popping. So maybe I should be lenient on the NHS after all, as they certainly have this advice correct.

Vitamin D and Multiple Sclerosis

The relationship between low vitamin D status and incidence of multiple sclerosis (MS) is long standing and has been seen time and again in population studies. Observational associations do not, however, prove causality: it is quite possible that a third factor could be driving both low vitamin D and MS for instance.

A new study, reported in The Science Times (August 28th), however, takes us one step closer to an answer. Researchers looked at genes that limit vitamin D synthesis in people and found they were more common among MS patients. As these genes are inherited randomly and are not influenced by environmental factors, the results suggest that low vitamin D is a causative risk factor. Medpage Today (August 31st) goes into the methodology in a bit more detail.

Paleo News

Click to view The Guardian article

This beautiful image comes from The Guardian (August 18th) showing a wonderful range of paleo diet foods. So where’s the story in this?

A paper published this month has argued that the evolution of the human brain depended on access to tubers (like the sweet potato above) as well as meat. Apparently this is news, with The Guardian running the headline: “What Paleo diet experts think – and why they’re wrong“.

The problem with the media is that they always want a bite-size story. There has never been any doubt that the original paleolithic diet included tubers, along with nuts, fruit, leaves, molluscs, shellfish, elephants*, insects… i.e. real foods that can be hunted or gathered. Indeed studies of modern hunter-gatherers have provided evidence for the paleo-diet. Here is some of what these studies say:

  1. The consumption of carbohydrates varies enormously with geographic location
  2. On average carbohydrate consumption is lower than standard western diets whilst protein consumption is higher
  3. Carbohydrate consumption typically comes from tubers, fruit and nuts – but virtually never from grains
  4. Hunter gatherers almost always prize hunted game above gathered tubers which are very much seen as a second rate, fallback food.

Another thing the newspapers have failed to point out clearly is that this paper includes no new data – it simply presents a hypothesis, making an argument for tubers as a source of glucose for brain fuel. It's a useful, if not entirely original, contribution to the discussion, but it didn't warrant the excitable headlines that most news outlets employed.

The paper that provoked these headlines (The Importance of Dietary Carbohydrate in Human Evolution) deserves a detailed analysis, but that will have to wait for another time.

Humans had a taste for elephants

In the previous section you may have noticed that I cheekily slipped elephants into the list of paleo foods. Even though we all associate cave men with eating woolly mammoths, we often forget that for much of human evolution the African elephant was on the menu.


The Mail Online (July 2nd) had a great article covering recent research on modern tribes who hunted elephants. The researchers studied the taste preferences and hunting behaviour of several ethnographic hunter-gatherer groups. They found:

  • Historically, tribes living in East Kenya, such as the Liangula, hunted elephants for meat and particularly preyed upon juveniles because their meat was said to taste better.
  • The Mbuti Pygmy people of Zaire particularly cherish the bone marrow of elephants.
  • The Nuer people of Southern Sudan hunt elephants illegally as they consider the meat to be a delicacy. They describe the flesh as tasting sweet and fat.
  • Historical texts by Western scientists described the taste of elephant meat as being ‘delicate’, ‘tender’ and ‘sweet’.

Take a look at the rest of the article which covers the archaeological evidence of similar consumption patterns going back 400,000 years!

I bet elephant burgers tasted good with roast yam chips. Wait… there’s a T-shirt for that…


P.S. Before I get complaints from people who don't recognise a joke… I'm not actually advocating eating elephants! Or yam chips!