How can bitter foods be good for us when they taste so bad? – Resolving the paradox

Laying out the problem

Our recent post on bitters left me with a lot of questions.

If bitter tastes indicate the presence of toxins and thereby help us avoid poisonous foods, why do they stimulate such positive physiological responses? Why would some of those responses protect us from metabolic diseases like diabetes and cancer? If bitter taste is merely a warning to avoid a particular food, then why do many traditions revere bitter foods? How do we explain why adults develop a taste for bitter foods that as children they found repulsive? Why does folklore say “Good medicine always tastes bitter”?

After a lot of pondering I think I’ve got an answer, but to make sense of it I need to lay out what I see as the relevant parts of the puzzle first.

Read time: 16 minutes (3100 words)


Paleo veggies (video and infographic)

MODERN WILD FOOD GATHERING. When you know what to look for, there are plenty of edible wild plants out there. How do they fit into a paleo diet?

There have recently been a number of articles making pronouncements on the original paleo diet, as eaten by our paleolithic ancestors. At the end of last year, December 2016, we had…

  • Ancient leftovers show the real Paleo diet was a veggie feast (New Scientist)
  • Secrets of the paleo diet: Discovery reveals plant-based menu of prehistoric man (EurekAlert!)

Then in March this year…

The more recent articles appeared following a paper by Laura Weyrich et al. published in Nature, March 2017, titled Neanderthal behaviour, diet, and disease inferred from ancient DNA in dental calculus. Afifah has written a post about the herbal medicines these Neanderthals were using, and we are going to publish a guest post addressing the vegetarian claim shortly.

The December articles on paleo veggies were prompted by an Israeli study (Melamed et al., PNAS) which identified a wide variety of plant food remains in a cave in the Levant (modern Israel). The 780,000-year-old remains are unusual, as plant materials are rarely preserved at such sites, so this paper provides some insight into the plant resources used by our ancient ancestor Homo erectus.

The remnants include no fewer than 55 different species, including roots, seeds, nuts, fruit and leaves. Many of these resources were seasonal and some required simple processing and cooking. Here is the New Scientist video that accompanied their article, which, I think you will agree, has a touch of the Blue Peter about it:

You would think from the headlines that evidence that our ancestors ate a wide variety of plant foods is new, or that it somehow overturns Paleo Diet thinking. The media portray the Paleo Diet as consisting largely of red meat, but that is wrong. Since its inception, proponents of the modern Paleo Diet such as Professor Loren Cordain have argued that we should be eating more like hunter-gatherers. That has always meant the gathering part (eating plant foods) as well as the hunting bit (eating animal products).

What is strange about the recent media pronouncements is that the research that stimulated them is perfectly in accordance with Paleo Diet principles. It seems as if the media are spinning these stories for the sake of headlines, which makes them, in the lingo of the day, fake news, does it not?

Paleo Veggies

A careful reflection on the details of the foods identified in the Melamed study suggests a number of subtle paleo principles we might all like to take on board:

1. Increase the range of plant foods eaten

Few of us eat as wide a variety of plant foods as these ancient hominids did. Modern hunter-gatherers also tend to eat a far wider range of plant foods than typical modern humans. Not only does eating a range of plant foods increase the range of phytonutrients ingested, but it also reduces exposure to the anti-nutrients found in any one plant source.

2. Eat seasonal food

This is really part of eating a wider range of plant foods, and means we give our body a rest from any anti-nutrients when that food is out of season. Another plus is that seasonal foods can be higher in nutrients than those grown out of season under artificial light: that’s why winter tomatoes and early-season strawberries often taste so insipid (taste being evidence of nutrients – read ‘The Dorito Effect’ for more on this amazing area of science).

3. Grow your own

Our ancient ancestors couldn’t preserve foods by canning and freezing. The freshest foods you can eat are those you have just picked from your own garden, minutes before eating them. Here at Rosemary Cottage we grow a lot of our own fruit, berries and veg (in fact we have a blog just about this here). They are packed with flavour and much higher in nutrients than supermarket varieties, which are often picked under-ripe and have sat around for a couple of days on the shelves, or have been flown halfway round the world in a low-oxygen ‘protective atmosphere’. If you haven’t got a garden or allotment you can buy living salads, mustard and cress, or growing herbs, which provide the same fresh-food benefits.

4. Eat wonky, small, damaged and organic veggies

Studies have shown that fruit and veg exposed to harsh environmental conditions often have higher levels of nutrients, as many of these compounds are primarily plant defence chemicals. The perfect, class 1 fruit and veg we are offered in the supermarket have been overly pampered, sacrificing nutrients for looks. Many of the phytonutrients in veggies are concentrated in the colourful skins. Cherry tomatoes therefore pack more nutrients per kilo than their larger cousins, as they have a larger surface area to volume ratio. Another benefit is that buying wonky veg increases farm profits and reduces food waste.
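The surface-area-to-volume point can be made concrete with a little geometry. Here is a rough Python sketch – my own illustration, approximating tomatoes as spheres with guessed radii (not measurements). For a sphere the ratio works out to 3/r, so halving the radius doubles the skin per unit of flesh:

```python
import math

def sa_to_volume_ratio(radius_cm: float) -> float:
    """Surface area / volume for a sphere of the given radius (units: 1/cm)."""
    surface = 4 * math.pi * radius_cm ** 2
    volume = (4 / 3) * math.pi * radius_cm ** 3
    return surface / volume  # algebraically this is just 3 / radius_cm

# Illustrative radii (my own guesses, not measured tomato sizes):
cherry = sa_to_volume_ratio(1.5)      # ~1.5 cm radius cherry tomato
beefsteak = sa_to_volume_ratio(5.0)   # ~5 cm radius beefsteak tomato

print(f"cherry tomato SA:V    = {cherry:.2f} per cm")
print(f"beefsteak tomato SA:V = {beefsteak:.2f} per cm")
print(f"the cherry carries {cherry / beefsteak:.1f}x more skin per unit of flesh")
```

On these assumed sizes the small fruit carries over three times as much skin per kilo, which is where the concentrated skin-bound phytonutrients come in.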

Organic fruit and veg tend to be less perfect and more blemished, usually class 2. Possibly for this reason, they often have higher levels of nutrients (European Journal of Clinical Nutrition).

5. Include close-to-wild foods

It is a fact that many modern fruit and veg have been bred to increase water, sugars and starches and to be less bitter or sour – all of which has diluted the phytonutrients. Consequently, some of the most nutrient-dense plant foods are those that have had the least selective breeding, such as the following.

  • Leaves: Watercress, rocket, parsley, purslane, coriander leaf, miner’s lettuce, samphire, seaweed, tea
  • Fruit: Blueberries, red and white currants, blackberries, raspberries, alpine strawberries, olives, capers, sour cherries
  • Seeds: All nuts and seeds, coffee
  • Roots: Salsify, scorzonera, oca, pink-fir apple potatoes, water chestnuts, tiger nuts
  • Shoots: Sprouted seeds, mustard and cress, bamboo shoots, asparagus, sprouting broccoli
  • Flowers: Artichokes, borage, nasturtium, calendula

Many of the above need only be eaten in relatively small quantities, as it is often the toxins in these plants that stimulate our immune system, so you don’t want to overdo them. (See our post: The chemical warfare on your plate). For example, the health benefits of tea and coffee seem to peak at 4 to 5 cups per day, and the benefits of tree nuts level off at 30g per day. In some cases overdoing it can actually lead to harm: spinach, for example, if consumed every day can lead to kidney stones due to its high oxalic acid content. Daily juicing of spinach is therefore unwise, despite ‘green smoothie’ proponents waxing lyrical about it. (Read here about some of the problems with oxalates.)

I’ve made a nice infographic of some wild-like foods you might want to try. Although they are not always easy to come by, I have seen all of these in supermarkets or farmers’ markets over the last year or so. I have several of them in my current garden, and have eaten all of them at one time or another. How many have you tried?

Final thoughts

A little thought about the paleo veg principles above makes it clear why paleo veganism must have been a rare or intermittent occurrence. Few paleo veggies contain sufficient calories to sustain life, and due to their anti-nutrients, eating them in large quantities or for prolonged periods could easily lead to problems. Furthermore, the wild foods that are sufficiently high in calories (nuts, seeds and some tubers) would need to be available in quantity, year round, or starvation would be a very real risk. Changing availability and seasonality mean it is unlikely our ancestors were vegans for extended periods, although there would no doubt have been times when animal food sources were limited and they would have been forced to get by on plants alone. In short – humans are, and have always been, highly adaptable omnivores.

Neanderthal Herbal Medicine

Our closest (extinct) cousins, the Neanderthals, are often thought of as thuggish and unsophisticated, but evidence over the last decade has begun to challenge this picture, indicating that they had a broad range of skills, knowledge and, yes, sensitivity.

There is a lot of evidence from bone assemblages that Neanderthals often behaved as top predators, hunting a wide range of animals including deer, rhinoceroses, bison and even brown bear. In this pursuit they were highly skilled and more successful than the hyenas with whom they competed, indicating a high level of strategic intelligence and cooperation.

  • Read more about Neanderthal hunting prowess here: ScienceDaily

As well as a good knowledge of animal behaviour, Neanderthals also used botanical materials. Skeletons excavated in the 1950s from Shanidar cave in northern Iraq indicate that Neanderthals buried their dead with flowers. These skeletons also showed evidence of injuries that had been tended and healed, indicating that the sick and wounded had been cared for effectively.

The Waterside Ape on BBC Radio 4


In case you missed it, this is an excellent introduction to the controversial theory that humans evolved in riverside / shoreline environments. Two programmes include contributions from Stephen Cunnane. RECOMMENDED.

GUEST POST from Miki Ben-Dor: “Big brains needed carbs” (???)

Here is the third in our series of posts considering the paper by Hardy et al. that made so many headlines in the media. This time we are pleased to feature a post by PhD candidate Miki Ben-Dor of Tel Aviv University, who brings his expertise in paleo-anthropology to the fore and questions Hardy’s idea that cooked starches promoted rapid brain growth 800 thousand years ago.

“Big brains needed carbs” (???)

“Big brains needed carbs” is the title of the press announcement that accompanied the publication of a paper by Hardy et al. that was slightly more mutedly titled “The Importance of Dietary Carbohydrate in Human Evolution” (1).

Miki Ben-Dor | Tel Aviv University

Miki Ben-Dor researches the connection between human evolution and nutrition throughout human prehistory. His primary Paleo blogging is directed at a Hebrew-reading audience, so we are pleased to share his English work further. His paper Man the fat hunter… in PLoS One, 2011, is a favourite of ours. His English blog Paleostyle is here.

I don’t think that big brains need carbs or that, until recently, carbs were important in human evolution so here is an initial rebuttal.

Several camps can be identified among those who think that it was indeed diet that made us human:

  • Stanford and Bunn’s “Hunting and meat” camp (2, 3),
  • Wrangham’s “Cooking and plant food” camp (4, 5),
  • Anton, Aiello and Ungar’s “Dietary flexibility” camp (6, 7),
  • Crawford’s “Omega 3” camp (8),
  • My lonely “Man the fat hunter” hut (9) at the corner of the “Hunting and meat” camp.

Hardy et al.’s hypothesis is a variation on Wrangham’s cooking hypothesis, which claims that the cooking of starch and meat 1.8 million years ago allowed the brain to enlarge and thus the Homo species to evolve. Hardy et al. push that date forward 1 million years, to 800 thousand years ago (kya).

Hardy et al. argue that multiplication of the salivary amylase AMY1 gene coincided with the invention of cooking 800 kya in order to facilitate the large consumption of carbohydrates that a growing brain needed. They conclude their abstract thus: “Although uncertainties remain regarding the antiquity of cooking and the origins of salivary amylase gene copy number variation, the hypothesis we present makes a testable prediction that these events are correlated”.

I will try to test the correlation in time between cooking and the multiplication of the AMY1 gene later, but let’s start by testing whether either of the two proposed events (AMY1 multiplication, cooking) happened 800 kya, when, as Hardy et al. point out, the rate of increase in brain size accelerated.

Chimpanzees, with only two copies of the AMY1 gene, don’t eat much starch and therefore have no need for large amounts of salivary amylase. As Perry et al. found (10), modern humans who eat plenty of starch have on average a higher number of copies than those who eat little starch. Additionally, there is an inverse relationship between the number of copies people carry and their chances of becoming diabetic or obese (11-13), so a high number of copies of the AMY1 gene seems to be a good indicator of genetic adaptation to the consumption of large quantities of starch.

So when did we start to accumulate more copies of the AMY1 gene? Hardy et al. state that although the exact date is unknown, it is thought to be less than 1 million years ago. In a 2014 paper, which Hardy et al. do not cite, Perry et al. (14) concluded that the duplication event occurred after the split between H. sapiens and Neandertals 550-590 kya. Furthermore, based on the similarity between the AMY1 copies, they raise the possibility that the multiplication occurred within the past 200 thousand years but prior to the origin of agriculture. So, in summary, it is quite likely, based on AMY1 studies, that humans began to consume large quantities of carbohydrates not 800 kya but after 550 kya, and more probably between 200 kya and 12 kya, after the brain had already reached its maximum size.

But what about cooking? Was cooking prevalent 800 kya and was the beginning of cooking necessarily associated with increased starch consumption?

To archaeologists, the control of fire is indicated by the existence of hearths, preferably with burnt bones in them and stone tools around them. There is no good evidence for hearths prior to 780 kya anywhere in the world. At 780 kya there is a single claim for habitual use of fire, at Gesher Benot Ya’aqov (GBY) in the Jordan valley, Israel (15). The problem with this site is that it is situated in an area that went through extensive volcanic activity and lava flows at that time. One such flow ran through the middle of the site and was actually used to date the site to 780 kya. One cannot rule out the possibility, therefore, that existing concentrations of organic material were burnt by the heat of the lava and formed a hearth-like artifact. In any event, the evidence for habitual use of fire at that period in hundreds of other sites around the world is null. In my opinion, the most important recent (2014) paper on the subject is Shimelmitz et al. (16). It is important because it examines a cave (Tabun in Israel) that was inhabited by three successive cultures from 800 kya to 100 kya, and you do not find many complete sequences like this. By examining bones and stone tools from all of the cave layers, Shimelmitz et al. concluded that fire was not used in the cave prior to 350 kya but was used continuously ever since. In summary, people at Tabun cave, from the same culture (Acheulian) and the same period (800-350 kya) and only 80 km away from the people of GBY, had not used fire at all, habitually or otherwise.

This result ties in perfectly with evidence from Qesem cave in Israel, which also provides a date of 350 kya for the extensive use of fire, by people from the culture (Acheulo-Yabrudian) that succeeded the one at GBY (17).

OK, so not 800 kya – but did the presumed AMY1/cooking duo take place around 350 kya?

First of all, it must be said that by 800 kya the size of the human brain had already doubled compared to Australopithecus, from 400 cc or so to approximately 800 cc. Following Hardy et al.’s hypothesis, humans didn’t really eat plenty of carbs until 800 kya. Since we are limited in our ability to convert protein to energy, the only possible logical conclusion is that prior to 800 kya the brain received its sugar from protein by way of gluconeogenesis, and that a large part of the rest of the energy was supplied by fat. The brain can live nicely on glucose from gluconeogenesis or on ketone bodies that the liver produces from fat. See the Maasai or the Inuit. In other words, if we accept Hardy et al.’s hypothesis, a high-protein, high-fat diet was the only possible diet during the first million-plus years of human evolution.

I do not claim that Paleolithic humans ate only protein and fat. Hardy is an expert in finding plant residues in dental calculus, and she indeed found them as early as 350 kya in Qesem Cave (18). However, when it comes to establishing the proportion of plant food in the diet, there are other methods, mainly using isotopes.

I looked at Hardy et al. for references to isotope research and found this: “…stable isotope analyses indicate a mainly carnivorous diet for Neanderthals; a wider range of isotopic values have been observed in contemporary Middle Pleistocene H. sapiens (Richards and Trinkaus 2009), indicating that considerable differences in the levels of starch consumption existed between these two species”.

I was a little puzzled, as I have cited this paper (19) in the past and could not recall any reference to starch in the diet of the examined population. The paper compared nitrogen isotope samples from 10 modern European humans from around 40-30 kya and 5 Neandertals. To cut a long story short, here is a quote from the paper’s summary:

“There are now enough isotopic data to see patterns in the data, and they show that the Neanderthals and early modern humans had similar dietary adaptations, obtaining most of their dietary protein from animals…”. Hmm… 1500 cc brains (of modern humans and Neandertals) obtaining most of their dietary protein, which comes with plenty of fat, from animals. Just to be sure – later, as we get close to agriculture and archaeological evidence indicates increased carb consumption, isotope studies do pick it up very nicely (20).

So we are 900 cc-brain humans that need to grow to 1500 cc, and the extra 600 cc of brain looks for energy. As an economist by training, I see a substantial problem with carbs as an economical solution to the problem. It transpires that collecting carbs by way of tubers and preparing them for consumption is around ten times less energy-efficient than hunting (21). With the men doing the hunting that provides glucose for the first 900 cc of brain volume, the collection and preparation of tubers must be done by the women, who can’t go hunting since they have to guard the young. Plant food is also seasonal and not necessarily found in the same patches where animals are found. In summary, it is difficult to see how carbs could be the solution. More fat, obtained more efficiently, seems to me a much more plausible solution. Presumably, a larger brain allows for smarter tracking of animals, which means a reduction in locomotion needs per pound of flesh, and thus achieving the extra energy more efficiently.

There is plenty of other evidence that humans were basically carnivores until 30-20 kya, just prior to the agricultural revolution. Some of the evidence is shown in my AHS13 presentation, which can be found here. If true, the evidence points to a 300 ky gap between the control of fire and the consumption of large quantities of starch (and hence the addition of multiple copies of AMY1), so synchronicity between the two also seems quite unlikely.

Be well

Miki Ben-Dor

1. Hardy K, Brand-Miller J, Brown K, et al.; The importance of dietary carbohydrate in human evolution. The Quarterly Review of Biology 2015;90(3):251-267.

2. Dominguez-Rodrigo M, Bunn HT, Mabulla AZP, et al.; On meat eating and human evolution: A taphonomic analysis of BK4b (Upper Bed II, Olduvai Gorge, Tanzania), and its bearing on hominin megafaunal consumption. Quaternary International 2014;322:129-152. doi: 10.1016/j.quaint.2013.08.015.

3. Stanford CB, Bunn HT; Meat-eating & human evolution. Oxford University Press, 2001.

4. Wrangham RW, Jones JH, Laden G, et al.; The raw and the stolen. Current Anthropology 1999;40:567-594.

5. Wrangham R, Carmody R; Human adaptation to the control of fire. Evolutionary Anthropology: Issues, News, and Reviews 2010;19(5):187-199.

6. Antón SC, Potts R, Aiello LC; Evolution of early Homo: An integrated biological perspective. Science 2014;345(6192):1236828.

7. Ungar PS, Grine FE, Teaford MF; Diet in early Homo: a review of the evidence and a new model of adaptive versatility. Annu. Rev. Anthropol. 2006;35:209-228.

8. Cunnane SC, Crawford MA; Energetic and nutritional constraints on infant brain development: implications for brain expansion during human evolution. Journal of Human Evolution 2014;77:88-98.

9. Ben-Dor M, Gopher A, Hershkovitz I, et al.; Man the fat hunter: the demise of Homo erectus and the emergence of a new hominin lineage in the Middle Pleistocene (ca. 400 kyr) Levant. PLoS One 2011;6(12):e28689. doi: 10.1371/journal.pone.0028689.

10. Perry G, Dominy N, Claw K, et al.; Diet and the evolution of human amylase gene copy number variation. Nature Genetics 2007.

11. Carpenter D, Dhar S, Mitchell LM, et al.; Obesity, starch digestion and amylase: association between copy number variants at human salivary (AMY1) and pancreatic (AMY2) amylase genes. Human Molecular Genetics 2015;24(12):3472-3480.

12. Falchi M, Moustafa JSE-S, Takousis P, et al.; Low copy number of the salivary amylase gene predisposes to obesity. Nature Genetics 2014;46(5):492-497.

13. Mandel AL, Breslin PA; High endogenous salivary amylase activity is associated with improved glycemic homeostasis following starch ingestion in adults. The Journal of Nutrition 2012;142(5):853-858.

14. Perry GH, Kistler L, Kelaita MA, et al.; Insights into hominin phenotypic and dietary evolution from ancient DNA sequence data. Journal of Human Evolution 2015;79:55-63.

15. Alperson-Afil N; Continual fire-making by hominins at Gesher Benot Ya’aqov, Israel. Quaternary Science Reviews 2008;27:1733-1739.

16. Shimelmitz R, Kuhn SL, Jelinek AJ, et al.; ‘Fire at will’: The emergence of habitual fire use 350,000 years ago. Journal of Human Evolution 2014;77:196-203.

17. Shahack-Gross R, Berna F, Karkanas P, et al.; Evidence for the repeated use of a central hearth at Middle Pleistocene (300 ky ago) Qesem Cave, Israel. Journal of Archaeological Science 2014;44:12-21.

18. Hardy K, Radini A, Buckley S, et al.; Dental calculus reveals potential respiratory irritants and ingestion of essential plant-based nutrients at Lower Palaeolithic Qesem Cave, Israel. Quaternary International 2015.

19. Richards MP, Trinkaus E; Isotopic evidence for the diets of European Neanderthals and early modern humans. Proceedings of the National Academy of Sciences 2009;106(38):16034-16039.

20. García-González R, Carretero JM, Richards MP, et al.; Dietary inferences through dental microwear and isotope analyses of the Lower Magdalenian individual from El Mirón Cave (Cantabria, Spain). Journal of Archaeological Science 2015.

21. Stiner MC, Kuhn SL; Paleolithic diet and the division of labor in Mediterranean Eurasia. In: Hublin J-J, Richards MP (eds). The Evolution of Hominid Diets: Integrating Approaches to the Study of Palaeolithic Subsistence. Springer, 2009, 155-168.



Did cooked tubers drive human evolution?

Following last week’s post on the misrepresentation of the Paleo diet in the press, it’s time to examine in more detail the claims of the scientists that sent the newspapers into such a feeding frenzy.

The recent paper entitled “The Importance of Dietary Carbohydrate in Human Evolution” by Hardy et al. argues that the rapid expansion of the human brain 800,000 years ago was fuelled by the consumption of cooked starchy tubers.

The paper’s hypothesis is based around four key arguments:

  1. Brains need glucose, and starchy foods are an abundant source of glucose
  2. Starchy tubers are a reliable food resource available year round in Africa
  3. Cooking transforms starches in tubers making them far more digestible, and
  4. Humans are adapted to eating starches. Unlike other primates, they carry multiple copies of the salivary amylase genes.

I’ll look at each of these four points in turn.

1. Do brains ‘need glucose’?

That ‘brains need glucose’ is, of course, physiologically true. The brain is indeed dependent on a steady supply of glucose. The problem with this oft-quoted factoid is that it is used to support a seemingly logical but false conclusion: ‘…so we need to eat carbohydrates’. As Hardy’s paper acknowledges, the human body is perfectly capable of manufacturing all the glucose it needs from fats and proteins by the process of gluconeogenesis (literally, making new glucose). This process is ramped up whenever carbohydrate intake is limited. How else would it be possible for humans to survive famine for weeks on end if their brains were dependent on a constant supply of external carbohydrates as a glucose source? To get round this inconvenient truth, Hardy et al. argue that gluconeogenesis is inefficient, so humans would have preferred concentrated sources of glucose from starchy tubers. This may be a valid argument, but to my mind it suffers from several weaknesses, especially for explaining brain evolution, in part because a very low-carb (‘ketogenic’) diet has been shown time and again to protect the brain.

A ketogenic diet is one in which carbohydrate intake is severely restricted (e.g. to less than 30g of carbs per day). In such circumstances, which mimic starvation, the body not only manufactures its own glucose in precisely the appropriate amount, but also produces ketones by breaking down dietary or body fat. This process kicks in during starvation and fasting, or when an individual eats a very low-carb, high-fat ketogenic diet.

The brain is perfectly happy to run on a 50/50 mix of ketones and glucose during such times. Indeed, this reduced-glucose state appears to provide multifactorial neurological protection: a ketogenic diet is one of the primary treatments for epilepsy and is currently under trial as an adjunct to cancer therapy (especially for glioblastoma).

Here are just a few of many papers on the subject if you want to look into it in more depth:

With a low-carb, low-glucose, ketogenic state proving itself so spectacularly protective in neurological problems and for recovery from brain injury, trauma and stroke, is it likely that a high-carb diet was a key driver of brain evolution?

Another point against Hardy’s hypothesis is the unique nature of human infant birth – our large-brained babies have a layer of subcutaneous fat, unique among primates, that provides ketones for brain fuel before, during and after birth (Cunnane & Crawford, 2014).

2. What are African wild tubers really like?

One of Hardy’s central arguments is that cooking significantly increased the digestibility of starch-rich tubers, releasing more glucose for brain evolution. By way of example she states that cooking potatoes increases the digestibility of the starch ‘by up to twenty fold’. That’s impressive, but potatoes and other modern root vegetables have undergone artificial selection to create the easily digestible varieties we know today. Potatoes are indigenous to the Americas, and are toxic if eaten raw, so are probably not a good model of the tubers available during hominid evolution in Africa.

Are potatoes and other modern root vegetables anything like the tubers that would have been available to humans during the period of brain evolution in Africa?

A key question, then, is whether Hardy’s assertions hold true for the wild tuber species typical of African ecosystems. Luckily we have some new data with which to test her hypothesis.

In a paper published earlier this year in the American Journal of Physical Anthropology, Schnorr et al. assessed the digestibility of wild tubers exploited by the Hadza foragers of Tanzania. After observing the Hadza methods of cooking and eating wild tubers, they took samples to the lab, where they simulated Hadza cooking, mastication and digestion in the mouth, stomach and small intestine. Their findings are interesting, challenge several of Hardy’s assertions, and show just how different such tubers are from modern cultivated root vegetables.

Consider how the Hadza eat these wild tubers:

First they roast them in open fires, one end at a time, turning them occasionally, for a total of 5 to 15 minutes of cooking. Next they peel them, then bite off a piece and chew it for anywhere between thirty seconds and three minutes (just pause a moment to consider that!) before spitting out a wad of fibres called a quid. Depending on the particular tuber, the edible fraction varies from 20% to 80%, with the remainder being inedible fibre or peel. It is clear from this that African wild tubers are not remotely like any modern root vegetable – especially potatoes.


Hadza man roasting ekwa (Vigna frutescens) tubers – typical of those available during hominid evolution. Recent analysis by Schnorr et al. found that only a small fraction of the peeled tubers’ glucose content (26 ± 8%) was available to digestion. Image courtesy Gary Aitken 2014

For the purposes of comparison, I’ve cobbled together some approximate available-glucose values for the Hadza tubers and for common supermarket tubers (in both cases fructose content has been excluded, as both Hardy’s and Schnorr’s papers focus solely on glucose). The quantities for the Hadza tubers are based on 100g of edible tuber after peeling and discarding the fibrous quid, meaning that the glucose would actually be even more dilute than these figures suggest if they were for the whole tuber. You can see that these Hadza tubers do not come close to the carbohydrate density of modern starchy vegetables like potatoes, being closer to carrots.

Table 1. Available glucose and fibre content of Hadza v Modern tubers (g/100g)

Hadza tubers          Glucose   Fibre*     Modern tubers   Glucose   Fibre
E. entennulifa           2        10       Carrot             4        3
V. pseudolablab          2        15       Parsnip            9        5
V. frutescens            3        20       Potato            17        2
I. transvaalensis        8         1       Sweet potato      18        3

* Approximate fibre content of edible portion of Hadza tubers

What is clear from these data is that the wild Hadza tubers are significantly lower in glucose and higher in fibre than modern root vegetables. The highest glucose level among the Hadza tubers is comparable with parsnips – which are hardly ‘starchy’. Potatoes – which definitely are starchy – are the least similar to wild African tubers.
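One point that is easy to miss: the table values are per 100 g of edible portion, so per whole tuber the glucose is more dilute still. Here is a rough Python sketch of that dilution – my own illustration; the glucose values come from the table above and the 20-80% range from the Hadza eating description, but applying the whole range to every species is an assumption, not data from Schnorr et al.:

```python
# Glucose (g per 100 g of edible portion), from the table above.
edible_glucose = {
    "E. entennulifa": 2,
    "V. pseudolablab": 2,
    "V. frutescens": 3,
    "I. transvaalensis": 8,
}

def whole_tuber_glucose(per_100g_edible: float, edible_fraction: float) -> float:
    """Glucose per 100 g of WHOLE tuber, diluted by the inedible fibre/peel share."""
    return per_100g_edible * edible_fraction

# Bracket each species with the reported 20-80% edible-fraction range
# (an illustrative assumption, since fractions vary by species):
for species, glucose in edible_glucose.items():
    low = whole_tuber_glucose(glucose, 0.2)   # 20% edible
    high = whole_tuber_glucose(glucose, 0.8)  # 80% edible
    print(f"{species}: {low:.1f}-{high:.1f} g glucose per 100 g whole tuber")
```

Even on the most generous assumption, the richest Hadza tuber yields well under half the glucose of a potato (17 g/100 g) once the inedible fraction is accounted for.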

3. Does cooking significantly affect African tuber digestibility?

So what about Hardy’s hypothesis that cooking tubers increased the bio-availability of glucose, giving humans a boost in the brain-development game? Schnorr et al. found that although the Hadza roasted their tubers, this did little to improve digestibility. The main benefit of roasting seemed to be to assist with peeling. Of the four tuber species eaten by the Hadza, two showed no change in glucose availability; in a third the available glucose increased, from 38% to 48%; whereas in the fourth species it decreased, from 44% to 34%. This makes Hardy’s hypothesis look quite shaky.

Furthermore, the authors estimated that cooking and chewing liberated only 1/3 to 2/3 of the glucose content of the tubers, and they classified them as “very resistant to digestion”. By contrast, a modern baked potato requires only seconds of chewing and will be almost 100% digested, spiking blood glucose levels dramatically within minutes of ingestion. Yet potatoes were used extensively in the newspaper articles reporting on Hardy’s hypothesis.

It is worth considering what happens to the remaining 1/3 to 2/3 of undigested starch in the Hadza tubers. It is not necessarily lost: on reaching the colon, bacteria get to work on it, converting it into easily absorbed short-chain fatty acids – fats, that is. However, Hardy’s hypothesis emphasises tuber eating as a plentiful supply of glucose, not fatty acids, to fuel hominid brain expansion. What the Hadza study shows is that their tubers are a low-glycaemic food, in no way comparable to modern starchy tubers like potatoes. There is actually no contemporary food with which to compare them – we just don’t eat anything so fibrous.
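To see how little glucose these tubers actually deliver, it helps to multiply glucose content by the fraction digested. A minimal sketch using the figures quoted above – roughly 3g glucose per 100g for a typical Hadza tuber with only 1/3 to 2/3 liberated, versus 17g per 100g for potato at close to 100%:

```python
# Effective glucose yield = glucose content x fraction actually digested.
# Hadza tuber figures are mid-range values from the text; the potato
# is treated as essentially fully digested.
def absorbed(glucose_g, fraction):
    return glucose_g * fraction

hadza_low = absorbed(3, 1 / 3)   # worst case: one third liberated
hadza_high = absorbed(3, 2 / 3)  # best case: two thirds liberated
potato = absorbed(17, 1.0)

print(f"Hadza tuber:  {hadza_low:.1f}-{hadza_high:.1f} g glucose per 100 g")
print(f"Baked potato: ~{potato:.0f} g glucose per 100 g")
```

In other words, 100g of wild tuber delivers perhaps 1–2g of glucose to the bloodstream, while the same weight of baked potato delivers nearly ten times that.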

A further interesting finding in the Hadza tuber study is summarised in this graph:


Schnorr et al found that larger tubers – those with the highest levels of starch (horizontal axis) – were the most resistant to digestion (vertical axis). They concluded that the human digestive system is simply unable to cope with large quantities of these wild tubers at one sitting.

This means that absorption of glucose may be inhibited by the higher glucose content, such as found in starch, since these glucose polymers can resist digestion or overwhelm the enzyme activity in the small intestine. Therefore, a “glucose”-rich tuber does not necessarily mean more glucose, proportionally, is directly available to the consumer.

If larger roots provide a lower percentage of available glucose, Hardy’s hypothesis appears less tenable still, relying as it does on the widespread availability of starchy roots to fuel an increasingly metabolically expensive brain. In contradistinction, Schnorr’s work suggests humans have a physiological limit on the quantity of wild tubers they can digest.

4. What is the significance of human salivary amylase adaptations?

At some point in human evolution mutations took place resulting in multiple copies of the genes for salivary amylase, the enzyme that breaks down starch. It had been assumed that this took place during the neolithic switch to agriculture as an adaptation to the new staple – starch rich cereal grains. However, recent evidence indicates this adaptation goes back further in human evolution, although how far back remains a matter of debate.

Hardy et al suggest the multiple amylase gene copy number mutation arose in conjunction with human exploitation of cooked tubers.

We propose that after cooking became widespread, starch digestion became the rate-limiting step in starch utilization, and the coevolution of cooking and copy number variation (CNV) in the [amylase] gene(s) increased availability of preformed dietary glucose, permitting the acceleration in brain size increase observed from the Middle Pleistocene onward. – Hardy et al, 2015

This may or may not turn out to be correct. Even if it does, however, it does little to rescue Hardy’s hypothesis because, as we have seen, these tubers have to be roasted, then masticated for minutes, to extract minimal amounts of glucose. Rather than being a driving force for brain expansion, exploitation of tubers looks more like a diversification strategy – a valuable source of low-glycaemic calories that would have helped meet daily caloric needs when no better sources of nutrition were available, not the massive brain-fuelling dose of glucose we associate with modern starchy vegetables.

Tubers are a reliable fallback food


Hadza food preferences, by sex. From Marlowe & Berbesque Tubers as fallback foods and their impact on Hadza hunter-gatherers. American Journal of Physical Anthropology, 2009

Hardy et al argue that tubers would have been a highly prized food for our ancestors; however, this does not appear to be the case among modern hunter-gatherers. For example, the Hadza rely on five main categories of food: tubers, berries, meat, baobab, and honey. In a 2009 study in the American Journal of Physical Anthropology, Marlowe & Berbesque found that the Hadza rated tubers as the least preferred of all these foods.

As such, tubers are seen very much as a fallback food – something to eat when there is not enough of the good stuff, to top up calories, or to avoid starving.

The use of tubers as a fallback food is also observed in primates, which will resort to collecting underground storage organs on land or from water plants when preferred foods are in short supply (Savanna chimpanzees use tools to harvest the underground storage organs of plants, R. Adriana Hernandez-Aguilar et al, 2007).

Schnorr et al also found that, on average, tubers made up only about 20% by weight of the food brought back to camp each day. The caloric contribution of tubers is not well established, but is likely to be low. Indeed, Schnorr and his team found that the most commonly eaten Hadza tubers were 80–90% water by weight, indicating that they make only a small contribution to daily calories – perhaps as little as 5 or 10%. They suggest these watery tubers “may actually be more important for their moisture rather than caloric contribution”.
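A back-of-envelope calculation shows why such watery tubers contribute so few calories. The water fraction comes from the study and the glucose figure from Table 1; the rest – treating the remaining dry matter as largely indigestible fibre, and glucose at 4kcal/g – are my own simplifying assumptions, illustrative only:

```python
# Why an 85%-water tuber yields so little energy per 100 g eaten.
water_fraction = 0.85      # mid-range of the 80-90% reported
glucose_per_100g = 3       # g available glucose, typical of Table 1
kcal_per_g_glucose = 4     # standard energy value for carbohydrate

dry_matter = 100 * (1 - water_fraction)                # g dry matter per 100 g
kcal_per_100g = glucose_per_100g * kcal_per_g_glucose  # directly available energy

print(f"Dry matter: {dry_matter:.0f} g per 100 g tuber")
print(f"Directly available energy: ~{kcal_per_100g} kcal per 100 g")
```

That works out to only around a dozen kcal per 100g absorbed directly as glucose – an order of magnitude below a baked potato – before counting whatever the colonic bacteria salvage from the fibre.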

So where does this leave the hypothesis of Hardy and her team? Taking account of all the above factors, and realising that wild African tubers are nothing like the familiar spud, the idea that cooked starchy tubers drove human brain evolution starts to look far less plausible. Certainly, tubers have always been widely available – but apparently not preferred, at least if the Hadza are anything to go by.

Their primary role as a food of last resort makes sense once their low-glucose, high-fibre nature is understood and the image of the modern potato is banished from consideration. The effects of cooking on starches and the salivary amylase adaptations – assuming they did indeed arise early in human brain expansion – begin to look less like evidence for a key driving force in human evolution and more like adaptations for survival in extremis, when higher quality foods were not available.

Rather than identifying the driving force behind human brain expansion, Hardy et al‘s hypothesis might well come to be seen as describing little more than a peripheral aspect of dietary adaptation.

An alternative hypothesis on the role of wild tubers in human brain evolution

As you can see from the graph at the top of this post, human brain expansion began two million years earlier than Hardy’s date of 800,000 years ago. Isotopic evidence from that period shows that early humans (Australopithecus) had ‘C4 plants’ in their diet, which primarily indicates grasses and sedges (Lee-Thorp J et al, 2012).
This is distinct from other primates of the time, which were eating mainly C3 plants – indicative of leaves, nuts and fruits from forests. During this period a drier climate was creating more grassland, and early humans seem to have taken advantage of this new savanna environment. However, there are two ways the C4 isotope could appear in Australopiths: either they were eating grasses directly, or they were eating herbivores that ate those grasses. Quite possibly both took place.

What is not likely, however, is that early humans ate the foliage or seeds of grasses – these being virtually indigestible. Rather, they were eating the underground storage organs (bulbs and tubers) of various species of sedge (Laden G, Wrangham R, 2005). Sedges grow in and around water, and often have tubers which can easily be pulled from the soft mud at the bottom of ponds or riversides (see image, right).

Although small, they can be collected year round and, importantly, can be eaten raw. This makes tuber eating a possible factor in early human brain expansion, prior to the domestication of fire.

You may already have eaten such tubers yourself. The most common culinary varieties are water chestnuts (Eleocharis dulcis) – popular in oriental cooking – and the less well known tiger nuts (Cyperus esculentus).

Water chestnuts and tiger nuts are quite high in digestible carbohydrates and fibre, and along with similar species present a more realistic food source for early humans. Indeed, it has been estimated that just 150–200g of tiger nuts contains sufficient essential fats to satisfy human needs (Nathaniel J. Dominy, ‘Hominins living on the sedge’, 2012). The fact that other primates use them as fallback foods suggests that on their own they are unlikely to account for the unique evolutionary force that drove human brain expansion over the subsequent 2.5 million years, but they may have contributed to getting the process started.


Namibian boy gathering frogs for cooking.

A much more exciting hypothesis is that early humans specialised in water resources of all kinds. These would have provided fish and shellfish, along with water tubers from sedges such as Cyperus.

Broader water resource utilisation may also explain why humans became bipedal – to wade (Kuliukas, 2002). Whilst sedge tubers may have provided glucose to fuel the early brain, we have seen that this is not necessarily a prerequisite for brain expansion, as gluconeogenesis and ketogenesis can provide all the fuel a human brain needs. More important for brain development is access to sources of brain-building fatty acids and minerals (‘brain selective nutrients’) such as iodine, iron, zinc, selenium and omega-3 fatty acids (especially DHA) – dietary components that in all other land mammals place significant limits on the relative size of their brains.

A shore-based diet, i.e., fish, molluscs, crustaceans, frogs, bird’s eggs and aquatic plants, provides the richest known dietary sources of brain selective nutrients. Regular access to these foods by the early hominin lineage that evolved into humans would therefore have helped free the nutritional constraint on primate brain development and function.
– Cunnane & Crawford, 2014

Addendum. Why tubers are OK as part of an Ancestral or Paleo diet.

The arguments above are intended to show that the evolution of the human brain probably did not depend on access to cooked starchy tubers, as Hardy et al have claimed. However, I do not believe there is anything wrong with eating root vegetables, and I would recommend them as part of a healthy modern Paleo diet, especially with some tiger nuts and water chestnuts thrown in. Our supermarket root vegetables are still real, unprocessed foods – much closer to wild tubers than virtually any other modern carbohydrate source. Selective breeding has produced roots that are far more appetising and less fibrous than their wild counterparts, so we may regard them as somewhat better than a fallback food. That said, I would have to be in dire straits before trying to survive on carrots alone, and although I like my meat and two veg, I suspect my preference chart would not look too dissimilar to the Hadza’s!


Villagers gather to share their crop of sweet potatoes which have been grown for hundreds of years in the highlands of Papua New Guinea

Tubers such as sweet potatoes and taro are staple foods of some very healthy primitive agriculturalist tribes such as the Kitavans, the Tukisenta and the highlanders of Papua New Guinea. These people eat high-carbohydrate diets based around starchy tubers, yet suffer none of the western diseases of civilisation when they stick to their traditional diets. However, these examples do nothing to rescue Hardy’s hypothesis, as the agricultural practices necessary to grow such crops have been around for at most 10,000 years. Without cultivation, such tubers could only ever have provided a minor part of the diet. That said, they do point to the ability of humans to thrive on a high-carbohydrate diet where the carbs come primarily from unrefined starchy vegetables. The key difference between such ancestral diets and modern western diets may well be the dominance of foods based on refined grains, which have far higher carbohydrate densities, as illustrated in the charts below.

The bottom line is, ancestral and paleo foods have always included tubers, root vegetables and bulbs. However, it seems likely that other dietary factors were responsible for the evolution of the unique human brain, and we would do well to pay attention to those foods first: fish, shellfish, eggs… and frogs!

Seafood, Sex and Evolution (3 videos)

This talk was originally given at The Bassil Shippam Centre, Chichester, West Sussex, England on Saturday 18th July 2015. A live audio recording of the event has been combined with the original presentation slides to create the videos below. A few additional slides have been included for clarity or to illustrate answers to audience questions.

We explore the role of seafood in evolutionary and contemporary health. Groundbreaking research is presented in a clear, accessible and entertaining manner. Ideas covered include:

  • The intimate role of fish and shellfish in human evolution
  • How key nutrients needed for brain development are found in seafood
  • How to choose seafood to maximise the health benefits and avoid contaminants
  • How human female curvaceousness and baby fat are unique in the animal kingdom and the remarkable link between these and seafood

[Credits: Music "Equinox", courtesy of longzijun]

Feedback from attendees at the original talk

Hi Afifah and Keir,
Thank you very much for another spellbinding talk!  I learnt so much – literally food for thought.  Thanks for all the effort you put into the research and presentation.
– Rhiannon
Dear Afifah,
I wanted to thank you and your partner for giving such an interesting talk about seafood on Saturday. I particularly liked the way you combined the history of human evolution with the biochemistry.
– David
Dear Afifah and Keir,
What an enlightening evening ! We always learn so much at your talks. I love the thoroughness of your research and the way your conclusions are so irresistably feasible!! I concentrated on every fact ( but can’t remember all if it now! ). It was great and really makes you think- which I like very much …
Must do the squid ‘pasta’ again. Also inspired by the recipes on your blog and must do the salmon and mushroom one.
– Jane and Michael
Hi Afifah and Keir,
Many congratulations on your talk last Saturday – you certainly excelled yourselves and my 5 pages of notes are testimony to that!!
– Bill

After watching the videos above, please leave your own feedback and comments below. Thanks!

Milk & Alcohol – was the good Doctor on to something?

In 1978 when Nick Lowe and John Mayo penned the lyrics to Dr Feelgood’s “Milk & Alcohol” I doubt they had any inkling of the nutritional significance of these two Neolithic beverages, or the dramatic effect that their adoption had on the course of human evolution. Recent research, however, indicates that in some ways they have quite opposite effects on our health… Continue reading