
Facultative Carnivore Reasons

phylogeny-feeding-time

Smaller molars and far less time spent feeding than body size predicts suggest that meat eating played a causal role in our evolution.

Phylogenetic rate shifts in feeding time during the evolution of Homo

Starting with Homo erectus, humans evolved smaller molars and began spending far less time feeding than body mass and phylogeny with other apes would predict (4.7% instead of a predicted 48% of daily activity in Homo sapiens) [Organ et al. 2011].



Chris Organ, Charles L. Nunn, Zarin Machanda, and Richard W. Wrangham


Abstract

Unique among animals, humans eat a diet rich in cooked and nonthermally processed food. The ancestors of modern humans who invented food processing (including cooking) gained critical advantages in survival and fitness through increased caloric intake. However, the time and manner in which food processing became biologically significant are uncertain. Here, we assess the inferred evolutionary consequences of food processing in the human lineage by applying a Bayesian phylogenetic outlier test to a comparative dataset of feeding time in humans and nonhuman primates. We find that modern humans spend an order of magnitude less time feeding than predicted by phylogeny and body mass (4.7% vs. predicted 48% of daily activity). This result suggests that a substantial evolutionary rate change in feeding time occurred along the human branch after the human–chimpanzee split. Along this same branch, Homo erectus shows a marked reduction in molar size that is followed by a gradual, although erratic, decline in H. sapiens. We show that reduction in molar size in early Homo (H. habilis and H. rudolfensis) is explicable by phylogeny and body size alone. By contrast, the change in molar size to H. erectus, H. neanderthalensis, and H. sapiens cannot be explained by the rate of craniodental and body size evolution. Together, our results indicate that the behaviorally driven adaptations of food processing (reduced feeding time and molar size) originated after the evolution of Homo but before or concurrent with the evolution of H. erectus, which was around 1.9 Mya.
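The abstract's "order of magnitude" claim can be checked directly from its percentages. A minimal sketch, assuming (hypothetically) a 12-hour daily activity period to convert the shares into hours; that period is my assumption, not a figure from the paper:

```python
# Convert Organ et al.'s feeding-time percentages into rough hours per day.
# The 12-hour activity period is an assumed figure, not from the paper.

active_hours = 12.0   # assumed daily activity period (hypothetical)

observed = 0.047      # humans feed 4.7% of daily activity (from abstract)
predicted = 0.48      # phylogeny + body mass predicts 48% (from abstract)

print(f"observed:  {observed * active_hours:.1f} h/day")   # ~0.6 h/day
print(f"predicted: {predicted * active_hours:.1f} h/day")  # ~5.8 h/day
print(f"ratio: {predicted / observed:.1f}x")               # ~10x: an order of magnitude
```

Whatever activity period is assumed, the ratio of predicted to observed feeding time is fixed by the two percentages at roughly tenfold.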

chewing-meat

Smaller teeth and jawbones, reduced chewing muscles, and weaker bite force indicate a shift to animal-source foods

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

Furthermore, the shift from fibrous plants toward animal-source foods (ASFs), together with the use of tools, paralleled a decrease in tooth and jawbone size, a reduction in chewing muscles, and weaker maximum bite force capabilities [Teaford & Ungar 2000; Zink & Lieberman 2016].

Diet and the evolution of the earliest human ancestors

Mark F. Teaford and Peter S. Ungar

Abstract

Over the past decade, discussions of the evolution of the earliest human ancestors have focused on the locomotion of the australopithecines. Recent discoveries in a broad range of disciplines have raised important questions about the influence of ecological factors in early human evolution. Here we trace the cranial and dental traits of the early australopithecines through time, to show that between 4.4 million and 2.3 million years ago, the dietary capabilities of the earliest hominids changed dramatically, leaving them well suited for life in a variety of habitats and able to cope with significant changes in resource availability associated with long-term and short-term climatic fluctuations.

Since the discovery of Australopithecus afarensis, many researchers have emphasized the importance of bipedality in scenarios of human origins (1, 2). Surprisingly, less attention has been focused on the role played by diet in the ecology and evolution of the early hominids. Recent work in a broad range of disciplines, such as paleoenvironmental studies (3, 4), behavioral ecology (5), primatology (6), and isotope analyses (7), has rekindled interest in early hominid diets. Moreover, important new fossils from the early Pliocene raise major questions about the role of dietary changes in the origins and early evolution of the Hominidae (8–10). In short, we need to focus not just on how the earliest hominids moved between food patches, but also on what they ate when they got there.

This paper presents a review of the fossil evidence for the diets of the Pliocene hominids Ardipithecus ramidus, Australopithecus anamensis, Australopithecus afarensis, and Australopithecus africanus. These hominids offer evidence for the first half of human evolution, from our split with prehistoric apes to the earliest members of our own genus, Homo. The taxa considered are viewed as a roughly linear sequence from Ardipithecus to A. africanus, spanning the time from 4.4 million to 2.5 million years ago. As such, they give us a unique opportunity to examine changes in dietary adaptations of our ancestors over nearly 2 million years. We also trace what has been inferred concerning the diets of the Miocene hominoids to put changes in Pliocene hominid diets into a broader temporal perspective. From such a perspective, it becomes clear that the dietary capabilities of the early hominids changed dramatically in the time period between 4.4 million and 2.3 million years ago. Most of the evidence has come from five sources: analyses of tooth size, tooth shape, enamel structure, dental microwear, and jaw biomechanics. Taken together, they suggest a dietary shift in the early australopithecines, to increased dietary flexibility in the face of climatic variability. Moreover, changes in diet-related adaptations from A. anamensis to A. afarensis to A. africanus suggest that hard, abrasive foods became increasingly important through the Pliocene, perhaps as critical items in the diet.

Impact of meat and Lower Palaeolithic food processing techniques on chewing in humans

Katherine D. Zink & Daniel E. Lieberman

Abstract
The origins of the genus Homo are murky, but by H. erectus, bigger brains and bodies had evolved that, along with larger foraging ranges, would have increased the daily energetic requirements of hominins [1,2]. Yet H. erectus differs from earlier hominins in having relatively smaller teeth, reduced chewing muscles, weaker maximum bite force capabilities, and a relatively smaller gut [3–5]. This paradoxical combination of increased energy demands along with decreased masticatory and digestive capacities is hypothesized to have been made possible by adding meat to the diet [6–8], by mechanically processing food using stone tools [7,9,10], or by cooking [11,12]. Cooking, however, was apparently uncommon until 500,000 years ago [13,14], and the effects of carnivory and Palaeolithic processing techniques on mastication are unknown. Here we report experiments that tested how Lower Palaeolithic processing technologies affect chewing force production and efficacy in humans consuming meat and underground storage organs (USOs). We find that if meat comprised one-third of the diet, the number of chewing cycles per year would have declined by nearly 2 million (a 13% reduction) and total masticatory force required would have declined by 15%. Furthermore, by simply slicing meat and pounding USOs, hominins would have improved their ability to chew meat into smaller particles by 41%, reduced the number of chews per year by another 5%, and decreased masticatory force requirements by an additional 12%. Although cooking has important benefits, it appears that selection for smaller masticatory features in Homo would have been initially made possible by the combination of using stone tools and eating meat.
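The abstract's chewing figures imply a rough annual baseline. A back-of-envelope sketch follows; the derived baseline, and the assumption that the further 5% saving is taken against that same baseline, are my inferences rather than figures stated in the paper:

```python
# Back-of-envelope from Zink & Lieberman's figures. The derived baseline
# chew count is inferred from the abstract, not stated in the paper.

chews_saved = 2_000_000   # "nearly 2 million" fewer chews/year with 1/3 meat
meat_reduction = 0.13     # ...which the abstract calls a 13% reduction

# If ~2 million chews is 13% of the all-plant baseline:
baseline = chews_saved / meat_reduction            # ~15.4 million chews/year

# Slicing meat and pounding USOs saves "another 5%" (assumed here to mean
# 5% of the same baseline):
processing_reduction = 0.05
remaining = baseline * (1 - meat_reduction - processing_reduction)

print(f"baseline:  {baseline / 1e6:.1f} M chews/year")   # ~15.4 M
print(f"remaining: {remaining / 1e6:.1f} M chews/year")  # ~12.6 M
```

On these assumptions, meat plus simple mechanical processing cuts nearly a fifth of an all-plant diet's annual chewing effort.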

meat-and-nicotinamide

Meat and Nicotinamide

Meat and Nicotinamide: A Causal Role in Human Evolution, History, and Demographics

Adrian C. Williams and Lisa J. Hill, 2017

Hunting for meat was a critical step in all animal and human evolution. A key brain-trophic element in meat is vitamin B3 / nicotinamide. The supply of meat and nicotinamide steadily increased from the Cambrian origin of animal predators, ratcheting ever larger brains. This culminated in the 3-million-year evolution of Homo sapiens and our overall demographic success. We view human evolution, recent history, and agricultural and demographic transitions in the light of meat and nicotinamide intake. A biochemical and immunological switch is highlighted that affects fertility in the ‘de novo’ tryptophan-to-kynurenine-nicotinamide ‘immune tolerance’ pathway. Longevity relates to nicotinamide adenine dinucleotide consumer pathways. High meat intake correlates with moderate fertility, high intelligence, good health, and longevity, with consequent population stability, whereas low meat/high cereal intake (short of starvation) correlates with high fertility, disease, and population booms and busts. Too high a meat intake and fertility falls below replacement levels. Reducing variances in meat consumption might help stabilise population growth and improve human capital.

Meat, NAD, and Human Evolution

Archaeological and palaeontological evidence indicates that hominins increased meat consumption and developed the necessary fabricated stone tools while their brains and bodies evolved for a novel foraging niche and hunting range, at least 3 million years ago. This ‘cradle of mankind’ was centred around the Rift Valley in East Africa, where the variable climate and savannah conditions, with reductions in forests and arboreal living for apes, may have required clever and novel foraging in an area where overall prey availability but also predator dangers were high [44–50] (Figure 2). Tools helped hunting and butchery and reduced time and effort spent chewing, as did cooking later [51]. Another crucial step may have been the evolution of a cooperative social unit with divisions of labour, big enough to insure against the risks involved in hunting large game and the right size to succeed as an ambush hunter – with the requisite prosocial and altruistic skills to also share the spoils across sexes and ages [52]. The ambitious transition from prey to predator, hunting the then extensive radiation of megaherbivores so big that they are normally considered immune to carnivores, needed advanced individual and social cognition, as humans do not have the usual physical attributes of a top predator [53–59]. Adult human requirements to run such big brains are impressive enough, but during development they are extraordinarily high, with 80% to 90% of basal metabolic rate necessary in neonates – this is probably not possible after weaning without the use of animal-derived foods [51,60,61].

endurance-running

Long-distance endurance running may have played a role in persistence hunting

Endurance running and the evolution of Homo


Abstract

Striding bipedalism is a key derived behaviour of hominids that possibly originated soon after the divergence of the chimpanzee and human lineages. Although bipedal gaits include walking and running, running is generally considered to have played no major role in human evolution because humans, like apes, are poor sprinters compared to most quadrupeds. Here we assess how well humans perform at sustained long-distance running, and review the physiological and anatomical bases of endurance running capabilities in humans and other mammals. Judged by several criteria, humans perform remarkably well at endurance running, thanks to a diverse array of features, many of which leave traces in the skeleton. The fossil evidence of these features suggests that endurance running is a derived capability of the genus Homo, originating about 2 million years ago, and may have been instrumental in the evolution of the human body form.


Rethinking the evolution of the human foot: insights from experimental research

Nicholas B. Holowka, Daniel E. Lieberman

Journal of Experimental Biology 2018 221: jeb174425 doi: 10.1242/jeb.174425 Published 6 September 2018

Abstract

Adaptive explanations for modern human foot anatomy have long fascinated evolutionary biologists because of the dramatic differences between our feet and those of our closest living relatives, the great apes. Morphological features, including hallucal opposability, toe length and the longitudinal arch, have traditionally been used to dichotomize human and great ape feet as being adapted for bipedal walking and arboreal locomotion, respectively. However, recent biomechanical models of human foot function and experimental investigations of great ape locomotion have undermined this simple dichotomy. Here, we review this research, focusing on the biomechanics of foot strike, push-off and elastic energy storage in the foot, and show that humans and great apes share some underappreciated, surprising similarities in foot function, such as use of plantigrady and ability to stiffen the midfoot. We also show that several unique features of the human foot, including a spring-like longitudinal arch and short toes, are likely adaptations to long distance running. We use this framework to interpret the fossil record and argue that the human foot passed through three evolutionary stages: first, a great ape-like foot adapted for arboreal locomotion but with some adaptations for bipedal walking; second, a foot adapted for effective bipedal walking but retaining some arboreal grasping adaptations; and third, a human-like foot adapted for enhanced economy during long-distance walking and running that had lost its prehensility. Based on this scenario, we suggest that selection for bipedal running played a major role in the loss of arboreal adaptations.

eye-gazing-hunting

Eyes - Humans can communicate with gazes, which is useful when hunting.

Unique morphology of the human eye and its adaptive meaning: comparative studies on external morphology of the primate eye


Abstract

In order to clarify the morphological uniqueness of the human eye and to obtain cues to understanding its adaptive significance, we compared the external morphology of the primate eye by measuring nearly half of all extant primate species. The results clearly showed exceptional features of the human eye: (1) the exposed white sclera is void of any pigmentation, (2) humans possess the largest ratio of exposed sclera in the eye outline, and (3) the eye outline is extraordinarily elongated in the horizontal direction. The close correlation of the parameters reflecting (2) and (3) with habitat type or body size of the species examined suggested that these two features are adaptations for extending the visual field by eyeball movement, especially in the horizontal direction. Comparison of eye coloration and facial coloration around the eye suggested that the dark coloration of exposed sclera of nonhuman primates is an adaptation to camouflage the gaze direction against other individuals and/or predators, and that the white sclera of the human eye is an adaptation to enhance the gaze signal. The uniqueness of human eye morphology among primates illustrates the remarkable difference between human and other primates in the ability to communicate using gaze signals.

man-the-fat-hunter

Man the Fat Hunter

Man the Fat Hunter: The Demise of Homo erectus and the Emergence of a New Hominin Lineage in the Middle Pleistocene (ca. 400 kyr) Levant

It is likely that humans preferred large herbivores given the abundance of their biomass, the relative ease of hunting them, net caloric returns, and their higher fat content, which accommodates physiological limits on protein consumption.



Abstract

The worldwide association of H. erectus with elephants is well documented and so is the preference of humans for fat as a source of energy. We show that rather than a matter of preference, H. erectus in the Levant was dependent on both elephants and fat for his survival. The disappearance of elephants from the Levant some 400 kyr ago coincides with the appearance of a new and innovative local cultural complex – the Levantine Acheulo-Yabrudian and, as is evident from teeth recently found in the Acheulo-Yabrudian 400-200 kyr site of Qesem Cave, the replacement of H. erectus by a new hominin. We employ a bio-energetic model to present a hypothesis that the disappearance of the elephants, which created a need to hunt an increased number of smaller and faster animals while maintaining an adequate fat content in the diet, was the evolutionary drive behind the emergence of the lighter, more agile, and cognitively capable hominins. Qesem Cave thus provides a rare opportunity to study the mechanisms that underlie the emergence of our post-erectus ancestors, the fat hunters.

large-prey-hypocarnivory

Hunting large prey, as humans did, is exclusively associated with hypercarnivory

The impact of large terrestrial carnivores on Pleistocene ecosystems


Significance

At very high densities, populations of the largest herbivores, such as elephants, have devastating effects on the environment. What prevented widespread habitat destruction in the Pleistocene, when the ecosystem sustained many species of huge herbivores? We use data on predator–prey body mass relationships to predict the prey size ranges of large extinct mammalian carnivores, which were more diverse and much larger than living species. We then compare these prey size ranges with estimates of young mammoth sizes and show that juvenile mammoths and mastodons were within predicted prey size ranges of many of the Pleistocene carnivores. From this and other fossil evidence we argue that, by limiting population sizes of megaherbivores, large carnivores had a major impact on Pleistocene ecosystems.

caries-hypocarnivory

Caries related to hypocarnivory

Caries Through Time: An Anthropological Overview

Although tooth plaque isn’t suitable for determining the human trophic level (HTL) within the vast zooarchaeological landscape [53, 54], it may be marginally accurate for identifying shifts. For instance, Neanderthals were known to rely heavily on animal-sourced foods, yet showed only six caries (signs of decay) among the 1,250 of their teeth that were examined [55]. Caries started appearing in substantial numbers between 13,700 and 15,000 years ago in Morocco, alongside evidence of increased starch consumption [56]. The low occurrence of caries during most of the Pleistocene corresponds to a low-carbohydrate, high-HTL pattern.


Hard plant tissues do not contribute meaningfully to dental microwear: evolutionary implications

Abstract
Reconstructing diet is critical to understanding hominin adaptations. Isotopic and functional morphological analyses of early hominins are compatible with consumption of hard foods, such as mechanically-protected seeds, but dental microwear analyses are not. The protective shells surrounding seeds are thought to induce complex enamel surface textures characterized by heavy pitting, but these are absent on the teeth of most early hominins. Here we report nanowear experiments showing that the hardest woody shells – the hardest tissues made by dicotyledonous plants – cause very minor damage to enamel but are themselves heavily abraded (worn) in the process. Thus, hard plant tissues do not regularly create pits on enamel surfaces despite high forces clearly being associated with their oral processing. We conclude that hard plant tissues barely influence microwear textures and the exploitation of seeds from graminoid plants such as grasses and sedges could have formed a critical element in the dietary ecology of hominins.


Non-occlusal dental microwear variability in a sample of Middle and Late Pleistocene human populations from Europe and the Near East

Abstract

Non-occlusal, buccal tooth microwear variability has been studied in 68 fossil humans from Europe and the Near East. The microwear patterns observed suggest that a major shift in human dietary habits and food processing techniques might have taken place in the transition from the Middle to the Late Pleistocene populations. Differences in microwear density, average length, and orientation of striations indicate that Middle Pleistocene humans had more abrasive dietary habits than Late Pleistocene populations. Both dietary and cultural factors might be responsible for the differences observed. In addition, the Middle Paleolithic Neanderthal specimens studied show a highly heterogeneous pattern of microwear when compared to the other samples considered, which is inconsistent with a hypothesis of all Neanderthals having a strictly carnivorous diet. The high density of striations observed in the buccal surfaces of several Neanderthal teeth might be indicative of the inclusion of plant foods in their diet. The buccal microwear variability observed in the Neanderthals is compatible with an overall exploitation of both plant and meat foods on the basis of food availability. A preliminary analysis of the relationship between buccal microwear density and climatic conditions prevailing in Europe during the Late Pleistocene has been attempted. Cold climatic conditions, as indicated by oxygen isotope stage data, seem to be responsible for higher densities of microwear features, whereas warmer periods could correspond to a reduced pattern of scratch density. Such a relationship would be indicative of less abrasive dietary habits, perhaps more meat dependent, during warmer periods.


Caries Through Time: An Anthropological Overview


Bioanthropological research carried out in the last few decades has given special emphasis to the study of the relation between disease and social and environmental phenomena, reinforcing the already strong connection between lifestyle and health conditions throughout the history of humankind (Cohen & Armelagos, 1984; Katzenberg & Saunders, 2008; Larsen, 1997). Because infectious diseases result from the interaction between host and agent, modulated by ecological and cultural environments, the comparative study of the historic prevalence of diseases in past populations worldwide can provide important data about their related factors and etiology. The study of dental diseases (such as caries) has been given special attention in paleopathology. The tooth, by virtue of its physical features, tends to resist destruction and taphonomic conditions better than any other body tissue and is therefore a valuable element for the study of an individual’s diet, and of the social and cultural factors related to it, from a population perspective. Caries is one of the infectious diseases most easily observable in human remains retrieved from archaeological excavations. Because of their long development time and non-lethal nature, the lesions present at the time of death remain recognizable indefinitely, allowing us to infer, along with other archaeological and ecological data, the types of food that a specific population consumed, the cooking technology they used, the relative frequency of consumption, and the way food was shared among the group (Hillson, 2001, 2008; Larsen, 1997; Rodríguez, 2003).


Earliest evidence for caries and exploitation of starchy plant foods in Pleistocene hunter-gatherers from Morocco

Significance

We present early evidence linking a high prevalence of caries to a reliance on highly cariogenic wild plant foods in Pleistocene hunter-gatherers from North Africa. This evidence predates other high caries populations and the first signs of food production by several thousand years. We infer that increased reliance on wild plants rich in fermentable carbohydrates caused an early shift toward a disease-associated oral microbiota. Systematic harvesting and processing of wild food resources supported a more sedentary lifestyle during the Iberomaurusian than previously recognized. This research challenges commonly held assumptions that high rates of caries are indicative of agricultural societies.

tooth-enamel

Using a modified trace-element method on tooth samples, early Homo is indistinguishable from carnivores.

Evidence for dietary change but not landscape use in South African early hominins

Using a modified trace elements method on tooth samples, early Homo was found to be indistinguishable from carnivores [52].




Abstract

The dichotomy between early Homo and Paranthropus is justified partly on morphology. In terms of diet, it has been suggested that early Homo was a generalist but that Paranthropus was a specialist. However, this model is challenged and the issue of the resources used by Australopithecus, the presumed common ancestor, is still unclear. Laser ablation profiles of strontium/calcium, barium/calcium and strontium isotope ratios in tooth enamel are a means to decipher intra-individual diet and habitat changes. Here we show that the home range area was of similar size for species of the three hominin genera but that the dietary breadth was much higher in Australopithecus africanus than in Paranthropus robustus and early Homo. We also confirm that P. robustus relied more on plant-based foodstuffs than early Homo. A South African scenario is emerging in which the broad ecological niche of Australopithecus became split, and was then occupied by Paranthropus and early Homo, both consuming a lower diversity of foods than Australopithecus.

top-level-carnivores

Stable isotope studies suggest European hunter-gatherers mostly ate a carnivorous diet during the Upper Paleolithic, placing them at or above the trophic level of wolves [51].

Isotopic evidence for the diets of European Neanderthals and early modern humans




Abstract

We report here on the direct isotopic evidence for Neanderthal and early modern human diets in Europe. Isotopic methods indicate the sources of dietary protein over many years of life, and show that Neanderthals had a similar diet through time (≈120,000 to ≈37,000 cal BP) and in different regions of Europe. The isotopic evidence indicates that in all cases Neanderthals were top-level carnivores and obtained all, or most, of their dietary protein from large herbivores. In contrast, early modern humans (≈40,000 to ≈27,000 cal BP) exhibited a wider range of isotopic values, and a number of individuals had evidence for the consumption of aquatic (marine and freshwater) resources. This pattern includes Oase 1, the oldest directly dated modern human in Europe (≈40,000 cal BP) with the highest nitrogen isotope value of all of the humans studied, likely because of freshwater fish consumption. As Oase 1 was close in time to the last Neanderthals, these data may indicate a significant dietary shift associated with the changing population dynamics of modern human emergence in Europe.

zooarchaeological-trends

Anthropologists have argued that carnivory was crucial for surviving the extremely cold Eurasian winters

Carnivory, Coevolution, and the Geographic Spread of the Genus Homo


Abstract

This review traces the colonization of Eurasia by hominids some 1,700,000 years ago and their subsequent evolution there to 10,000 years ago from a carnivorous perspective. Three zooarchaeological trends reflect important shifts in hominid adaptations over this great time span: (1) increasing predation on large, hoofed animals that culminated in prime-adult–biased hunting, a predator–prey relationship that distinguishes humans from all other large predators and is a product of coevolution with them; (2) greater diet breadth and range of foraging substrates exploited in response to increasing human population densities, as revealed by small-game use; and (3) increased efficiency in food capture, processing, and energy retention through technology, and the eventual expansion of technology into social (symbolic) realms of behavior. Niche boundary shifts, examined here in eight dimensions, tend to cluster at 500 thousand years ago (KYA), at 250 KYA, and several in rapid succession between 50 and 10 KYA. Most of these shifts appear to be consequences of competitive interaction, because high-quality, protein-rich resources were involved. Many of the boundary shifts precede major radiations in the equipment devoted to animal exploitation. With a decline in trophic level after 45 KYA, demographic increase irreversibly altered the conditions of natural selection on human societies, from a largely interspecific competitive forum to one increasingly defined by intraspecific pressures. Regionalization of Upper Paleolithic artifact styles is among the many symptoms of this process.

dysevolution

Dysevolution may happen in hypocarnivorous scenarios

Dietary mismatches might cause deleterious feedback loops.

Harvard biologist Daniel Lieberman defines dysevolution as “the deleterious feedback loop that occurs over multiple generations when we don’t treat the causes of a mismatch disease but instead pass on whatever environmental factors cause the disease, keeping the disease prevalent and sometimes making it worse.” A “mismatch disease” begins “when we get sick or injured from an evolutionary mismatch that results from being inadequately adapted to a change in the body’s environment.” You can read more about dysevolution in Lieberman’s book The Story of the Human Body: Evolution, Health, and Disease (New York: Pantheon, 2013); the quote is from p. 176. See also Jeff Wheelwright, “From Diabetes to Athlete’s Foot, Our Bodies Are Maladapted for Modern Life,” Discover, Apr. 2, 2015, http://discovermagazine.com/2015/may/16-days-of-dysevolution
