
Facultative Carnivore Reasons

throwing-performance

Humans may have evolved the ability to throw with accuracy in order to hunt.

Clavicle length, throwing performance and the reconstruction of the Homo erectus shoulder

Powerful, accurate throwing may have been an important mode of early hunting and defense. Previous work has shown that throwing performance is functionally linked to several anatomical shifts in the upper body that occurred during human evolution. The final shift to occur is the inferior reorientation of the shoulder. Fossil scapulae show the earliest evidence of a more inferior glenoid in Homo erectus. However, where the scapula rests on the thorax is uncertain. The relative length of the clavicle, the only skeletal attachment of the scapula to the torso, is quite variable. Depending on which fossils or skeletal measures are used to reconstruct the H. erectus shoulder, either a novel, anteriorly facing shoulder configuration or a modern human-like lateral orientation is possible. These competing hypotheses have led to very different conclusions regarding the throwing ability and hunting behavior of early Homo. Here, we evaluate competing models of H. erectus shoulder morphology and examine how these models relate to throwing performance. To address these questions, we collected skeletal measures from fossil and extant taxa, as well as anthropometric (N = 36) and kinematic (N = 27) data from Daasanach throwers from northwestern Kenya. Our data show that all H. erectus fossil clavicles fall within the normal range of modern human variation. We find that a commonly used metric for normalizing clavicle length, the claviculohumeral ratio, poorly predicts shoulder position on the torso. Furthermore, no significant relationship between clavicle length and any measure of throwing performance was found. These data support reconstructing the H. erectus shoulder as modern human-like, with a laterally facing glenoid, and suggest that the capacity for high speed throwing dates back nearly two million years.


Throwing, the Shoulder, and Human Evolution

John E. Kuhn

Abstract

Throwing with accuracy and speed is a skill unique to humans. Throwing has many advantages and the ability to throw has likely been promoted through natural selection in the evolution of humans. There are many unsolved questions regarding the anatomy of the human shoulder. The purpose of this article is to review many of these mysteries and propose that the answer to these questions can be understood if one views the shoulder as a joint that has evolved to throw.

weaning-carnivory

Early weaning age highlights emergence of carnivory in human evolution

Impact of Carnivory on Human Development and Evolution Revealed by a New Unifying Model of Weaning in Mammals

Humans have an early weaning age (less than three years old) relative to other species, which is strongly associated with carnivory level. Some authors “highlight the emergence of carnivory as a process fundamentally determining human evolution” [44].


Impact of Carnivory on Human Development and Evolution Revealed by a New Unifying Model of Weaning in Mammals

Abstract

Our large brain, long life span and high fertility are key elements of human evolutionary success and are often thought to have evolved in interplay with tool use, carnivory and hunting. However, the specific impact of carnivory on human evolution, life history and development remains controversial. Here we show in quantitative terms that dietary profile is a key factor influencing time to weaning across a wide taxonomic range of mammals, including humans. In a model encompassing a total of 67 species and genera from 12 mammalian orders, adult brain mass and two dichotomous variables reflecting species differences regarding limb biomechanics and dietary profile, accounted for 75.5%, 10.3% and 3.4% of variance in time to weaning, respectively, together capturing 89.2% of total variance. Crucially, carnivory predicted the time point of early weaning in humans with remarkable precision, yielding a prediction error of less than 5% with a sample of forty-six human natural fertility societies as reference. Hence, carnivory appears to provide both a necessary and sufficient explanation as to why humans wean so much earlier than the great apes. While early weaning is regarded as essentially differentiating the genus Homo from the great apes, its timing seems to be determined by the same limited set of factors in humans as in mammals in general, despite some 90 million years of evolution. Our analysis emphasizes the high degree of similarity of relative time scales in mammalian development and life history across 67 genera from 12 mammalian orders and shows that the impact of carnivory on time to weaning in humans is quantifiable, and critical. Since early weaning yields shorter interbirth intervals and higher rates of reproduction, with profound effects on population dynamics, our findings highlight the emergence of carnivory as a process fundamentally determining human evolution.


From the Ape’s Dilemma to the Weanling’s Dilemma: Early Weaning and its Evolutionary Context – G.E. Kennedy - 2004

Abstract

Although humans have a longer period of infant dependency than other hominoids, human infants, in natural fertility societies, are weaned far earlier than any of the great apes: chimps and orangutans wean, on average, at about 5 and 7.7 years, respectively, while humans wean, on average, at about 2.5 years. Assuming that living great apes demonstrate the ancestral weaning pattern, modern humans display a derived pattern that requires explanation, particularly since earlier weaning may result in significant hazards for a child. Clearly, if selection had favored the survival of the child, humans would wean later like other hominoids; selection, then, favored some trait other than the child’s survival. It is argued here that our unique pattern of prolonged, early brain growth—the neurological basis for human intellectual ability—cannot be sustained much beyond one year by a human mother’s milk alone, and thus early weaning, when accompanied by supplementation with more nutritious adult foods, is vital to the ontogeny of our larger brain, despite the associated dangers. Therefore, the child’s intellectual development, rather than its survival, is the primary focus of selection. Consumption of more nutritious foods—derived from animal protein—increased by ca. 2.6 myr ago when a group of early hominins displayed two important behavioral shifts relative to ancestral forms: the recognition that a carcass represented a new and valuable food source—potentially larger than the usual hunted prey—and the use of stone tools to improve access to that food source. The shift in the hominin “prey image” to the carcass and the use of tools for butchery increased the amount of protein and calories available, irrespective of the local landscape. However, this shift brought hominins into competition with carnivores, increasing mortality among young adults and necessitating a number of social responses, such as alloparenting. The increased acquisition of meat ca. 2.6 Ma had significant effects on the later course of human evolution and may have initiated the origin of the genus Homo.

high-fat-preferred

Hunters prefer high-fat prime adults.

Prey mortality profiles indicate that Early Pleistocene Homo at Olduvai was an ambush predator

Hunter-gatherer populations are known to abandon prey that is too lean [39]. Instead, they target animals that appear fatter [40], especially large adults [41].


Prey mortality profiles indicate that Early Pleistocene Homo at Olduvai was an ambush predator

Abstract

The prime-adult-dominated mortality profile of large bovids in the 1.8 Ma FLK Zinj assemblage, Olduvai Gorge, Tanzania, was recently attributed to ambush hunting by early Homo (Bunn, H.T., Pickering, T.R. 2010. Quat. Res. 74, 395–404). We now investigate a logical follow-up question: is enough known about the causes and pervasiveness of prime-adult-dominated mortality profiles (defined as >70% prime adults) from modern ecosystems and from archaeological sites to warrant their attribution solely to hominin hunting? Besides hominin hunting, three methods of scavenging could have provided the large bovids butchered at FLK Zinj: first-access scavenging from non-predator-related accidents; late-access passive scavenging from lion (or other) kills; early-access aggressive scavenging from lion (or other) kills.

We present new data on hunted prey from Hadza bow hunting (e.g., N = 50 impala; N = 18 greater kudu) near Lake Eyasi, Tanzania, and from San bow hunting (N = 13 gemsbok) in the Kalahari Desert, Botswana, documenting non-selective, living-structure profiles. We present new data on drowned wildebeest (N = 175) from Lake Masek, in the Serengeti, documenting many prime adults but also a significantly high percentage of old adults, unlike the profile at FLK Zinj. We also examine mortality profiles from modern African lions and from Old World Pleistocene archaeological sites, revealing that while prime-dominated profiles are present in some archaeological assemblages, particularly some Late Pleistocene European sites involving cervids, they are not documented from lion or other larger carnivore predation; moreover, living-structure profiles with prime adults representing ∼50–60% of prey are common, particularly in African archaeological assemblages involving bovids hunted by humans. Although taphonomic bias, prey socioecology, and season of death may all influence mortality profiles, prime-dominated profiles require careful evaluation. The prime-dominated profile at FLK Zinj is significantly different from profiles formed by the three scavenging methods, which likely indicates hunting by Early Pleistocene Homo.

APOE4-gene

APOE4 gene

APOE4 gene may be evidence of strong carnivorous pressure on certain populations.

The APOE4 gene is involved in the metabolism of fat and cholesterol in addition to LDL receptor interactions. It underwent strong selective pressure as evidenced by 15 percent of the world’s population having it today and up to 40-50 percent in certain populations [33]. When eating a highly carnivorous diet—which could have a higher pathogen burden—APOE4 may help us retain cognitive functions [34].

amy1-gene

Recent genetic adaptations to starch consumption

Human AMY1 Gene

The genetic evidence pointing to adaptation to tuber consumption is rather recent on our evolutionary timescale [29], suggesting tubers weren't a big part of our diet before then [30]. The human AMY1 gene is expressed in many of our organs for the breakdown of starch and glycogen. Several studies have hypothesized that people with a low number of AMY1 copies would suffer from higher rates of obesity and diabetes on a high-starch diet. However, the evidence doesn't support that [31, 32].


Human adaptations to diet, subsistence, and ecoregion are due to subtle shifts in allele frequency

Abstract

Human populations use a variety of subsistence strategies to exploit an exceptionally broad range of ecoregions and dietary components. These aspects of human environments have changed dramatically during human evolution, giving rise to new selective pressures. To understand the genetic basis of human adaptations, we combine population genetics data with ecological information to detect variants that increased in frequency in response to new selective pressures. Our approach detects SNPs that show concordant differences in allele frequencies across populations with respect to specific aspects of the environment. Genic and especially nonsynonymous SNPs are overrepresented among those most strongly correlated with environmental variables. This provides genome-wide evidence for selection due to changes in ecoregion, diet, and subsistence. We find particularly strong signals associated with polar ecoregions, with foraging, and with a diet rich in roots and tubers. Interestingly, several of the strongest signals overlap with those implicated in energy metabolism phenotypes from genome-wide association studies, including SNPs influencing glucose levels and susceptibility to type 2 diabetes. Furthermore, several pathways, including those of starch and sucrose metabolism, are enriched for strong signals of adaptations to a diet rich in roots and tubers, whereas signals associated with polar ecoregions are overrepresented in genes associated with energy metabolism pathways.

adipose-morphology

We also fit the typical carnivore pattern of adipocyte morphology, which means we have smaller and more numerous fat cells

Body mass and natural diet as determinants of the number and volume of adipocytes in eutherian mammals

We also fit the typical carnivore pattern of adipocyte morphology, which means we have smaller and more numerous fat cells [25].

Body mass and natural diet as determinants of the number and volume of adipocytes in eutherian mammals

Abstract

Total dissection of a randomly collected sample of 202 adult and subadult eutherian mammals, combined with site-specific adipocyte volume determination, shows that the number of adipocytes in the body is proportional to (Body Mass)^0.74 for predominantly carnivorous species and to (Body Mass)^0.78 for mainly herbivorous, nonruminant mammals. Adipocyte expansion or shrinkage, not proliferation or depletion of adipocyte number, is the principal mechanism of adipose tissue enlargement and reduction. Therefore, the adipocytes of large mammals are larger than those of smaller specimens of similar dietary habits and fatness. We suggest that the presence of more numerous, smaller adipocytes in smaller mammals is related to their higher mass-specific metabolic rate. The adipose tissue of mammals with a predominantly carnivorous diet contains 4.6 times as many adipocytes as that of herbivorous nonruminants of similar body mass; but nonruminant herbivores are not necessarily fatter because the adipocytes of carnivorous mammals are proportionately smaller than those of nonruminant herbivores. We suggest that a carbohydrate-based energy metabolism is associated with fewer, relatively larger adipocytes and that when lipids and proteins form the major dietary energy source, adipose tissue consists of a greater number of smaller adipocytes.
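The allometric relationships in this abstract can be sketched numerically. A minimal illustration, with the caveat that the normalization coefficients and the 10 kg reference mass below are arbitrary assumptions, chosen only so that the reported 4.6-fold ratio holds at the reference mass (the paper reports only the exponents and the ratio, not the coefficients):

```python
# Adipocyte number scales as (body mass)^0.74 in carnivores and
# (body mass)^0.78 in nonruminant herbivores, with carnivores having
# ~4.6x as many adipocytes at similar body mass (per the abstract above).

CARN_EXP, HERB_EXP = 0.74, 0.78
REF_MASS_KG = 10.0  # arbitrary reference mass (assumption)
HERB_COEF = 1.0     # arbitrary herbivore coefficient (assumption)
# Choose the carnivore coefficient so the 4.6x ratio holds at REF_MASS_KG:
CARN_COEF = 4.6 * HERB_COEF * REF_MASS_KG ** (HERB_EXP - CARN_EXP)

def adipocyte_number(mass_kg: float, coef: float, exponent: float) -> float:
    """Relative adipocyte count under N = coef * M^exponent."""
    return coef * mass_kg ** exponent

carn = adipocyte_number(REF_MASS_KG, CARN_COEF, CARN_EXP)
herb = adipocyte_number(REF_MASS_KG, HERB_COEF, HERB_EXP)
print(f"carnivore/herbivore adipocyte ratio at {REF_MASS_KG} kg: {carn / herb:.1f}")
```

Note that because the two exponents differ, the 4.6-fold ratio is exact only at the chosen reference mass; it drifts slowly with body size. At equal total fat mass, the same arithmetic implies carnivore adipocytes are correspondingly smaller, matching the abstract's "more numerous, smaller" pattern.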

priority-storage

The fat we eat gets priority storage in our subcutaneous fat tissue, ahead of the other macronutrients

Adipose tissue as a buffer for daily lipid flux

The fat we eat gets priority storage in our subcutaneous fat tissue, ahead of the other macronutrients [24].

Adipose tissue as a buffer for daily lipid flux

Abstract

Insulin resistance occurs in obesity and Type II (non-insulin-dependent) diabetes mellitus, but it is also a prominent feature of lipodystrophy. Adipose tissue could play a crucial part in buffering the flux of fatty acids in the circulation in the postprandial period, analogous to the roles of the liver and skeletal muscle in buffering postprandial glucose fluxes. Adipose tissue provides its buffering action by suppressing the release of non-esterified fatty acids into the circulation and by increasing triacylglycerol clearance. In particular, the pathway of 'fatty acid trapping' (adipocyte uptake of fatty acids liberated from plasma triacylglycerol by lipoprotein lipase) could play a key part in the buffering process. If this buffering action is impaired, then extra-adipose tissues are exposed to excessive fluxes of lipid fuels and could accumulate these in the form of triacylglycerol, leading to insulin resistance. These tissues will include liver, skeletal muscle and the pancreatic beta cell, where the long term effect is to impair insulin secretion. Adipose tissue buffering of lipid fluxes is impaired in obesity through defects in the ability of adipose tissue to respond rapidly to the dynamic situation that occurs after meals. It is also impaired in lipodystrophy because there is not sufficient adipose tissue to provide the necessary buffering capacity. Thus, the phenotype, at least with regard to insulin resistance, is similar with both excess and deficiency of adipose tissue. Furthermore, this concept could provide a framework for understanding the action of the thiazolidinedione insulin-sensitizing agents.


long-limbs

Percentage of muscle distribution to upper and lower limbs

Humans devote more muscle to the lower limbs, showing less reliance on arboreal locomotion.

Percentage of muscle distribution to upper and lower limbs in Pongo pygmaeus, Gorilla gorilla, P. paniscus, and H. sapiens.


The regional distribution of muscle in H. sapiens contrasts significantly with the regional distribution of muscle in P. paniscus and other apes. Of total muscle mass in H. sapiens, more than half acts on the lower limbs and only a fifth acts on the upper limbs (36, 37), whereas a third of P. paniscus muscle acts on the upper limbs (Fig. 1).

fatty-humans

Humans are much fatter than other apes

Ability to store more fat shows humans depend on fat based metabolism typical of carnivores.

Male and female hunter-gatherers have average body fat levels of 9 percent and 24 percent, respectively, which is quite lean by modern standards [22]. But compared with bonobos, whose males and females have less than 1 percent and less than 4 percent body fat, respectively, humans are relatively fat [23].



Hunter‐gatherers as models in public health

Summary

Hunter‐gatherer populations are remarkable for their excellent metabolic and cardiovascular health and thus are often used as models in public health, in an effort to understand the root, evolutionary causes of non‐communicable diseases. Here, we review recent work on health, activity, energetics and diet among hunter‐gatherers and other small‐scale societies (e.g. subsistence farmers, horticulturalists and pastoralists), as well as recent fossil and archaeological discoveries, to provide a more comprehensive perspective on lifestyle and health in these populations. We supplement these analyses with new data from the Hadza, a hunter‐gatherer population in northern Tanzania. Longevity among small‐scale populations approaches that of industrialized populations, and metabolic and cardiovascular disease are rare. Obesity prevalence is very low (<5%), and mean body fat percentage is modest (women: 24–28%, men: 9–18%). Activity levels are high, exceeding 100 min per day of moderate and vigorous physical activity, but daily energy expenditures are similar to industrialized populations. Diets in hunter‐gatherer and other small‐scale societies tend to be less energy dense and richer in fibre and micronutrients than modern diets but are not invariably low carbohydrate as sometimes argued. A more integrative understanding of hunter‐gatherer health and lifestyle, including elements beyond diet and activity, will improve public health efforts in industrialized populations.


Body composition in Pan paniscus compared with Homo sapiens has implications for changes during human evolution

SIGNIFICANCE

During human evolution, the body changed in shape, partially to accommodate bipedal locomotion. Concurrently, brain size underwent a three-fold increase recorded in evidence from fossils and from comparative anatomy of chimpanzees, Homo sapiens’ closest living relatives. Because soft tissues like muscle, skin, and fat do not fossilize, and little information is available on these components for the genus Pan, reconstructing tissue changes has primarily relied on what is known about humans. This study presents unique quantitative data on major body components of muscle, bone, skin, and fat of 13 bonobos (Pan paniscus) for interpreting evolutionary forces that have shaped the human form for survival in a savanna mosaic environment.

Keywords: body composition, bonobo, Pan paniscus, human evolution, Homo sapiens


Implications for Human Evolution.

Body fat.

The negligible measurable fat in all seven P. paniscus males was unexpected, overriding captivity, age, and body mass. Among wild chimpanzees, there is little indication of an ability to mobilize fat stores during times of caloric restriction, a key adaptive feature found in orangutans and possibly to a lesser degree in gorillas (24, 52, 53). Without selection pressure for storage fat, and with over half of body mass in muscle, the male P. paniscus does not easily accumulate body fat, even under optimal circumstances of captivity. Remarkably, none of the males and females manifested detrimental health as a consequence of having little fat, in stark contrast to H. sapiens.

There is evidence in female P. paniscus that fluctuation in body fat is associated with reproductive history. Individual paniscus female 1 (PF1) with the most body fat was lactating at the time of death and had considerable fat in her breasts, trunk, and limbs. Individual paniscus female 4 (PF4) died within 1 wk after giving birth to a full-term offspring, her first, and had notable fat deposits in the trunk and limbs. Individual paniscus female 3 (PF3) had no offspring or pregnancies during her life, had remarkably low levels of dissectible fat, and stood out as having the highest percentage of muscle mass among the females (44.1%, Table 1). Body fat is also sexually dimorphic in some monkeys, as well as in gorillas and orangutans (23–25), and has a demonstrated role in reproduction (54–56). The same is true for H. sapiens (19, 43–48).

In the course of human evolution from early australopithecines onward, the ability to store and mobilize body fat must have played an increasing role in successful reproduction (43, 44, 57, 58), coupled with the shift to bipedal locomotion (59). We posit that early australopithecine females, such as P. paniscus, put on more body fat than the males and had the ability to vary their adiposity with reproductive cycles. We conclude that body fat was sexually dimorphic in australopithecines, as it is for P. paniscus, variable in females but consistently low in males: as high as 8–10% in females and 2–3% in males. Although Wells (19) suggests that both female and male australopithecines had 11–13% body fat, our data suggest, to the contrary, it is unlikely that male australopithecines approached the level of body fat found in male Hadza hunter-gatherers.

It may be that in early Homo and H. erectus, with the increase in brain size (60) and body mass (61), fat began to play a more significant role in female reproduction. As the early Homo species expanded out of Africa, the ability to put on, store, and mobilize body fat provided a clear adaptive advantage for both females and males as a backup against extremes of food availability in a variety of environments, as well as mitigating adverse effects on pregnancy and lactation in females. Although body fat for males may not be as critical for reproduction as it is for females, we suspect that male H. erectus would have needed an estimated 7–8% body fat as a buffer against “seasonal hunger” (43, 45, 46). In order for females of ancestral species of Homo to nourish larger brained infants without the extended developmental period characteristic for H. sapiens (62), they probably needed additional body fat, in the range of 12–14%.

Increased body fat during human evolution reflected more than one selective pressure. Females experienced increased nutritional demands for successful reproduction and for maintenance of a high level of activity, walking several miles a day collecting and carrying food and dependent infants (cf. 44). These needs were compounded by pronounced seasonal variation in food sources characteristic of the tropical savanna mosaic (63) and the unpredictability of new environments as early Homo species expanded their home ranges regionally and geographically.


Our larger fat reserves are original to us, in the sense that they're not a trait held by our group's last common ancestor [26]. Our other energy store, carbohydrate, amounts to only 400-500 grams of glycogen, roughly ten times less than our fat stores by mass.
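The asymmetry between the two stores is even starker in energy terms. A rough arithmetic sketch, using the 400-500 g glycogen figure and the 9 percent lean male body fat level cited above; the 70 kg body mass and the standard Atwater energy densities are illustrative assumptions, not from the source:

```python
# Back-of-envelope comparison of the two human energy stores.

KCAL_PER_G_CARB = 4   # standard Atwater value for carbohydrate
KCAL_PER_G_FAT = 9    # standard Atwater value for fat

glycogen_g = 450                    # midpoint of the 400-500 g range
body_mass_g = 70_000                # assumed 70 kg adult
fat_fraction = 0.09                 # lean hunter-gatherer male (see above)
fat_g = body_mass_g * fat_fraction  # ~6,300 g, ~14x the glycogen store

print(f"fat/glycogen mass ratio: {fat_g / glycogen_g:.0f}x")
print(f"glycogen energy: {glycogen_g * KCAL_PER_G_CARB:,} kcal")
print(f"fat energy: {fat_g * KCAL_PER_G_FAT:,.0f} kcal")
```

Even at lean hunter-gatherer body fat levels, stored fat carries over thirty times the energy of stored glycogen, consistent with a fat-based metabolic reserve.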

carnivore-connection

Carnivore Connection of Insulin Resistance

The 'carnivore connection'--evolutionary aspects of insulin resistance


Abstract

Insulin resistance is common and is determined by physiological (aging, physical fitness), pathological (obesity) and genetic factors. The metabolic compensatory response to insulin resistance is hyperinsulinaemia, the primary purpose of which is to maintain normal glucose tolerance. The 'carnivore connection' postulates a critical role for the quantity of dietary protein and carbohydrate and the change in the glycaemic index of dietary carbohydrate in the evolution of insulin resistance and hyperinsulinaemia. Insulin resistance offered survival and reproductive advantages during the Ice Ages which dominated human evolution, during which a high-protein low-carbohydrate diet was consumed. Following the end of the last Ice Age and the advent of agriculture, dietary carbohydrate increased. Although this resulted in a sharp increase in the quantity of carbohydrate consumed, these traditional carbohydrate foods had a low glycaemic index and produced only modest increases in plasma insulin. The industrial revolution changed the quality of dietary carbohydrate. The milling of cereals made starch more digestible and postprandial glycaemic and insulin responses increased 2-3 fold compared with coarsely ground flour or whole grains. This combination of insulin resistance and hyperinsulinaemia is a common feature of many modern day diseases. Over the last 50 y the explosion of convenience and takeaway 'fast foods' has exposed most populations to caloric intakes far in excess of daily energy requirements and the resulting obesity has been a major factor in increasing the prevalence of insulin resistance.

In ketosis, adaptive glucose sparing (i.e., physiological insulin resistance) occurs, as is typical of carnivores.

preservation-of-muscle-mass

Ketosis prevents muscle wasting in high-fat carnivores

Very-low-carbohydrate diets and preservation of muscle mass


This commentary provides some basic information on metabolic adaptations that lead to sparing of muscle protein during a VLCARB, and reviews studies examining the effects of VLCARB interventions on body composition.

Metabolic adaptations in VLCARB

It is frequently claimed that a VLCARB sets the stage for a significant loss of muscle mass as the body recruits amino acids from muscle protein to maintain blood glucose via gluconeogenesis. It is true that animals share the metabolic deficiency of the total (or almost total) inability to convert fatty acids to glucose [18]. Thus, the primary source for a substrate for gluconeogenesis is amino acid, with some help from glycerol from fat tissue triglycerides. However, when the rate of mobilization of fatty acids from fat tissue is accelerated, as, for example, during a VLCARB, the liver produces ketone bodies. The liver cannot utilize ketone bodies and thus, they flow from the liver to extra-hepatic tissues (e.g., brain, muscle) for use as a fuel. Simply stated, ketone body metabolism by the brain displaces glucose utilization and thus spares muscle mass. In other words, the brain derives energy from storage fat during a VLCARB.

Glycolytic cells and tissues (e.g., erythrocytes, renal medulla) will still need some glucose, because they do not have aerobic oxidative capacity and thus cannot use ketone bodies. However, glycolysis in these tissues leads to the release of lactate that is returned to the liver and then reconverted into glucose (the Cori cycle). Energy for this process comes from the increased oxidation of fatty acids in the liver. Thus, glycolytic tissues indirectly also run on energy derived from the fat stores.

The hormonal changes associated with a VLCARB include a reduction in the circulating levels of insulin along with increased levels of glucagon. Insulin has many actions, the most well-known of which is stimulation of glucose and amino acid uptake from the blood to various tissues. This is coupled with stimulation of anabolic processes such as protein, glycogen and fat synthesis. Glucagon has opposing effects, causing the release of glucose from glycogen and stimulation of gluconeogenesis and fat mobilization. Thus, the net stimulus would seem to be for increasing muscle protein breakdown. However, a number of studies indicate that a VLCARB results in body composition changes that favour loss of fat mass and preservation in muscle mass.


How is the preservation of muscle mass brought about during a VLCARB?

There are at least four possible mechanisms:

Adrenergic stimulation

The increase in adrenaline may be involved. Low blood sugar is a potent stimulus to adrenaline secretion and it is now clear that skeletal muscle protein mass is also regulated by adrenergic influences. For example, Kadowaki et al. demonstrated that adrenaline directly inhibits proteolysis of skeletal muscle [6].

Ketone bodies

As noted above, the liver produces ketone bodies during a VLCARB and they flow from the liver to extra-hepatic tissues (e.g., brain, muscle) for use as a fuel. In addition, ketone bodies exert a restraining influence on muscle protein breakdown. If the muscle is plentifully supplied with other substrates for oxidation (such as fatty acids and ketone bodies, in this case), then the oxidation of muscle protein-derived amino acids is suppressed. Nair et al. reported that beta-hydroxybutyrate (beta-OHB, a major ketone body) decreases leucine oxidation and promotes protein synthesis in humans [7]. Although blood concentrations of beta-OHB in their subjects during the infusion of beta-OHB were much lower than concentrations observed in humans during fasting, leucine incorporation into skeletal muscle showed a significant increase (5 to 17%).

Growth hormone (GH)

GH has a major role in regulating growth and development. GH is a protein anabolic hormone and it stimulates muscle protein synthesis. As low blood sugar increases GH secretions, one could speculate that a VLCARB increases GH levels. However, Harber et al. reported that GH secretion was unchanged with 7-day VLCARB/high-protein diet [8]. Interestingly, they also observed that skeletal muscle expression of IGF-I mRNA increased about 2-fold. A plausible explanation for the increased expression of IGF-I in muscle is the increased availability of dietary protein.

Dietary protein

A VLCARB is almost always relatively high in protein. There is evidence that high protein intake increases protein synthesis by increasing systemic amino acid availability [21], which is a potent stimulus of muscle protein synthesis [22]. During weight loss, higher protein intake reduces loss of muscle mass and increases loss of body fat [9]. It has been proposed that the branched-chain amino acid leucine interacts with the insulin signaling pathway to stimulate downstream control of protein synthesis, resulting in maintenance of muscle mass during periods of restricted energy intake [10]. A recent study by Harber et al. reported that a VLCARB/high-protein diet increases skeletal muscle protein synthesis despite a dramatic reduction in insulin levels [8].


Conclusion

Although more long-term studies are needed before a firm conclusion can be drawn, it appears, from most literature studied, that a VLCARB is, if anything, protective against muscle protein catabolism during energy restriction, provided that it contains adequate amounts of protein.


How Ketones Spare Protein in Starvation

Abstract

An infusion of β-hydroxybutyrate decreased amino acid oxidation and increased protein synthesis in humans.

We would expect high-fat megafauna hunters to have preserved their muscle mass by eating adequate protein while remaining in constant ketosis.

the-intestines

Intestines - general theory

Summary of evolutionary changes in the human gut

https://www.frontiersin.org/articles/10.3389/fevo.2020.00025/full?utm_source=fweb&utm_medium=nblog&utm_campaign=ba-sci-fevo-early-hominin-microbiome

The Intestines

At some point in the last six million years, in addition to the potential changes in stomach acidity, the guts of our ancestors changed in other ways. The large intestine became shorter relative to the small intestine, while total intestine length also declined relative to body size. This shift and shortening is suggested by comparisons between the guts of chimpanzees, bonobos, and humans, as well as by the relatively smaller rib cage (and hence space available for the intestines) in the genus Homo compared to earlier hominin species (Aiello and Wheeler, 1995). However, it is worth noting that even within humans, intestine length varies among individuals with similar genetic backgrounds. In one study of one hundred individuals, the shortest small intestine observed was half the length of the longest. Similarly, the ratio of small intestine to large intestine varied from 2.6 to 4.5. Given that gut morphology differs within populations of modern humans, it is possible (indeed likely) that variation among modern human populations is even greater (Underhill, 1955). To date, no studies have considered such variation. The mean ratio of small to large intestine length for chimpanzees is 1.0, such that the chimpanzee large intestine is, on average, equal in length to the small intestine (Chivers and Hladik, 1984). But this value undoubtedly varies among chimpanzees as well, so it is not inconceivable that some human populations and some chimpanzee populations actually have far more similar gut morphologies than tends to be assumed.
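The ratio comparison above can be made concrete with a small sketch. The lengths below are hypothetical; only the ratio ranges (human small:large roughly 2.6 to 4.5 per Underhill, 1955; chimpanzee mean about 1.0 per Chivers and Hladik, 1984) come from the text.

```python
# Sketch of the small-to-large-intestine ratio comparison discussed above.
# Lengths (cm) are illustrative, not measured data.

def si_li_ratio(small_cm: float, large_cm: float) -> float:
    """Ratio of small-intestine length to large-intestine length."""
    return small_cm / large_cm

# Hypothetical individuals spanning the human range reported by Underhill (1955)
individuals = {
    "human (low end)":  (390.0, 150.0),   # ratio ~2.6
    "human (high end)": (675.0, 150.0),   # ratio ~4.5
    "chimpanzee mean":  (300.0, 300.0),   # ratio ~1.0 (Chivers and Hladik, 1984)
}

for name, (small, large) in individuals.items():
    print(f"{name}: small:large = {si_li_ratio(small, large):.1f}")
```

Because both species' ratios vary around their means, some individual humans and chimpanzees could plausibly overlap, which is the point the passage makes.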

The shortening in the relative size of the human large intestine, whatever its consistency and magnitude, raises two questions: why the shortening occurred and what its consequences might have been for digestive physiology and the gut microbiome. In general, there seems to be an emerging consensus that the use of tools, especially stick and stone kitchen tools of various sorts, to obtain and process foods made our ancestors less reliant on the fermentation that occurs in the large intestine. Cooking is likely one of the tools that our ancestors had at their disposal. Recent work has shown that cooking plant food reshapes the gut microbial environment (Carmody et al., 2019), suggesting that the use of fire, despite mixed evidence for its impact on starch digestibility (Schnorr et al., 2016), may have made nutrients in some types of food more available and also eased the chewing necessary to break down food (Wrangham, 2009). Fire may also have made it possible to smoke hives, making it easier to harvest large quantities of honey with its easy-to-digest calories (which do not necessarily require gut microbes; Marlowe et al., 2014). In addition, fishing techniques and tools might have made fish and shellfish protein available which, even raw, is very easy to digest. Pounding tools, such as those employed by chimpanzees, would also have made roots and tubers easier to digest (Crittenden, 2016). Similar tools are used by many small-scale societies around the world, including contemporary subsistence foragers (Benito-Calvo et al., 2018), as well as by chimpanzees (and hence likely our LCA; Figure 2). All of this is to say that as our ancestors invented more kitchen implements, they would have been able to pre-digest and pre-process some of their foods, allowing them to rely less on microbes in their guts to break down recalcitrant components of their diets, such as cellulose.
They could get by with smaller guts and invest their bodily energy elsewhere, for example in big brains (an idea called the expensive tissue hypothesis; Aiello and Wheeler, 1995).

The shorter average large intestine length of species of Homo compared to those of their ancestors would have had at least two potential consequences for microbiomes. The shorter large intestine would have sustained a smaller biomass of microbes relative to their body mass (simply because of the reduction in volume). In addition, the retention time of foods in the gut may have been reduced (Ragir et al., 2000). Some features of microbiomes, however, seem likely to have been similar between hominins and our LCA with chimpanzees despite changes in gross intestinal morphology. For example, the taxonomic classes of bacteria found in the guts of both chimpanzees and humans (from urban and rural settings) tend to overlap. What is more, the same families and genera of bacteria tend to occur in similar proportions (Moeller et al., 2012). This overlap is hypothesized to pre-date the human-chimpanzee split (and hence to be characteristic of our LCA). Furthermore, humans in small-scale, non-industrialized populations host a handful of microbial taxa that appear to be genetically equivalent to those in great apes at the level of operational taxonomic units (OTUs) or strains (Amato et al., 2019b). The same humans also share a range of bacterial metabolic pathways with other extant apes, including those involved in vitamin and amino acid synthesis. These results suggest that despite the reduction in length of the human intestines, enough physiological similarities remain between humans and apes that the composition and function of their microbiomes are similar.

Nevertheless, despite these similarities, it is important to point out that the gut microbiomes of modern humans diverge in important ways from those of extant apes. These differences do not, however, appear to relate to gross morphological features of the gut but instead to diet. The gut microbiomes of humans, while similar to those of modern chimpanzees, appear to be even more similar to those of cercopithecine monkeys, such as baboons (genus Papio; Amato et al., 2019b; Figure 3). Differences in gut microbiome composition are greater between humans and apes (PERMANOVA F1,55 = 14.4, r2 = 0.21, p < 0.01) than between humans and cercopithecines (PERMANOVA F1,57 = 10.0, r2 = 0.15, p < 0.01). Differences in gut microbiome functional potential are similar between humans and apes (PERMANOVA F1,35 = 5.4, r2 = 0.16, p < 0.01) and humans and cercopithecines (PERMANOVA F1,35 = 7.4, r2 = 0.18, p < 0.01). While humans are genetically far more similar to chimpanzees than to baboons, baboons are more similar in diet (and habitat use) to ancestral Homo species than are chimpanzees. Baboons eat diets that are highly omnivorous and relatively high in starch content. Since the gut microbiome plays an important role in processing host dietary compounds, particularly resistant carbohydrates (and in some cases specifically fibrous plant foods; see Schnorr et al., 2014), it is likely that the same microbial lineages and metabolic pathways nutritionally benefited both our hominin ancestors and extant cercopithecines. Given that the human shift toward habitats and diets like those of modern baboons is often linked to tool use, cooking, and ultimately, reductions in human intestinal length, it seems reasonable to suggest that this suite of changes altered the human gut microbiome. The result appears to be a “characteristic” human microbiome composed of both “ape” and “cercopithecine” traits.
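The PERMANOVA statistics above operate on a matrix of pairwise community dissimilarities. As a hedged illustration of the kind of comparison involved (not the authors' actual analysis), here is a minimal Bray-Curtis dissimilarity sketch with made-up relative-abundance profiles; the taxa and numbers are hypothetical, chosen only to mirror the qualitative claim that human microbiomes sit closer to baboon than to chimpanzee profiles.

```python
# Bray-Curtis dissimilarity: 0 = identical communities, 1 = no shared taxa.
# PERMANOVA (as in Amato et al., 2019b) tests group separation on such
# a dissimilarity matrix; this sketch only computes the pairwise values.

def bray_curtis(u, v):
    """Bray-Curtis dissimilarity between two abundance profiles."""
    shared = sum(min(a, b) for a, b in zip(u, v))
    return 1.0 - 2.0 * shared / (sum(u) + sum(v))

# Toy relative-abundance profiles over four bacterial taxa (invented)
human  = [0.40, 0.30, 0.20, 0.10]
baboon = [0.35, 0.30, 0.20, 0.15]
chimp  = [0.20, 0.20, 0.40, 0.20]

print("human vs baboon:", bray_curtis(human, baboon))  # smaller = more similar
print("human vs chimp: ", bray_curtis(human, chimp))
```

With these invented profiles the human-baboon dissimilarity comes out smaller than the human-chimpanzee one, mirroring the pattern the passage describes.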

I think this section could be written to favor FC much more strongly.


I'm curious - what are the benefits of the microbiome? Create necessary vitamins and nutrients and short chain fatty acids? Reduce inflammation? Process toxic compounds from diet? Prevent intestinal permeability?


I'd like to see the microbiome compared with more omnivores and carnivores, and to see whether ape microbiomes also change on all-meat diets as we would expect they would, though presumably not to the same beneficial degree as in humans.


I'd be especially interested to see if all-meat diets 'ruin' ape guts or create some sort of chronic disease. That would mean humans evolved to handle more meat. 
