Why are humans ecologically dominant?

Part Of: Demystifying Culture sequence
Content Summary: 1100 words, 11 min read

Ecological Dominance

Compared to the erects, sapiens are uniquely ecologically dominant. The emergence of hunter-gatherers out of Africa 70,000 years ago caused:

  • The extermination of hundreds of megafauna species (more than 90%)
  • Dwarfing of the surviving species
  • A huge increase in the frequency and impact of fire (we used fire to reshape ecosystems to our liking)

12,000 years ago, we began domesticating animals and plants. The subsequent agricultural revolution unlocked powerful new ways to acquire energy, which in turn increased our species’ population density.

  • 9000 BCE: 5 million people
  • 1 CE: 300 million people
  • 2100 CE: 11,000 million people (projected)
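To see how sharply growth accelerated, the milestones above imply very different average annual growth rates per era. A quick sketch (the population figures come from the list above; the 2100 figure is a projection):

```python
def annual_growth_rate(pop_start, pop_end, years):
    """Average annual growth rate implied by two population milestones."""
    return (pop_end / pop_start) ** (1 / years) - 1

# Milestones from the list above; the 2100 CE figure is a projection.
agricultural_era = annual_growth_rate(5e6, 300e6, 9000)     # 9000 BCE -> 1 CE
industrial_era = annual_growth_rate(300e6, 11_000e6, 2100)  # 1 CE -> 2100 CE

print(f"{agricultural_era:.3%} per year")  # ~0.045% per year
print(f"{industrial_era:.3%} per year")    # ~0.172% per year
```

Even the post-agricultural era grew slowly in absolute terms; the point is the relative jump, with each energy revolution roughly quadrupling the pace.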

200 years ago, the industrial revolution was heralded by the mastery of energy transduction: the discovery that one form of energy can be converted into another, so that electricity can be used to run a vacuum or freeze meat products.

This population explosion correlates with a hefty ecological footprint:

  • We have altered more than one-third of the earth’s land surface.
  • We have changed the flow of two-thirds of the earth’s rivers.
  • We use 100 times more biomass than any large species that has ever lived.
  • If you include our vast herds of domesticated animals, we account for more than 98% of terrestrial vertebrate biomass.

[Figure: Ecological Dominance - Vertebrate Biomass]

Three Kinds of Theories

As with any other species, scientists must explain how ours affects its ecosystem. We can do this by examining how our anatomy and psychology differ from those of other animals, and then considering which of these human universals explain our ecological dominance.

Pound for pound, other primates are approximately twice as strong. We also lack the anatomical weaponry of our cousins; for example, our canines are much less dangerous.

So, strength cannot explain our dominance. Three other candidate theories tend to recur:

  1. We are more intelligent and creative. Theories of this sort focus on e.g., the invention of Mode 3 stone tools.
  2. We are more cooperative and prosocial. Theories of this sort focus on e.g., massively cooperative hunting expeditions.
  3. We accumulate powerful cultural adaptations. Theories of this sort focus on e.g., how Inuit technology became uniquely adaptive for their environment.

Let’s take a closer look!

Intelligence-Based Theories

Is intellect the secret for our success? Consider the following theories:

First, generative linguists like Noam Chomsky argue that language is not primarily about communication: recursion is an entirely different mode of cognition, and the root of our species’ creativity. On his view, the language instinct (as a genetic package) appeared abruptly around 70 kya, and transformed the mind from a kluge of instincts into a mathematical, general-purpose processor. Language evolution is said to coincide with the explosion of technology called behavioral modernity.

Second, evolutionary psychologists like Leda Cosmides and John Tooby advocate the massive modularity hypothesis: the mind isn’t a general-purpose processor; it is instead more like a Swiss army knife. We are not more intelligent because we have fewer instincts, but because we have more. Specifically, we accrued hundreds of hunter-gatherer instincts over the intervening millennia, and these instincts give us our characteristically human flexibility.

Third, social anthropologists like David Lewis-Williams argue that a change in consciousness made us more intelligent. We are the only species with animistic spirituality, which is rooted in numinous experiences. These altered states of consciousness were byproducts of our consciousness machinery rearranging itself. Specifically, he invokes Dehaene’s theory that while all mammals experience primary consciousness, only sapiens have second-order consciousness (awareness of their own awareness). This, allegedly, was the event that produced fully modern language.

Sociality-Based Theories

Is sociality the secret for our success? Consider the following theories:

First, sociobiologists like Edward O. Wilson think that the secret of our success is group selection: vigorous between-group warfare created selective pressure for within-group cooperation. As our ethnic psychology (and specifically, ethnocentrism) became more pronounced, sapien tribes began behaving much like superorganisms. A useful analogy is eusocial insects like ants, which are arguably even more ecologically dominant than humans.

Second, historians like Yuval Harari think that mythology (fictional orders) is the key ingredient enabling humans to cooperate at scale. Political and economic phenomena don’t happen in a vacuum: they are caused by ideological commitments, e.g., nationalism and the value of a currency. To change our myths is to refactor the social structure of our society.

Culture-Based Theories

Is culture the secret for our success? Consider the following theory:

Anthropologists like Richerson, Boyd, and Henrich argue that cumulative cultural knowledge comprises a dual-inheritance system, and propose a theory of gene-culture coevolution. They argue that an expanding collective mind gave individuals access to unparalleled know-how. This in turn emboldened our niche-stealing proclivities: “like the spiders, hominins could trap, snare, or jet their prey; but the latter could also ambush, excavate, expose, entice, corral, hook, spear, preserve, or contain a steadily enlarging range of food types.” Socially learned norms induce our cooperation, and socially learned thinking tools explain our intelligence.

My Take

Contra Chomsky:

Contra Cosmides & Tooby:

  • I agree wholeheartedly with the massive modularity hypothesis. It accords well with modern cognitive neuroscience.
  • While selection endowed us with hunter-gatherer instincts (e.g., folk biology), I don’t think such instincts provide sufficient explanatory power.

Contra David Lewis-Williams:

  • I need hard evidence showing that animals never hallucinate, before appropriating numinous experiences as a human universal.
  • Global Workspace Theory (GWT) enjoys better empirical support than integrated information theory.
  • I don’t understand the selective pressure or mechanistic implications for changes to our conscious machinery.

Contra sociality-first theories:

  • Group selection is still immersed in controversy, especially the free-rider problem.
  • Why must myths be the causal first movers? Surely other factors matter more.

My own thinking most closely aligns with culture-based explanations of our ecological dominance. This sequence will try to explicate this culture-first view.

But at present, culture-first theories leave several questions unanswered:

  • What, specifically, is the behavioral and biological signature of a social norm? For now, appeals to norm psychology risk explaining too much.
  • How did our species (and our species alone) become psychologically equipped to generate cumulative culture?
  • If erectus was a cultural creature, why did the rate of technological innovation so dramatically change between erectus and sapiens?

Someday I hope to explore these questions too. Until then.

References

  1. Tim Flannery. The Future Eaters.
  2. David Lewis-Williams. The Mind in the Cave.
  3. Yuval Harari. Sapiens.
  4. Joseph Henrich. The Secret of Our Success.

The Evolution of Faith

Part Of: Demystifying Culture sequence
Content Summary: 1200 words, 12 min read

Context

Recall that human beings have two different vehicles for learning:

  • Individual Learning: using personal experiences to refine behavioral techniques, and build causal models of how the world works.
  • Social Learning: using social interactions to learn what other people have learned.

Today, we will try to explain the following observations:

  • Most cultural traditions have adaptive value.
  • This value typically cannot be articulated by practitioners.

Why should this be the case?

Example 1: Manioc Detoxification

Consider an example of food preparation, provided by Joseph Henrich:

In the Colombian Amazon, a starchy tuber called manioc has lots of nutritional value, but also releases hydrogen cyanide when consumed. If eaten unprocessed, manioc can cause chronic cyanide poisoning. Because it emerges only gradually after years of consuming manioc that tastes fine, chronic poisoning is particularly insidious, and has been linked to neurological problems, paralysis of the legs, thyroid problems, and immune suppression.

Indigenous Tukanoans use a multistep, multiday processing technique that involves scraping, grating, and finally washing the roots in order to separate the fiber, starch, and liquid. Once separated, the liquid is boiled into a beverage, but the fiber and starch must then sit for two more days, when they can then be baked and eaten. Chemical analyses confirm that each major step in the processing is necessary to remove cyanogenic content from the root. [5]

Yet consider the point of view of a woman learning such techniques. She may never have seen anyone get cyanide poisoning, because the techniques work. And she would be required to spend about four hours per day detoxifying manioc. [4]

Consider what might result if a self-reliant Tukanoan mother decided to drop seemingly unnecessary steps from the processing of her bitter manioc. She might critically examine the procedure handed down to her from earlier generations and conclude that the goal of the procedure is to remove the bitter taste. She would quickly find that with the much less labor-intensive process of boiling, she could remove the bitter taste. Only decades later her family would begin to develop the symptoms of chronic cyanide poisoning.

Here, the willingness of the mother to take on faith received cultural practices is the only thing preventing the early death of her family. Individual learning does not pay here; after all, it can take decades for the effects of the poison to manifest. Manioc processing is causally opaque.

The detoxification of dozens of other food products (corn, nardoo, etc.) is similarly inscrutable. In fact, history is littered with examples of European explorers imperfectly copying indigenous food processing techniques and meeting gruesome ends.

Example 2: Pregnancy Taboos

Another example, again from Henrich:

During pregnancy and breastfeeding, women on Fiji adhere to a series of food taboos that selectively excise the most toxic marine species from their diet. These large marine species, which include moray eels, barracuda, sharks, rock cod, and several large species of grouper, contribute substantially to the diet in these communities; but all are known in the medical literature to be associated with ciguatera poisoning.

This set of taboos represents a cultural adaptation that selectively targets the most toxic species in women’s usual diets, just when mothers and their offspring are most susceptible. [2] To explore how this cultural adaptation emerged, we studied both how women acquire these taboos and what kind of causal understandings they possess. Fijian women use cues of age, knowledge, and prestige to figure out from whom to learn their taboos. [3] Such selectivity alone is capable of generating an adaptive repertoire over generations, without anyone understanding anything.

We also looked for a shared underlying mental model of why one would not eat these marine species during pregnancy or breastfeeding: a causal model or set of reasoned principles. Unlike the highly consistent answers on what not to eat and when, women’s responses to our why questions were all over the map. Many women simply said they did not know and clearly thought it was an odd question. Others said it was “custom.” Some did suggest that the consumption of some of the species might result in harmful effects to the fetus, but what precisely would happen to the fetus varied greatly: many women explained that babies would be born with rough skin if sharks were eaten and smelly joints if morays were eaten.

These answers are blatant rationalizations: “since I’m being asked for a reason, let me try to think one up now”.  The rationale for a taboo is not perceived by its adherents. This is yet another example of competence without comprehension.

A Theory of Overimitation

Human beings exhibit overimitation: a willingness to adopt complex practices even if many individual steps are inscrutable. Overimitation requires faith, defined here as a willingness to accept information in the absence of (or even in conflict with) your personal causal model.

We have replicated this phenomenon in the laboratory. First, present a puzzle box to a child, equipped with several switches, levers, and pulleys. Then have an adult teach the child how to open the box and get the treat inside. If the “solution” involves several useless steps (e.g., tapping the box with a stick three times), children will imitate the entire procedure. In contrast, chimpanzees ignore the noise and zoom in on the causally efficacious steps.

Why should chimpanzees outperform humans in this experiment? Chimpanzees don’t share our penchant for mimicry. Chimpanzees are not gullible by default. They must try to parse the relevant factors using the gray matter between their ears.

Humans fare poorly in such tests because these particular opaque practices are in fact useless. But in our prehistory, inscrutable practices were more often valuable. We are born to go with the flow.

In a species with cumulative culture, and only in such a species, faith in one’s cultural inheritance often yields greater survival and reproduction.

Is Culture Adaptive? Mostly.

We humans do not spend much time inspecting the content of our cultural inheritance. We blindly copy it. How then can cultural practices be adaptive?

For the same reason that natural selection produces increasingly sophisticated body plans. Communities with effective cultural practices outcompete their neighbors.

Overimitation serves to bind cultural practices together into holistic traditions. This makes another analogy to natural selection apt:

  • Genes don’t die, genomes die. Natural selection transmits an error signal for an entire genetic package.
  • Memes don’t die, traditions die. Cultural selection transmits an error signal for an entire cultural package.

Just as genomes can host individual parasitic elements (e.g., transposons), so too cultural traditions can contain maladaptive practices (e.g., dangerous bodily modifications). As long as the entire cultural tradition is adaptive, dangerous ideas can persist undetected in a particular culture.

Does Reason Matter? Yes.

So far, this post has been descriptive. It tries to explain why sapiens are prone to overimitation, and why faith is an adaptation.

Yet individual learning matters. Without it, culture would replicate but not improve. Reason is the fuel of innovation. We pay attention to intelligent, innovative people because of another cultural adaptation: prestige.

Perhaps the powers of the lone intellect are less stupendous than you were brought up to believe.

But we need not be slaves to either our cultural or our genetic inheritance. We can do better.

Related Resources

  1. Henrich (2016). The Secret of Our Success.
  2. Henrich & Henrich (2010). The evolution of cultural adaptations: Fijian food taboos protect against dangerous marine toxins.
  3. Henrich & Broesch (2011). On the nature of cultural transmission networks: evidence from Fijian villages for adaptive learning biases.
  4. Dufour (1984). The time and energy expenditure of indigenous women horticulturalists in the northwest Amazon.
  5. Dufour (1994). Cassava in Amazonia: Lessons in utilization and safety from native peoples.

The Cursorial Ape: a theory of human anatomy

Part Of: Anthropogeny sequence
Followup To: The Walking Ape
Content Summary: 2100 words, 21 min read

A Brief Review of Human Evolution

The most recent common ancestor of humans and chimpanzees lived 7 mya (million years ago). The very first unique hominin feature to evolve was bipedality, which was an adaptation for squat-feeding. The australopiths were bipedal apes. They could walk comfortably, but retained their adaptations for tree living as well. Dental morphology and microwear together suggest that australopiths acquired food from a new source: tubers (the underground storage organs of plants).

Climate change is responsible for the demise of the australopiths. Africa began drying out about 3 million years ago, making the woodlands a harsher and less productive place to live. Desertification would have reduced the wetlands where australopiths found fruits, seeds, and underwater roots. The descendants of Australopithecus had to adapt their diet.

The paranthropes adapted by promoting tubers from backup to primary food. These impressive creatures comprise a blend of human and cow-like features. In contrast, the habilines (e.g., Homo habilis) took a different strategy: meat eating. These creatures had the same small bodies, but larger brains. Their hands show adaptations for flexibility, and their shoulders and elbows for throwing missiles. They began making stone tools (Mode 1 tools, the Oldowan industry). They presumably used these anatomical and cultural gifts to compete with other scavengers on the savannah (projectiles to repulse competitors, stone flakes to speedily butcher a carcass).

The habilines in turn gave rise to:

  • [1.9 mya] The erects (H. erectus) appear, with near-modern anatomies.
  • [0.9 mya] The archaics (H. heidelbergensis) appear, who eventually give rise to the Neanderthals, Denisovans, and us.
  • [0.3 mya] The moderns (H. sapiens) emerge in Africa, and eventually conquer the globe.

A Closer Look

Yes, humans are apes. But why do we look so different from our closest living relative, the chimpanzee?

I have previously explained why we are bipedal (flexible waist, straight backs, walking on two feet).

But why do we have scent glands in our armpits? Fat in our asses? Such weird hair? Hairless skin with massive subcutaneous fat deposits?

Most of these changes were introduced with Homo erectus:

[Figure: Born to Run - Hominin Anatomy]

Natural selection explains why bodies change. Anatomical innovations are selected when they enable more efficient exploitation of some particular niche.

So what ecological niche forged the modern human body?

Where Homo Erectus Evolved

The australopiths never made it beyond the southern margins of the Sahara. Because the adaptations of equatorial species inhibit their colonization of temperate regions, the successful emigration of the erects out of Africa strongly suggests that this was a northern, not a tropical, species.

To evolve adaptations to dry, open country, the erects would have had to undergo a period of isolation from other hominins, in an appropriately discrete habitat. There were few, perhaps no, places in tropical or southern Africa that could have provided such a combination. Comparing these constraints with the distribution of Homo erectus fossils, comparative zoologist Jonathan Kingdon submits that the two most plausible regions where the erects could have evolved are the Atlas Mountains and Arabia.

Nasal evidence corroborates the hypothesis that the erects evolved in a desert environment. The entry to the primate nasal passage is flat, with straightforward air intake. Erect skulls show the first evidence of a protruding nose, which forces incoming air to turn at a right angle before entering the nasal cavity.

One of the responsibilities of the nasal passage is to humidify air before it is passed to the lungs. The increased room and turbulence serve to amplify the humidification of inhaled air. Our noses are adaptations for desert living.

A New Thermoregulation System

There are two things unique to human skin:

  • Functional hairlessness. We modern humans have hair, but it is so thin compared to chimpanzees that we are effectively hairless.
  • Eccrine sweat glands. Our skin also contains a novel approach to sweat glands.

These two features are linked: we now know in exquisite molecular detail how incipient hair follicles are converted into eccrine glands (Lu et al 2016).

Other primates rely on oil-based apocrine sweat glands. The emergence of water-based eccrine glands in humans led to the “retirement” of apocrine glands in our lineage. The distribution of odor-producing apocrine glands was ultimately confined to our underarms and pubic regions.

[Figure: Born to Run - Sweat Glands]

Losing our hair had two important side-effects:

  • Skin pigmentation. Fur protects against ultraviolet radiation. Without it, melanin was used as an alternate form of natural sunscreen.
    • Why do otherwise-bald humans have hair at the tops of their heads? This is the location of maximal radiation.
    • Why didn’t all humans remain dark-skinned? Melanin also inhibits the skin’s production of Vitamin D, and different locales have different radiation levels, requiring new tradeoffs to be struck.
  • Subcutaneous fat. Ever seen a hairless chimpanzee? Its skin is loose and wrinkled; human skin is much smoother. Why? Even in non-obese people, humans store more of their body fat below the skin (versus in the abdomen, or between the muscles). This change has three complementary causes:
    1. carnivores tend to store fat in this way,
    2. subcutaneous fat mitigates the hernia risk associated with bipedality, and
    3. it replaces the insulation once provided by fur, without interfering with the sweat system.

We have reviewed four changes in human skin. Rather than a discrete event, these changes presumably evolved gradually, and in tandem.

[Figure: Born to Run - Evolution of Skin]

Yes, but why are we hairless? There are many competing theories.

Jonathan Kingdon claims these skin adaptations arose late, as a parasite avoidance mechanism induced by increased population densities. Two rationales are provided: hair is a potent vector of infection, and the eccrine sweat system also has antibiotic properties.

This interpretation is challenged by genetic evidence that shows hominins were naked at least 1.2 mya, if not earlier (Rogers et al, 2004).

However, given the evidence that Homo erectus evolved in a desert climate, the most parsimonious theory seems to involve thermoregulation. Our upright posture exposed us to less direct radiation, so fur no longer served as critical a role. And because the overall climate was warm and dry, shedding fur and cooling by sweat became the superior strategy.

Humans as Cursorial Species

A cursorial animal is one adapted for long-distance running, rather than for high acceleration over short distances; thus, a wolf is considered cursorial, while a cheetah is not. Other examples include horses and ostriches.

Fit human amateurs can regularly run 10 kilometers, and longer distances such as marathons (42 kilometers) are achieved by tens of thousands of people each year. Such distances are unknown, if not impossible, for any other primate, but are comparable to those observed in specialized mammalian cursors on open habitats. African hunting dogs, for example, travel an average of 10 km per day.

Racing horses can gallop 10 kilometers at 9 meters per second. However, the sustainable galloping speeds in horses decline considerably for runs longer than 10-15 minutes. Well-conditioned human runners exceed the predicted preferred galloping speed for a 65-kg quadruped, and can even occasionally outrun horses over extremely long distances.

Thus, despite our embarrassingly slow sprinting speed, human beings can outcompete even cursorial animals at endurance running over large distances. How come? The answer has to do with our unique cooling system.

When other mammals trot, they cool themselves by panting. However, above certain speeds a quadruped transitions to a full gallop, which precludes panting. A horse can trot all day, but it cannot gallop continuously without overheating.

Human adaptations for running, and our unique eccrine sweat-based cooling system, mean that humans have a higher trot/gallop (jog/sprint) transition threshold. Our cooling advantage is accentuated in high heat. We are literally the only mammal that can run a marathon in high heat.

[Figure: Born to Run - Trot/Gallop Transition]

Why are we Born to Run?

Our bodies are designed for endurance running. We are cursorial animals. But why?

The answer: hominins exploited a new form of predation called persistence hunting. The most successful persistence hunts involve:

  • Time: middle of the day (during peak heat)
  • Target: big prey (overheats faster)

If you chase a big animal above its trot/gallop transition speed, the animal will easily distance itself and begin panting. But you can track the animal, and chase it again before it has the opportunity to fully recover. Repeat this process, and after 10-25 km you will successfully drive the prey into hyperthermia. This style of hunting has a remarkable 75% success rate. Modern hunters typically prefer to use the bow and arrow, but persistence hunting is still in their repertoire. Before the invention of projectile weapons some 71 kya, persistence hunting surely played a larger role.

 

We know that habilines ate meat (many bones show signs of their butchery). But they likely acquired meat by scavenging, as they were not particularly effective carnivores. Their adaptations for projectiles were presumably used to repulse competitors, and stone tools certainly helped speedily butcher a carcass.

Of the dozens of running adaptations in Homo erectus, a substantial fraction already existed in the habilines. Presumably the reinvention of our skin had begun too. These processes presumably began for simple reasons (it pays to move quickly, and to have less fur, in the savannahs that emerged 3 mya).

Persistence hunting completely changed the game. Adaptations for running brought steep rewards. In a typical persistence hunt, the hunter expends about 850 kcal of energy; the energy gained from big game is many times larger. Compare the calorie budget of a modern-day hunter-gatherer with that of chimps: in our prime, we produce twice as many calories as we consume!
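The economics can be sketched with a toy calculation. The 850 kcal expenditure and the 75% success rate come from this post; the prey yield is a hypothetical figure for illustration only, not a number from the source:

```python
def expected_return_ratio(energy_spent_kcal, prey_yield_kcal, success_rate):
    """Expected kcal gained per kcal spent on a persistence hunt."""
    return (prey_yield_kcal * success_rate) / energy_spent_kcal

# 850 kcal spent and the 75% success rate are from the text above;
# the 240,000 kcal carcass yield is a hypothetical illustration.
ratio = expected_return_ratio(850, 240_000, 0.75)
print(f"expected return: {ratio:.0f}x")  # expected return: 212x
```

Even with generous error bars on the assumed yield, the expected return dwarfs the cost, which is what made the strategy so rewarding.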

[Figure: Born to Run - Calorie Budget]

Life is fundamentally about getting energy to make more life.

Persistence hunting was the turning point in human evolution. Our species began winning, in the sense of reliably acquiring surplus energy. This surplus was the reason our lineage could “afford” bigger brains, taller bodies, more frequent births, and longer childhoods. All of these characteristics have improved gradually and continuously since the erects emerged.

Our Cursorial Adaptations

We have looked at the reasons behind our running. What does anatomy tell us?

First, let’s compare the physics of walking vs running:

  • Walking is an inverted-pendulum mechanism. Our feet and our hips alternate as the center of rotation.
  • Running is a mass-spring mechanism. Tendons and ligaments store kinetic energy at foot-strike and release it as we bounce onward.

Walking doesn’t require springs – but running does. And the bodies of erects have two new spring structures that serve precisely this purpose:

  • The Achilles tendon stores and releases 35% of the energy expended while running (but not walking). In chimps, this tendon is 1 cm long. In erects, it is 10 cm and much thicker.
  • The dome-shaped arch of the foot is another spring, which lowers the cost of running by 17%.

During bipedal running the risk of falling and of sprained ankles is high, which in the ancestral environment carried serious fitness consequences. Thus, the human body also developed many stabilization devices:

  • Gluteus maximus. Barely active during walking, this muscle contracts forcefully during running to prevent the trunk from toppling forward.
  • Various head stabilization devices. These promote visual continuity and protect the brain (watch a runner with a ponytail sometime).
  • Enlarged semicircular canals (balance organs) in the inner ear, which can be detected by measuring certain dimensions of fossilized skulls.

I have listed five features of our anatomy that relate to endurance running. Lieberman et al (2006) list twenty:

[Figure: Born to Run - Anatomical Comparison]

As you can see, not all of these running adaptations emerged with Homo erectus. Homo habilis already shows adaptations for running. It would not surprise me in the slightest if that species also saw the beginnings of our skin trajectory.

Adaptations for running came at a price. We have lost our ability to climb trees. We are the first primate to lose this ability.

Takeaways

Why do humans look so different from our closest living relative, the chimpanzee?

Why do we have scent glands in our armpits? Fat in our asses? Such weird hair? Hairless skin with massive subcutaneous fat deposits?

Animal body plans are designed to excel in a particular niche. Our bodies are designed for persistence hunting. Compared to other primates, our anatomies optimize for thermoregulation, efficient energy transfer, and stabilization during running.

[Figure: Born to Run - Overview]

Chimpanzees don’t need to exercise to stay fit. We do. Our health sees dramatic benefits from aerobic exercise, especially running.

References

  • Bramble & Lieberman (2004). Endurance running and the evolution of Homo.
  • Lieberman et al. (2006). The human gluteus maximus and its role in running.
  • Lu et al. (2016). Spatiotemporal antagonism in mesenchymal-epithelial signaling in sweat versus hair fate decision.
  • Rogers et al. (2004). Genetic variation at the MC1R locus and the time since loss of human body hair.

Cooking and the Hominin Revolution

Part Of: Anthropogeny sequence
See Also: Born to Run: a theory of human anatomy
Content Summary: 2100 words, 21 min read

The Universality of Cooking

Cooking is a human universal. It has been practiced in every known human society. Rumors to the contrary have never been substantiated. Not only is the existence of cooked foods universal, but most cuisines feature cooked foods as the dominant source of nutrition.

[Figure: Cooking - A Human Universal]

Raw foodists comprise a community dedicated to consuming uncooked food. Of course, compared to historical hunter-gatherers, modern raw foodists enjoy a wide variety of advantages. These include:

  1. Elaborate food preparation (pounding, purees, gentle warming),
  2. Elimination of seasonal shortages (supermarkets),
  3. Genetically enhanced vegetables with more sugar and fewer toxins.

Despite these advantages, raw foodists report significant weight loss (much more than vegetarians!). Further, raw foodists suffer increasingly severe reproductive impairments, which have been linked to insufficient energy intake.

[Figure: Cooking - Consequences of Raw-Foodism]

Low BMI and impaired reproduction are perhaps manageable in modern times, but are simply unacceptable to hunter-gatherers living at subsistence levels.

The implication is clear: there is something odd about us. We are not like other animals. In most circumstances, we need cooked food.

The Energetics of Cooking

Life exists to find energy in order to make more copies of itself. Feeding and reproduction are the twin genetic imperatives.

Preferences are subject to natural selection. The fact that we enjoy cooked food suggests that cooking provides an energy boost to its recipients. The raw-foodist evidence hints towards this conclusion as well. But there is also direct evidence in rats that cooking increases energy gains.

In the following experiments, rat food was either processed/pounded, cooked, neither, or both. After giving this diet over the course of four days, rats in each condition were weighed.

[Figure: Cooking - Energy Benefits of Cooking]

For starches (left panel) and meat (right panel), cooking is far more effective at preventing weight loss and promoting weight gain. Tenderizing food can sometimes help, but that technique pales in comparison to cooking.

The above results come from rats. But similar results have been replicated in calves, lambs, piglets, cows, and even salmon. It seems to be universally true that cooking improves the energy derived from digestion, sometimes by as much as 30%.

How does cooking unlock more energy for digestion?

First, denaturation occurs when the internal bonds of a protein weaken, causing the molecule to open up. Heat predictably denatures (“unfolds”) proteins, and denatured proteins are more digestible because their open structure exposes them to the action of digestive enzymes.

Besides heat, three other techniques promote denaturation: acidity, sodium chloride, and drying. Cooking experts constantly recommend these same techniques, presumably because denatured food better suits our eating preferences.

Second, tenderness is another boon to digestion, because tender foods offer less resistance to the work of stomach acid. If you take rat food and inject air into the pellets, that does not augment denaturation; nevertheless, softening food in this way improves the rats’ energy intake.

Cooking does have negative effects. It can cause a loss of vitamins, and it gives rise to long-lived toxic molecules called Maillard compounds, which are linked to cancer. But from an evolutionary perspective, these downsides are overshadowed by the impact of more calories. In subsistence cultures, better-fed mothers have more, happier, and healthier children. When our ancestors first obtained extra calories by cooking their food, they and their descendants passed on more genes than others of their species who ate raw.

A Brief Review of Human Evolution

The most recent common ancestor of humans and chimpanzees lived 6 mya (million years ago). But the first three million years of our heritage are not particularly innovative, anatomically. The australopiths were essentially bipedal apes. They could walk comfortably, but retained their adaptations for tree living as well. There is some evidence that australopiths acquired food from a new source: tubers (the underground energy storage system of plants).

Climate change was responsible for the demise of the australopiths. Africa began getting drier about 3 million years ago, making the woodlands a harsher and less productive place to live. Desertification would have reduced the wetlands where australopiths found fruits, seeds, and underwater roots. The descendants of Australopithecus had to adapt their diet.

The paranthropes adapted by promoting tubers from backup to primary food. In contrast, the habilines (e.g., Homo habilis) took a different strategy: meat eating. These creatures inherited tool making from the late australopiths (Mode 1 tools; the Oldowan industry, first discovered in Ethiopia, dates to 2.6 mya), and used these tools to scrape meat off of bones. The habilines were more ecologically successful, and led to:

  • 1.9 mya: The erects (e.g., Homo erectus/ergaster) appear, with significantly larger brains and near-modern anatomies.
  • 0.7 mya: The archaics (e.g., Homo heidelbergensis) appear, who eventually give rise to the Neanderthals, Denisovans, and us.
  • 0.3 mya: The moderns (Homo sapiens) emerge in Africa, and eventually conquer the globe.

Here is a sketch of how our body plans have changed across evolutionary time:

[Figure: Hominin Anatomy Comparison]

Explaining Hominization

The transition from habiline to erects deserves a closer look. We know erects evolved to be persistence hunters. But a number of paradoxes shroud their emergence:

  1. Digestive Apparatus. The erect diet appears to have been mainly meat and tubers. Both require substantial jaw strength and digestive capacity. Yet the Homo genus features a dramatically reduced digestive apparatus. How were smaller mouths, weaker jaws, smaller teeth, smaller stomachs, and shorter colons an adaptive response to eating meat and starches?
  2. Expensive Tissue. Australopith brain size stayed relatively constant at 400 cc (10% of resting metabolism). Erect brains began to grow, a transition that ultimately yielded a 1400 cc brain (20% of resting metabolism) in archaic humans. How did the erects find the calories to finance this expansion?
  3. Time Budget. The anatomical features of erects are geared towards endurance running, which suggests that their lifestyle involved persistence hunting. Chimps have only about 20 free minutes between bouts of searching for and chewing food; thus, a chimp can only afford to spend 20 minutes hunting before giving up. How did erects afford the risky behavior of persistence hunting, which consumes 3-8 hours at a time?
  4. Thermal Vulnerability. As part of their new hunting capabilities, erects became the naked ape (with a new eccrine sweat gland system to prevent overheating). But Homo erectus also managed to migrate to non-African climates such as Europe. How did these creatures stay warm?
  5. Predator Safety. Erects lost their anatomical features for arboreal living, which suggests they slept on the ground. Terrestrial sleeping is quite dangerous on the African savannah. How did erects avoid predation & extinction?

All of these confusing phenomena can be explained if we posit that H. erectus discovered the use of fire, and its application to cooking:

  1. Digestive Apparatus. As we have seen, the primary role of cooking is to “externalize digestion”, increasing the efficiency of our digestive tract. Cooked meat and starches are far less demanding to process than their raw alternatives. This explains our reduced guts. By some estimates, the decrease in digestive tissue corresponds to a 10% energy savings for our erect ancestors.
  2. Expensive Tissue. Cooking increases the metabolic yield of most foodstuffs by ~30%. For reference, a mere 5% increase in ripe fruit for chimpanzees reduces the interbirth interval (time between children) by four months. 30% is an absurdly large energy gain, enough to “change the game” for the erects.
  3. Time Budget. Cooking freed up massive amounts of time otherwise spent chewing: chimpanzees spend 4-7 hours per day chewing, whereas humans need only one. That surplus time can be invested in e.g., hunting.
  4. Thermal Vulnerability. It is very difficult to explain a hairless Homo erectus thriving in colder Asian climates without control of fire.
  5. Predator Safety. It is very difficult to explain how erects avoided being preyed upon to extinction without fire to spot and deter predators. Hadza hunter-gatherers sleep safely through the night by their fires, typically taking turns “on watch” while the others rest.
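
The energy arithmetic above can be sketched as a toy back-of-envelope calculation. The percentages come from the text; treating the ~30% digestive yield gain as a direct fraction of resting metabolism is a simplifying assumption for illustration only:

```python
# Toy energy budget for the habiline-to-erect transition.
# Units are fractions of resting metabolic rate (RMR); figures from the text.
# Treating the ~30% yield gain as a direct RMR surplus is a simplification.

def cooking_surplus(yield_gain=0.30, gut_savings=0.10):
    """Energy freed up by cooking: higher digestive yield + smaller gut."""
    return yield_gain + gut_savings

def brain_cost(old_share=0.10, new_share=0.20):
    """Extra energy demanded by brain expansion (400 cc -> 1400 cc)."""
    return new_share - old_share

surplus, cost = cooking_surplus(), brain_cost()
print(f"surplus ~{surplus:.0%}, brain cost ~{cost:.0%}, net ~{surplus - cost:.0%}")
```

Even on these crude numbers, the surplus from cooking comfortably covers the cost of the expanded brain, which is the core of the Expensive Tissue resolution.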

[Figure: Overall Argument]

The Archaeological Record

We are positing that erects learned to create and control fire 2 mya. Is that a feasible hypothesis?

Habilines had learned how to create stone tools 2.6 million years ago. By the time of the erects, techniques to create these tools had persisted for 600,000 years. So it is safe to say that our ancestors were able to retain useful cultural innovations.

Independent environmental reasons link fire-making with H. erectus. The Atlas mountain range is the most likely birthplace of this species, and in this dry area, fires triggered by lightning are an annual hazard. Hominins living in such environments would be more intimately familiar with fire than those in less combustible vegetation zones.

Erects would have seen sparks when they struck stones together to make tools. The sparks produced by most kinds of rock are too cool to start a fire. However, when pyrites (a fairly common ore) are struck against flint, the resulting sparks are hot enough that hunter-gatherers use them to reliably produce fire. The Atlas mountain range is renowned for being exceptionally rich in minerals:

Why is Morocco one of the world’s great countries for minerals? No glaciers! Many of the world’s most colorful minerals are found in deposits at the surface, formed over time by the interaction of water, air and rock. Glaciers remove all of that good stuff (as happened in Canada recently, geologically speaking) –  and with no recent glaciation, Morocco hosts many fantastic occurrences of minerals unlike any in parts of the world stripped bare during the last Ice Age.

Since this mountain range contains pyrites, early erects could have found themselves inadvertently making fire rather often.

Once it is created, fire is relatively easy to keep going. And it does not take much creativity to stick food in a fire. Moreover, modern-day chimps prefer cooked food over raw; it is hard to imagine H. erectus finding cooked food distasteful. All of these considerations suggest an early control of fire is at least plausible.

We can consult the archaeological record for evidence of man-made fire (i.e., hearths). The news is bad for the cooking hypothesis: strong evidence for hearths dates back only to 0.8 mya and the advent of archaic humans. Before then, there are six sites that appear to be hearths, but these are not universally acknowledged as such.

[Figure: Archaeology Evidence]

But absence of evidence isn’t evidence of absence, right?

No! That idiom is wrong. Silence is evidence of absence. It’s just that the strength of the evidence depends on the nature of the hypothesized entity.

  • If you think an unidentified planet orbits the Sun, a lack of evidence would weigh heavily against the hypothesis.
  • If you think an unidentified pebble orbits the Sun, a lack of evidence doesn’t say much one way or the other.
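
The planet/pebble asymmetry can be made precise with Bayes’ rule: the weight of silence depends on how likely the evidence would be if the hypothesis were true. A minimal sketch (the priors and detection probabilities are invented for illustration):

```python
def posterior_after_silence(prior, p_detect_if_true):
    """P(H | no evidence), assuming no false detections.

    By Bayes' rule: P(H | ~E) = P(~E | H) * P(H) / P(~E).
    """
    p_silence = (1 - p_detect_if_true) * prior + (1 - prior)
    return (1 - p_detect_if_true) * prior / p_silence

# A hidden planet would almost surely have been detected by now; a pebble would not.
planet = posterior_after_silence(prior=0.5, p_detect_if_true=0.99)
pebble = posterior_after_silence(prior=0.5, p_detect_if_true=0.001)
print(f"planet: {planet:.3f}, pebble: {pebble:.3f}")
```

Silence crushes the planet hypothesis (posterior near 1%) while leaving the pebble hypothesis essentially untouched, which is exactly the asymmetry the bullets describe.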

Wrangham argues that evidence of hearths is more fragile than e.g. fossil evidence. He points to facts like this: there are zero hearths recorded for modern humans during the European “ice ages”, yet we know these must have existed. It is possible that the contested hearth sites will ultimately be vindicated, and that earlier evidence simply has not survived.

Despite these claims about evidential likelihood, the silence of the archaeological record is undeniably a significant objection to the theory.

Weighing The Evidence

Is the cooking hypothesis true? Let us weigh the evidence, and contrast it with alternative hypotheses.

The most plausible alternative hypothesis is that archaic humans (H. heidelbergensis) discovered cooking. But the emergence of that species involved an increase in brain size, and more sophisticated culture & hunting technology. Neither adaptation seems strongly connected to cooking. In contrast, the H. erectus adaptations would all have been strongly affected by cooking.

Moreover, alternative hypotheses must still answer the five paradoxes of hominization:

  1. Digestive Apparatus. Why did erects evolve smaller mouths, weaker jaws, smaller teeth, smaller stomachs, and shorter colons?
  2. Expensive Tissue. How did the erects find the calories to finance more brain tissue?
  3. Time Budget. How could erects afford to spend 3-8 hours per day on the risky strategy of persistence hunting?
  4. Thermal Vulnerability. Erects also managed to migrate to non-African climates such as Europe. How did these creatures stay warm?
  5. Predator Safety. Erects slept on the ground. How did they avoid predation?

The habilines ate meat. It is unclear how they did so (hunting or scavenging), but we have strong evidence that they did. Meat is a much higher quality food than tubers (cf. paranthropes) or fruit (cf. chimpanzees). The meat-eating hypothesis argues that meat eating was the primary driver of hominization.

Meat-eating resolves the Expensive Tissue paradox (meat allows for brain growth) and the Digestive Apparatus paradox (carnivores are known to have smaller guts). But it doesn’t address why a meat-eater would develop smaller canines. And it struggles to explain how the reduction in gut size is compatible with the tuber component of the erect diet. As for time budget, thermal vulnerability, and predator safety: the meat-eating hypothesis fails to address these paradoxes at all.

Which is more likely to occur in the next twenty years: undisputed evidence for early control of fire, or an alternate theory that resolves all five hominization paradoxes?

My money is on the former.

References

  • Wrangham (2009). Catching Fire: How Cooking Made Us Human.
  • Aiello & Wheeler (1995). The expensive tissue hypothesis: the brain and the digestive system in primate and human evolution.

Moral Foundations Theory

Part Of: Demystifying Ethics sequence
Content Summary: 1700 words, 17 min read

The content of our social intuitions is not arbitrary. It is not entirely plastic to changes in environment. Rather, the brain is built with innate social intuition generators, which bias the content of social judgments.

Generator 1: Care/Harm

Parents care for their children. This imperative of natural selection is directly expressed in caregiving mechanisms in the brain. While the proper domain of caregiving is one’s kin, other modules (such as the mammalian attachment module) can elicit caregiving behaviors towards non-kin.

For primates living in close proximity, male violence is an increasingly noxious threat. Accordingly, Cushman et al (2012) present evidence for a violence aversion device, which triggers a strong autonomic reaction to acts of violence committed by oneself (but not by others). Here is an example of their experimental apparatus: underneath the X is a fake leg. Even though participants knew the action was harmless, delivering the blow caused significant visceral distress, compared to watching it done by someone else.

[Figure: Violence Aversion]

The violence aversion device is sensitive to calculations of personal force, the same calculations used to generate feelings of agency in the brain. The alarm only triggers when our body directly delivers force onto another person. This explains why the alarm triggers in the footbridge dilemma (“push the fat man to save five lives”) but not the trolley problem (“flip a switch to kill one and save five”).

Generator 2: Proportional Fairness

Main Article: Evolutionary Game Theory

When interacting with other organisms, one can act purely selfishly or cooperatively. The Prisoner’s Dilemma illustrates that acting in one’s self-interest can lead to situations where everyone loses. There is strong evolutionary pressure to discover cooperative emotions: devices that avert the tragedy of the commons.

The Iterated Prisoner’s Dilemma (IPD) makes game theory more social: the same players compete for resources repeatedly. While one-off PD games favor selfish behavior, the IPD can favor strategies that feature reciprocal altruism, such as Tit-for-Tat. More generally, IPD strategies do best if they are nice, retaliating, and forgiving.
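
The logic of reciprocal altruism can be sketched in a few lines. This is a minimal illustration using the standard PD payoff values, not code from any cited source:

```python
# Minimal iterated prisoner's dilemma. Standard payoffs: mutual cooperation (3,3),
# mutual defection (1,1), unilateral defection (5 for defector, 0 for sucker).
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(opp_history):
    """Nice, retaliating, forgiving: cooperate first, then mirror the opponent."""
    return 'C' if not opp_history else opp_history[-1]

def always_defect(opp_history):
    return 'D'

def play(strat_a, strat_b, rounds=100):
    opp_of_a, opp_of_b = [], []  # each strategy sees the opponent's past moves
    score_a = score_b = 0
    for _ in range(rounds):
        a, b = strat_a(opp_of_a), strat_b(opp_of_b)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        opp_of_a.append(b)
        opp_of_b.append(a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual cooperation: (300, 300)
print(play(tit_for_tat, always_defect))  # exploited once, then retaliates: (99, 104)
```

Always-Defect wins each pairwise match narrowly, but a pair of reciprocators earns nearly three times as much as a pair of defectors: exactly the selection pressure favoring cooperative emotions.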

Social equality is a special case of proportionality: when contributions are equal, so too should rewards. But when contributions are unequal, most adults affirm reward inequality. We have a deep intuitive sense of karma: what people deserve depends on how much effort they expend.

Generator 3: Dominance

Main Article: An Introduction to Primate Societies

When animals’ territory overlaps, they often compete for access to resources (e.g., food and reproductive access).

Fighting is accompanied with risk: the stronger animal could be unlucky, the weaker animal could lose their life. Similar to human warfare, both sides suffer less when the weaker side preemptively surrenders. The ability to objectively predict the outcome of a fight is therefore advantageous.

Suppose fight-predictions are needed frequently, and their outcomes do not often change (physical strength changes only slowly over an animal’s life). Instead of constantly assessing the physical characteristics of your opponent, it is simpler to just remember who you thought was stronger last time.

This is the origin of the dominance hierarchy. The bread and butter of dominance hierarchies is status signaling: dominant behaviors (e.g., snarling) evoke submissive behaviors (e.g., looking away).
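
The computational idea here is caching: replace a costly repeated assessment with a remembered outcome. A hypothetical toy model (the class, names, and strength values are invented for illustration, not from the article):

```python
class Animal:
    """Toy dominance model: fight outcomes are cached rather than recomputed."""
    def __init__(self, name, strength):
        self.name, self.strength = name, strength
        self.memory = {}  # opponent name -> remembered winner's name

    def contest(self, other):
        if other.name in self.memory:      # cheap path: recall last outcome
            return self.memory[other.name]
        # costly path: assess physical characteristics directly
        winner = self if self.strength >= other.strength else other
        self.memory[other.name] = other.memory[self.name] = winner.name
        return winner.name

alpha, beta = Animal('alpha', 10), Animal('beta', 5)
print(alpha.contest(beta))  # 'alpha': settled by direct assessment
beta.strength = 20          # beta grows stronger...
print(beta.contest(alpha))  # ...but the cached ranking still says 'alpha'
```

The cache makes the hierarchy stable even as underlying strengths drift, which may help explain why status signaling, rather than constant re-fighting, dominates day-to-day interactions.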

Generator 4: Autonomy

Consider the following facts.

  1. The earliest groups of humans seem to have been governed by an egalitarian ethic, much as surviving communities of nomadic hunters and gatherers still are.
  2. That ethic is unique among other species of great apes that are our closest cousins. Most notably, chimps and gorillas live in bands led by despotic alpha males.
  3. As human societies developed settled agriculture and then civilization, despotism and hierarchy reemerged.

How can we explain these facts? Perhaps a new emotional system evolved: autonomy. It motivated groups of non-dominant humans to form coalitions against any potential alpha despot. This trend is borne out in the data: about half of all murders cross-culturally have an anti-bullying motive. But murder is not the only sanctioning device; followers also use techniques such as criticism, ridicule, disobedience, deposition, and desertion (Boehm, 2012).

Our species never lost its capacity for despotism. But in the human inverted hierarchy, our species discovered a newfound will to tear down authority figures, which created within us a capacity for egalitarianism. These two systems (Autonomy and Dominance) live in tension with one another, and one can “gain the upper hand” by changes in the broader cultural milieu (cf., agriculture and the collapse of egalitarian societies).

Generator 5: Purity / Disgust

Main Article: The Evolution of Disgust

The human brain comes equipped with two systems:

  1. Poison monitoring is a faculty of the digestive system. It evolved to regulate food intake and protect the gut against harmful substances.
  2. Infection avoidance is a faculty of the immune system. It evolved to protect against infection from pathogens and parasites, by avoiding them.

In humans, these two systems were entangled in the emotion of disgust. This explains the otherwise baffling diversity of disgust elicitors & behaviors.

Disgust motivated the creation of food taboos (e.g., don’t eat pork) and purity laws (e.g., don’t put your feet on the table).

Generator 6: Group Loyalty

Two people can put Us ahead of Me by belonging to a cooperative group, provided that group members can reliably identify one another. Specifically, we possess a group membership device which uses symbols to delineate different factions. Members of the ingroup are treated warmly (ethnocentrism); members of the outgroup are treated poorly (xenophobia). We even pay more attention to members of the ingroup, leading to such phenomena as outgroup homogeneity (c.f., evangelical Christians describing non-evangelicals as “the world”).

Ethnic psychology describes the modules in our brain responsible for constructing groups. We are particularly prone to constructing stereotypes of other groups. Our brains already come equipped with folk biology modules that delineate different species of flowers, for example; Gil-White et al (2001) adduce evidence that ethnic groups are treated as biological “species” in the human brain.

The Right Kind of List

We’ve discussed six intuition generators: care/harm, proportional fairness, dominance, autonomy, purity/disgust, and group loyalty.  

Is our list too long? So many mechanisms to explain human social behavior would seem to violate parsimony. Are we adorning our theory with epicycles? Are we overfitting our model?

In fact, I affirm the massive modularity hypothesis: that the human brain contains dozens of mental modules, each of which have distinctive phylogeny, ontogeny, anatomy, behavioral profile, and ecological motivation. I have not conjured these entities to explain morality. Rather, I am drawing a small subset from my overarching project to describe the architecture of mind.

Implications for the Norm System

Recall the moral/conventional distinction:

  • Conventional judgments (should / should not) are intuitions of socially appropriate behavior, and associated with embarrassment.
  • Moral judgments (good / evil) are also judgments about behavior, but more associated with anger, inflexibility, condemnation, and guilt.

Jonathan Haidt claims that these generators are responsible for moral intuitions. But the above generators also underlie the structure of our conventional norms. After all, there are plenty of mildly disrespectful behaviors that even the most conservative people would not describe as evil.

We have identified dozens of other specialized modules in the human brain. Why is e.g.,  feeling of knowing (recognition memory) not on our list? Because there were no biocultural pressures to integrate it with the norm acquisition and norm evaluation systems. We call our six modules social intuition generators because they have become intertwined with our normative machinery.

[Figure: Module View of the Social Intuition Generators]

An Explanation of American Politics

People are genetically and environmentally disposed to respond to certain generators more strongly than others. A social matrix encodes which stimuli activate a given social intuition, and how strongly.

People with similar matrices tend to gravitate towards similar political parties. When you measure the social matrices of American citizens, you can see large differences between the social intuitions of Democrats and Republicans (Graham et al, 2009).

[Figure: Social Matrices by Political Party]

These differences in social matrices explain much of American politics.

  • Why do Democrats praise entitlements, but Republicans denounce them? Because Democrats heavily emphasize Care for the poor, whereas Republicans resonate more strongly with questions of Proportional Fairness (moral hazard).
  • Why are Democrats more skeptical of patriotism than their Republican counterparts? Perhaps because they respond to Loyalty to country less.
  • How can both groups claim to value Proportional Fairness? There are two competing explanations for poor outcomes: environmental (bad luck) or personal (poor character). Liberals tend to focus on the former, conservatives on the latter.
  • How can both groups claim to value Autonomy? For liberals, Autonomy responds to ethnic oppression: perceived injustices done in the name of one’s tribe; the foundation is expressed as group symmetry. For conservatives, Autonomy responds to government oppression: perceived injustices in the form of taxes, the nanny state, and regulations; the foundation is expressed as political liberty.
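
One way to render the “social matrix” idea concretely: a vector of weights over the six generators, where the felt response to a stimulus is a weighted sum of the intuitions it activates. The weight profiles and the stimulus below are invented for illustration; they are not Graham et al’s data:

```python
FOUNDATIONS = ('care', 'fairness', 'dominance', 'autonomy', 'purity', 'loyalty')

def response(matrix, stimulus):
    """Overall intuitive reaction: weighted sum over activated generators."""
    return sum(matrix[f] * stimulus.get(f, 0.0) for f in FOUNDATIONS)

# Invented weight profiles, loosely shaped like the reported party differences.
liberal      = {'care': 0.9, 'fairness': 0.8, 'dominance': 0.3,
                'autonomy': 0.6, 'purity': 0.2, 'loyalty': 0.2}
conservative = {'care': 0.6, 'fairness': 0.7, 'dominance': 0.5,
                'autonomy': 0.6, 'purity': 0.6, 'loyalty': 0.7}

# A hypothetical stimulus that mostly activates Loyalty and Purity.
flag_burning = {'loyalty': 1.0, 'purity': 0.4}
print(response(liberal, flag_burning), response(conservative, flag_burning))
```

The same stimulus lands very differently on the two matrices, which is the mechanical sense in which matrix differences “explain” political disagreement.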

Looking Forward

Moral Foundations Theory is the invention of Jonathan Haidt, who introduces the concept in his excellent 2012 book The Righteous Mind: Why Good People are Divided by Politics and Religion. You can explore your moral matrix at yourmorals.org.

This post is 90% exposition, and 10% innovation. I innovate in the preceding two sections, by a) linking the six “taste buds” to mental modules that modulate inputs to the normative system, and b) broadening its reach to conventional (non-moral) norms.

In his book, Haidt makes the case that conservatives are more ethically sophisticated, because their moral judgments respond to a larger number of taste buds. But beyond appealing to the ethos of Durkheim and Burke, Haidt does not investigate the normative status of the social intuition generators in sufficient detail.

Here are three questions I would like to explore, at some point:

  • What is the normative status of e.g., disgust? If we could dampen or amplify disgust reactions in human beings, what would be the end result?
  • Social matrices encode different modes of existence that are hard to comprehend unless they are lived. What sorts of social matrices are underexplored? Do there exist entirely novel modes of existence that we simply have not yet tried?
  • What does the moral matrix of a successful metamorality look like? How do we promote positive outcomes when moral communities must live with one another?

Related Resources

  • Boehm (2012). Hierarchy in the Forest: The Evolution of Egalitarian Behavior
  • Haidt (2012). The Righteous Mind: Why Good People are Divided by Politics and Religion.
  • Graham et al (2009). Liberals and conservatives rely on different sets of moral foundations.
  • Cushman et al (2012). Simulating murder: the aversion to harmful action
  • Gil-White et al (2001). Are ethnic groups biological “species” to the human brain? Essentialism in our cognition of some social categories

The Evolution of Disgust

Part Of: Affective Neuroscience sequence
Content Summary: 1400 words, 14 min read.

Introduction

Why did disgust evolve? Why does it play a role in morality? Should it?

One of the best ways to understand an emotion is to build a behavioral profile: a list of its responses (outputs) and elicitors (inputs).

Disgust Responses

One of the striking features of disgust is how diverse its set of responses is. These include an affect program:

  • Gape face. This is characterized by a nose wrinkle, extension of the tongue, and a wrinkled upper brow.
  • Feeling of nausea. In fact, the physiological signature of intense disgust closely matches physical nausea.
  • A withdrawal reflex. This reflex need not be physical retreat, but can also yield motivation to remove the offending object.

But disgust also produces an inferential signature:

  • Sense of oral incorporation. That is, the subjective feeling that the offending object is already in one’s mouth.
  • Offensiveness tagging. Even after the object has been removed, it will continue to be treated as offensive indefinitely.
  • Asymmetric transmission logic. See the law of contagion: a clean object that touches something gross is contaminated, but not vice versa.
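
The asymmetry of the contagion rule is easy to state formally: under contact, contamination is absorbing and cleanliness never spreads. A tiny illustrative sketch (the function is hypothetical, for exposition only):

```python
def after_contact(a_contaminated, b_contaminated):
    """Law of contagion: contact spreads contamination, never cleanliness."""
    either = a_contaminated or b_contaminated
    return either, either

print(after_contact(True, False))   # clean object touching gross -> (True, True)
print(after_contact(False, False))  # two clean objects stay clean -> (False, False)
```

Note the one-way ratchet: repeated contacts can only increase the number of contaminated objects, mirroring the inferential signature described above.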

Disgust Elicitors

The elicitors of disgust are even more diverse than its outputs. They include cultural universals:

  • Organic decay.
  • People and objects associated with illness.
  • Compromised body envelope. These include: cuts, gashes, lesions, or open sores.
  • Substances that have left the body. These include feces, vomit, and spit.

Swallowing the saliva that is currently in your mouth is innocuous, but even imagining yourself drinking a glass of spit (even if it is, or was, your own) is disgusting. These last two elicitors involve body perimeter tracking: they not only police the boundaries of the body in peripersonal space, but also seem to enforce a no re-entry policy: anything that exits or becomes detached triggers disgust.

There exists another suite of elicitors that are culturally tuned:

  • Specific foods.  Some foods are deemed disgusting even when they have never been tried (e.g., liver).
  • Specific living animals. These can include: flies, maggots, worms, rats, lice, ticks, slugs, snails, and spiders…
  • Specific sexual practices. These can include: homosexuality, pedophilia, bestiality, necrophilia, …
  • Specific morphological signatures. Deviations from bodily normality, however that is construed in a particular culture. These can include: the elderly, disabled, little people, …

It is worth emphasizing that disgust over sexual practices and morphological signatures varies widely across cultures and across individuals. For example, ancient Greece mostly did not find homosexuality disgusting, but 20th-century America mostly did.

Finally, people comprise another category of elicitors.

  • Moral transgressors. These can include: murderers, rapists, …
  • Members of an out-group. These can include: untouchable caste, Jews (in Nazi Germany), …

Neuroscientific data suggest that, when people are deemed sufficiently disgusting, brain areas associated with mindreading become deactivated. This is likely the neural basis of dehumanization.

The Entanglement Thesis

Taken together, here is the behavioral profile of disgust:

[Figure: Behavioral Profile of Disgust]

Puzzle: Why should the sight of a person with leprosy evoke a gape face and a feeling of nausea? Leprosy has nothing to do with digestion.

Solution: Disgust is a kludge! It is the unholy merger of two separate systems.

Poison monitoring is a faculty of the digestive system. It evolved to regulate food intake and protect the gut against ingested substances that are poisonous or otherwise harmful. It was designed to expel substances entering the gastrointestinal system via the mouth. It also acquires new elicitors very quickly.

Infection avoidance is a faculty of the immune system. It evolved to protect against infection from pathogens and parasites, by avoiding them. It is not specific to ingestion, but serves to guard against coming into close physical proximity with infectious agents. This involves avoiding not only visible pathogens and parasites, but also places, substances, and other organisms that might be harboring them.

Any theory of disgust should explain the unity of responses to disgust. Here is how entanglement theory does it:

  • Poison monitoring produces the affect program. Gape face, nausea and withdrawal all serve digestive (and not immunological) purposes.
  • Infection avoidance produces (most of) the inferential signature. The tendency to monitor disgusting things even when not immediately exposed, and the asymmetric logic of contamination, make perfect sense when tracking the spread of parasites.

Any theory of disgust should explain the diversity of elicitors of disgust. Here is how entanglement theory does it:

  • Poison monitoring is sensitive to certain foods (namely, those that are associated with toxicity)
  • Infection avoidance explains the aversion to certain living animals (flies are more likely to carry disease than dogs), to apparently disease-infected substances, to certain sexual practices (which can bring increased risk of disease), and to morphological deviations (e.g., violations of facial symmetry correlate with parasite load). It also explains the general tendency for disgust to monitor the body perimeter: which is, after all, how pathogens enter the body!

Any theory of disgust should explain cultural variation of the elicitors. Here is how entanglement theory does it:

  • The poison monitoring system is very quick to learn: it features the Garcia effect, or one-shot learning of new food aversions.
  • In women, aversion to deviant sexual practices (and not other forms of disgust) varies with the ovulation cycle.

[Figure: The Entanglement Thesis]

Besides the increase in explanatory power, phylogenetic and ontogenic data also support the independence of these two systems:

  • Researchers disagree whether disgust is unique to humans, or whether homologies exist in the animal kingdom. Both sides are right: animals show clear signs of both systems, but the systems are expressed separately.
  • Ever wonder why young children don’t seem to mind disgusting objects & behaviors? It is because poison monitoring appears very early (within the first year of life), but infection avoidance emerges significantly later.

The Evolution of Disgust

Why should poison monitoring and infection avoidance have become entangled in the course of human evolution? Why didn’t poison monitoring become entangled with e.g., FEAR instead?

First, the two systems both care about digestion. Food intake can bring both poison and pathogens into the body, and as such it is monitored by both systems.

Why did entanglement only happen in humans? Compared to other primates, early hominids adopted a unique lifestyle that combined scavenging with a nascent ultrasociality. These two characteristics put enormous adaptive pressure on the pathogen avoidance system to innovate.

Perhaps the most important reason for entanglement has to do with signaling. As hominids came to emphasize social cooperation, there arose a need to communicate pathogenic information. Before the emergence of language, the pathogen avoidance module had an inferential signature, but no way to communicate its contamination tagging to others. The functionally-overlapping toxin monitoring system had a clearly visible output: the gape face. Plausibly, the two modules merged such that the pathogen monitoring system could co-opt the gape face to communicate. We can call this the gape face as signal theory.

My Take on the Theory

The theory presented here was developed in Daniel Kelly’s book Yuck! The Nature and Moral Significance of Disgust. The theory strongly complements Mark Schaller’s work on the behavioral immune system. The overlap between these two researchers will become clear next time, when we turn to the social co-optation of the disgust system.

I personally find the entanglement thesis (the merger of the toxin monitoring and pathogen avoidance systems) compelling, given the tremendous explanatory power outlined above.

While I accept the overall architecture, Kelly’s theory for why that architecture evolved (gape face as signal) strikes me as incomplete.

I also feel like this theory will remain incomplete until we discover how toxin monitoring and parasite avoidance are implemented in dissociable neurobiological structures (i.e., modules).

After the psychological mechanisms are mapped to their physical roots, we could attempt to integrate our knowledge of disgust with other systems:

  • What is the relationship of disgust to the generalized stress response? Stress & the immune systems co-evolved to share the HPA axis, after all.
  • How is disgust implemented in the microbiome-gut-brain axis, which also has links to both the digestive system (enteric nervous system) and the immune system (e.g., leaky gut)?
  • How does the MGB axis differentially produce both disgust and other social phenomena like anxiety?

Open questions are exciting! To me, they suggest a clear research program: integrating our newfound theory of disgust into the broader picture of visceral processes (the hot loop).

Takeaways

The human brain comes equipped with two systems:

  1. Poison monitoring is a faculty of the digestive system. It evolved to regulate food intake and protect the gut against harmful substances.
  2. Infection avoidance is a faculty of the immune system. It evolved to protect against infection from pathogens and parasites, by avoiding them. 

In humans, these two systems were entangled in the emotion of disgust. This explains the otherwise baffling diversity of disgust elicitors & behaviors.

Related Resources

  • Kelly (2013). Yuck! The Nature and Moral Significance of Disgust.
  • Fessler & Haley (2006). Guarding the Perimeter: the inside-outside dichotomy in disgust and bodily experience.