Mann, Charles C. - The State of the Species
something to do with biological reality. A researcher who specialized in cells and microorganisms,
Margulis was one of the most important biologists in the last half century—she literally helped to
reorder the tree of life, convincing her colleagues that it did not consist of two kingdoms (plants and
animals), but five or even six (plants, animals, fungi, protists, and two types of bacteria).
Until Margulis’s death last year, she lived in my town, and I would bump into her on the street from time
to time. She knew I was interested in ecology, and she liked to needle me. Hey, Charles, she would call
out, are you still all worked up about protecting endangered species?
Margulis was no apologist for unthinking destruction. Still, she couldn’t help regarding conservationists’
preoccupation with the fate of birds, mammals, and plants as evidence of their ignorance about the
greatest source of evolutionary creativity: the microworld of bacteria, fungi, and protists. More than 90
percent of the living matter on earth consists of microorganisms and viruses, she liked to point out. Heck,
bacterial cells in our bodies outnumber human cells ten to one!
Bacteria and protists can do things undreamed of by clumsy mammals like us: form giant supercolonies,
reproduce either asexually or by swapping genes with others, routinely incorporate DNA from entirely
unrelated species, merge into symbiotic beings—the list is as endless as it is amazing. Microorganisms
have changed the face of the earth, crumbling stone and even giving rise to the oxygen we breathe.
Compared to this power and diversity, Margulis liked to tell me, pandas and polar bears were biological
epiphenomena—interesting and fun, perhaps, but not actually significant.
Does that apply to human beings, too? I once asked her, feeling like someone whining to Copernicus
about why he couldn’t move the earth a little closer to the center of the universe. Aren’t we special at
all?
This was just chitchat on the street, so I didn’t write anything down. But as I recall it, she answered that
Homo sapiens actually might be interesting—for a mammal, anyway. For one thing, she said, we’re
unusually successful.
Seeing my face brighten, she added: Of course, the fate of every successful species is to wipe itself out.
One way to begin answering the questions Margulis raised came to Mark Stoneking in 1999, when he received a notice from his
son’s school warning of a potential lice outbreak in the classroom. Stoneking is a researcher at the Max
Planck Institute for Evolutionary Anthropology in Leipzig, Germany. He didn’t know much about lice, but as a
biologist he naturally began to noodle around for information about them. The most common louse
found on human bodies, he discovered, is Pediculus humanus. P. humanus has two subspecies: P.
humanus capitis—head lice, which feed and live on the scalp—and P. humanus corporis—body lice,
which feed on skin but live in clothing. In fact, Stoneking learned, body lice are so dependent on the
protection of clothing that they cannot survive more than a few hours away from it.
It occurred to him that the two louse subspecies could be used as an evolutionary probe. P. humanus
capitis, the head louse, could be an ancient annoyance, because human beings have always had hair for
it to infest. But P. humanus corporis, the body louse, must not be especially old, because its need for
clothing meant that it could not have existed while humans went naked. Humankind’s great coverup had
created a new ecological niche, and some head lice had rushed to fill it. Evolution then worked its magic;
a new subspecies, P. humanus corporis, arose. Stoneking couldn’t be sure that this scenario had taken
place, though it seemed likely. But if his idea were correct, discovering when the body louse diverged
from the head louse would provide a rough date for when people first invented and wore clothing.
The subject was anything but frivolous: donning a garment is a complicated act. Clothing has practical
uses—warming the body in cold places, shielding it from the sun in hot places—but it also transforms
the appearance of the wearer, something that has proven to be of inescapable interest to Homo sapiens.
Clothing is ornament and emblem; it separates human beings from their earlier, un-self-conscious state.
(Animals run, swim, and fly without clothing, but only people can be naked.) The invention of clothing
was a sign that a mental shift had occurred. The human world had become a realm of complex, symbolic
artifacts.
With two colleagues, Stoneking measured the difference between snippets of DNA in the two louse
subspecies. Because DNA is thought to pick up small, random mutations at a roughly constant rate,
scientists use the number of differences between two populations to tell how long ago they diverged
from a common ancestor—the greater the number of differences, the longer the separation. In this case,
the body louse had separated from the head louse about 70,000 years ago. Which meant, Stoneking
hypothesized, that clothing also dated from about 70,000 years ago.
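The arithmetic behind this kind of estimate is simple enough to sketch in a few lines of Python. The mutation rate and sequence length below are illustrative placeholders, not the values Stoneking's team actually used; the point is only the logic of the molecular clock.

```python
# A minimal sketch of molecular-clock dating. The constants here are
# illustrative assumptions, not figures from Stoneking's study.

def divergence_time(num_differences, sites, rate_per_site_per_year):
    """Estimate years since two lineages split.

    Mutations accumulate along *both* branches after the split,
    so the observed divergence is twice the per-lineage change.
    """
    differences_per_site = num_differences / sites
    return differences_per_site / (2 * rate_per_site_per_year)

# Example: 14 differences across a 1,000-base snippet, with an assumed
# rate of 1e-7 substitutions per site per year.
years = divergence_time(num_differences=14, sites=1000,
                        rate_per_site_per_year=1e-7)
print(f"Estimated divergence: {years:,.0f} years ago")  # 70,000
```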
And not just clothing. As scientists have established, a host of remarkable things occurred to our species
at about that time. It marked a dividing line in our history, one that made us who we are, and pointed us,
for better and worse, toward the world we now have created for ourselves.
Homo sapiens emerged on the planet about 200,000 years ago, researchers believe. From the beginning,
our species looked much as it does today. If some of those long-ago people walked by us on the street
now, we would think they looked and acted somewhat oddly, but not that they weren’t people. But
those anatomically modern humans were not, as anthropologists say, behaviorally modern. Those first
people had no language, no clothing, no art, no religion, nothing but the simplest, unspecialized tools.
They were little more advanced, technologically speaking, than their predecessors—or, for that matter,
modern chimpanzees. (The big exception was fire, but that was first controlled by Homo erectus, one of
our ancestors, a million years ago or more.) Our species had so little capacity for innovation that
archaeologists have found almost no evidence of cultural or social change during our first 100,000 years
of existence. Equally important, for almost all that time these early humans were confined to a single,
small area in the hot, dry savanna of East Africa (and possibly a second, still smaller area in southern
Africa).
But now jump forward 50,000 years. East Africa looks much the same. So do the humans in it—but
suddenly they are drawing and carving images, weaving ropes and baskets, shaping and wielding
specialized tools, burying the dead in formal ceremonies, and perhaps worshipping supernatural beings.
They are wearing clothes—lice-filled clothes, to be sure, but clothes nonetheless. Momentously, they are
using language. And they are dramatically increasing their range. Homo sapiens is exploding across the
planet.
What caused this remarkable change? By geologists’ standards, 50,000 years is an instant, a finger snap,
a rounding error. Nonetheless, most researchers believe that in that flicker of time, favorable mutations
swept through our species, transforming anatomically modern humans into behaviorally modern
humans. The idea is not absurd: in the last 400 years, dog breeders converted village dogs into creatures
that act as differently as foxhounds, border collies, and Labrador retrievers. Fifty millennia, researchers
say, is more than enough to make over a species.
Homo sapiens lacks claws, fangs, or exoskeletal plates. Rather, our unique survival skill is our ability to
innovate, which originates with our species’ singular brain—a three-pound universe of hyperconnected
neural tissue, constantly aswirl with schemes and notions. Hence every hypothesized cause for the
transformation of humankind from anatomically modern to behaviorally modern involves a physical
alteration of the wet gray matter within our skulls. One candidate explanation is that in this period
people developed hybrid mental abilities by interbreeding with Neanderthals. (Some Neanderthal genes
indeed appear to be in our genome, though nobody is yet certain of their function.) Another putative
cause is symbolic language—an invention that may have tapped latent creativity and aggressiveness in
our species. A third is that a mutation might have enabled our brains to alternate between spacing out
on imaginative chains of association and focusing our attention narrowly on the physical world around
us. The former, in this view, allows us to come up with creative new strategies to achieve a goal, whereas
the latter enables us to execute the concrete tactics required by those strategies.
Each of these ideas is fervently advocated by some researchers and fervently attacked by others. What is
clear is that something made over our species between 100,000 and 50,000 years ago—and right in the
middle of that period was Toba.
CHILDREN OF TOBA
About 75,000 years ago, a huge volcano exploded on the island of Sumatra. The biggest blast for several
million years, the eruption created Lake Toba, the world’s biggest crater lake, and ejected the equivalent
of as much as 3,000 cubic kilometers of rock, enough to cover the District of Columbia in a layer of
magma and ash that would reach to the stratosphere. A gigantic plume spread west, enveloping
southern Asia in tephra (rock, ash, and dust). Drifts in Pakistan and India reached as high as six meters.
Smaller tephra beds blanketed the Middle East and East Africa. Great rafts of pumice filled the sea and
drifted almost to Antarctica.
In the long run, the eruption raised Asian soil fertility. In the short term, it was catastrophic. Dust hid the
sun for as much as a decade, plunging the earth into a years-long winter accompanied by widespread
drought. A vegetation collapse was followed by a collapse in the species that depended on vegetation,
followed by a collapse in the species that depended on the species that depended on vegetation.
Temperatures may have remained colder than normal for a thousand years. Orangutans, tigers,
chimpanzees, cheetahs—all were pushed to the verge of extinction.
At about this time, many geneticists believe, Homo sapiens’ numbers shrank dramatically, perhaps to a
few thousand people—the size of a big urban high school. The clearest evidence of this bottleneck is also
its main legacy: humankind’s remarkable genetic uniformity. Countless people have viewed the
differences between races as worth killing for, but compared to other primates—even compared to most
other mammals—human beings are almost indistinguishable, genetically speaking. DNA is made from
exceedingly long chains of “bases.” Typically, about one out of every 2,000 of these “bases” differs
between one person and the next. The equivalent figure for two E. coli bacteria (common residents of the human gut) might be
about one out of twenty. The bacteria in our intestines, that is, have a hundredfold more innate
variability than their hosts—evidence, researchers say, that our species is descended from a small group
of founders.
Uniformity is hardly the only effect of a bottleneck. When a species shrinks in number, mutations can
spread through the entire population with astonishing rapidity. Or genetic variants that may have already
been in existence—arrays of genes that confer better planning skills, for example—can suddenly become
more common, effectively reshaping the species within a few generations as once-unusual traits become
widespread.
Did Toba, as theorists like Richard Dawkins have argued, cause an evolutionary bottleneck that set off the
creation of behaviorally modern people, perhaps by helping previously rare genes—Neanderthal DNA or
an opportune mutation—spread through our species? Or did the volcanic blast simply clear away other
human species that had previously blocked H. sapiens’ expansion? Or was the volcano irrelevant to the
deeper story of human change?
For now, the answers are the subject of careful back-and-forth in refereed journals and heated argument
in faculty lounges. All that is clear is that about the time of Toba, new, behaviorally modern people
charged so fast into the tephra that human footprints appeared in Australia within as few as 10,000
years, perhaps within 4,000 or 5,000. Stay-at-home Homo sapiens 1.0, a wallflower that would never
have interested Lynn Margulis, had been replaced by aggressively expansive Homo sapiens 2.0.
Something happened, for better and worse, and we were born.
One way to illustrate what this upgrade looked like is to consider Solenopsis invicta, the red imported fire
ant. Geneticists believe that S. invicta originated in northern Argentina, an area with many rivers and
frequent floods. The floods wipe out ant nests. Over the millennia, these small, furiously active creatures
have acquired the ability to respond to rising water by coalescing into huge, floating, pullulating balls—
workers on the outside, queen in the center—that drift to the edge of the flood. Once the waters recede,
colonies swarm back into previously flooded land so rapidly that S. invicta actually can use the
devastation to increase its range.
In the 1930s, Solenopsis invicta was transported to the United States, probably in ship ballast, which
often consists of haphazardly loaded soil and gravel. As a teenaged bug enthusiast, Edward O. Wilson,
the famed biologist, spotted the first colonies in the port of Mobile, Alabama. He saw some very happy
fire ants. From the ant’s point of view, it had been dumped into an empty, recently flooded expanse. S.
invicta took off, never looking back.
The initial incursion watched by Wilson was likely just a few thousand individuals—a number small
enough to suggest that random, bottleneck-style genetic change played a role in the species’ subsequent
history in this country. In their Argentine birthplace, fire-ant colonies constantly fight each other,
reducing their numbers and creating space for other types of ant. In the United States, by contrast, the
species forms cooperative supercolonies, linked clusters of nests that can spread for hundreds of miles.
Systematically exploiting the landscape, these supercolonies monopolize every useful resource, wiping
out other ant species along the way—models of zeal and rapacity. Transformed by chance and
opportunity, new-model S. invicta needed just a few decades to conquer most of the southern United
States.
Homo sapiens did something similar in the wake of Toba. For hundreds of thousands of years, our
species had been restricted to East Africa (and, possibly, a similar area in the south). Now, abruptly, new-
model Homo sapiens were racing across the continents like so many imported fire ants. The difference
between humans and fire ants is that fire ants specialize in disturbed habitats. Humans, too, specialize in
disturbed habitats—but we do the disturbing.
As a student at the University of Moscow in the 1920s, Georgii Gause spent years trying—and failing—to
drum up support from the Rockefeller Foundation, then the most prominent funding source for non-
American scientists who wished to work in the United States. Hoping to dazzle the foundation, Gause
decided to perform some nifty experiments and describe the results in his grant application.
By today’s standards, his methodology was simplicity itself. Gause placed half a gram of oatmeal in one
hundred cubic centimeters of water, boiled the results for ten minutes to create a broth, strained the
liquid portion of the broth into a container, diluted the mixture by adding water, and then decanted the
contents into small, flat-bottomed test tubes. Into each he dripped five Paramecium caudatum or
Stylonychia mytilus, both single-celled protozoans, one species per tube. Each of Gause’s test tubes was
a pocket ecosystem, a food web with a single node. He stored the tubes in warm places for a week and
observed the results. He set down his conclusions in a 163-page book, The Struggle for Existence,
published in 1934.
Today The Struggle for Existence is recognized as a scientific landmark, one of the first successful
marriages of theory and experiment in ecology. But the book was not enough to get Gause a fellowship;
the Rockefeller Foundation turned down the twenty-four-year-old Soviet student as insufficiently
eminent. Gause could not visit the United States for another twenty years, by which time he had indeed
become eminent, but as an antibiotics researcher.
What Gause saw in his test tubes is often depicted in a graph, time on the horizontal axis, the number of
protozoa on the vertical. The line on the graph is a distorted bell curve, with its left side twisted and
stretched into a kind of flattened S. At first the number of protozoans grows slowly, and the graph line
slowly ascends to the right. But then the line hits an inflection point, and suddenly rockets upward—a
frenzy of exponential growth. The mad rise continues until the organism begins to run out of food, at
which point there is a second inflection point, and the growth curve levels off again as the protozoans begin to
die. Eventually the line descends, and the population falls toward zero.
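The shape of that curve is easy to reproduce. The toy simulation below is not Gause's actual model, just a sketch under simple assumptions: a fixed food supply, births that depend on how well fed the population is, and a steady trickle of deaths. Run it and the population traces a similar rise, crest, and slow collapse.

```python
# Toy version of Gause's curve: slow start, exponential rise, a brief
# plateau as food runs short, then decline. Parameters are arbitrary;
# only the shape of the curve matters.

def gause_curve(steps=200, food=1_000_000, pop=5.0,
                birth_rate=0.2, death_rate=0.05, appetite=1.0):
    history = []
    for _ in range(steps):
        demand = pop * appetite                 # food the population needs
        meals = min(food, demand)               # food it actually gets
        fed_fraction = meals / demand if pop else 0.0
        food -= meals
        births = birth_rate * pop * fed_fraction
        deaths = death_rate * pop * (1.0 - fed_fraction) + 0.01 * pop
        pop = max(pop + births - deaths, 0.0)
        history.append(pop)
    return history

curve = gause_curve()
for t in range(0, 200, 20):
    print(t, round(curve[t]))   # rises past 100,000, then falls toward zero
```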
Years ago I watched Lynn Margulis, one of Gause’s successors, demonstrate these conclusions to a class
at the University of Massachusetts with a time-lapse video of Proteus vulgaris, a bacterium that lives in
the gastrointestinal tract. To humans, she said, P. vulgaris is mainly notable as a cause of urinary-tract
infections. Left alone, it divides about every fifteen minutes. Margulis switched on the projector.
Onscreen was a small, wobbly bubble—P. vulgaris—in a shallow, circular glass container: a petri dish. The
class gasped. The cells in the time-lapse video seemed to shiver and boil, doubling in number every few
seconds, colonies exploding out until the mass of bacteria filled the screen. In just thirty-six hours, she
said, this single bacterium could cover the entire planet in a foot-deep layer of single-celled ooze. Twelve
hours after that, it would create a living ball of bacteria the size of the earth.
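Margulis's classroom numbers can be checked on the back of an envelope. The sketch below assumes a cell volume of about one cubic micrometer and a fifteen-minute doubling time; the exact hours shift with those assumptions, but the conclusion, a planet buried within a day or two, does not.

```python
import math

# Back-of-envelope check of the classroom arithmetic. The cell volume
# and doubling time are rough assumptions, not measured values.

CELL_VOLUME = 1e-18        # m^3, about one cubic micrometer per cell
DOUBLING_HOURS = 0.25      # one division every fifteen minutes
EARTH_SURFACE = 5.1e14     # m^2
EARTH_VOLUME = 1.083e21    # m^3
FOOT = 0.3048              # m

def hours_to_fill(volume_m3):
    cells = volume_m3 / CELL_VOLUME          # cells needed to fill the volume
    return math.log2(cells) * DOUBLING_HOURS # doublings needed, times 15 min

layer = hours_to_fill(EARTH_SURFACE * FOOT)  # foot-deep planetary layer
ball = hours_to_fill(EARTH_VOLUME)           # earth-sized ball of cells
print(f"{layer:.0f} h to a foot-deep layer, "
      f"{ball - layer:.0f} h more to an earth-sized ball")
# About 27 h and 6 h more with these constants -- the same order of
# magnitude as the figures Margulis quoted.
```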
Such a calamity never happens, because competing organisms and lack of resources prevent the
overwhelming majority of P. vulgaris from reproducing. This, Margulis said, is natural selection, Darwin’s
great insight. All living creatures have the same purpose: to make more of themselves, ensuring their
biological future by the only means available. Natural selection stands in the way of this goal. It prunes
back almost all species, restricting their numbers and confining their range. In the human body, P.
vulgaris is checked by the size of its habitat (portions of the human gut), the limits to its supply of
nourishment (food proteins), and other, competing organisms. Thus constrained, its population remains
roughly steady.
In the petri dish, by contrast, competition is absent; nutrients and habitat seem limitless, at least at first.
The bacterium hits the first inflection point and rockets up the left side of the curve, swamping the petri
dish in a reproductive frenzy. But then its colonies slam into the second inflection point: the edge of the
dish. When the dish’s nutrient supply is exhausted, P. vulgaris experiences a miniapocalypse.
By luck or superior adaptation, a few species manage to escape their limits, at least for a while. Nature’s
success stories, they are like Gause’s protozoans; the world is their petri dish. Their populations grow
exponentially; they take over large areas, overwhelming their environment as if no force opposed them.
Then they annihilate themselves, drowning in their own wastes or starving from lack of food.
To someone like Margulis, Homo sapiens looks like one of these briefly fortunate species.
No more than a few hundred people initially migrated from Africa, if geneticists are correct. But they
emerged into landscapes that by today’s standards were as rich as Eden. Cool mountains, tropical
wetlands, lush forests—all were teeming with food. Fish in the sea, birds in the air, fruit on the trees:
breakfast was everywhere. People moved in.
Despite our territorial expansion, though, humans were still only in the initial stages of Gause’s oddly
shaped curve. Ten thousand years ago, most demographers believe, we numbered barely 5 million,
about one human being for every hundred square kilometers of the earth’s surface. Homo sapiens
was a scarcely noticeable dusting on the surface of a planet dominated by microbes. Nevertheless, at
about this time—10,000 years ago, give or take a millennium—humankind finally began to approach the
first inflection point. Our species was inventing agriculture.
The wild ancestors of cereal crops like wheat, barley, rice, and sorghum have been part of the human
diet for almost as long as there have been humans to eat them. (The earliest evidence comes from
Mozambique, where researchers found tiny bits of 105,000-year-old sorghum on ancient scrapers and
grinders.) In some cases people may have watched over patches of wild grain, returning to them year
after year. Yet despite the effort and care the plants were not domesticated. As botanists say, wild cereals
“shatter”—individual grain kernels fall off as they ripen, scattering grain haphazardly, making it
impossible to harvest the plants systematically. Only when unknown geniuses discovered naturally
mutated grain plants that did not shatter—and purposefully selected, protected, and cultivated them—
did true agriculture begin. Planting great expanses of those mutated crops, first in southern Turkey, later
in half a dozen other places, early farmers created landscapes that, so to speak, waited for hands to
harvest them.
Farming converted most of the habitable world into a petri dish. Foragers manipulated their
environment with fire, burning areas to kill insects and encourage the growth of useful species—plants
we liked to eat, plants that attracted the other creatures we liked to eat. Nonetheless, their diets were
largely restricted to what nature happened to provide in any given time and season. Agriculture gave
humanity the whip hand. Instead of natural ecosystems with their haphazard mix of species (so many
useless organisms guzzling up resources!), farms are taut, disciplined communities conceived and
dedicated to the maintenance of a single species: us.
Before agriculture, the Ukraine, American Midwest, and lower Yangzi were barely hospitable food
deserts, sparsely inhabited landscapes of insects and grass; they became breadbaskets as people scythed
away suites of species that used soil and water we wanted to dominate and replaced them with wheat,
rice, and maize (corn). To one of Margulis’s beloved bacteria, a petri dish is a uniform expanse of
nutrients, all of which it can seize and consume. For Homo sapiens, agriculture transformed the planet
into something similar.
As in a time-lapse movie, we divided and multiplied across the newly opened land. It had taken Homo
sapiens 2.0, behaviorally modern humans, not even 50,000 years to reach the farthest corners of the
globe. Homo sapiens 2.0.A—A for agriculture—took a tenth of that time to conquer the planet.
As any biologist would predict, success led to an increase in human numbers. Homo sapiens rocketed
around the elbow of the first inflection point in the seventeenth and eighteenth centuries, when
American crops like potatoes, sweet potatoes, and maize were introduced to the rest of the world.
Traditional Eurasian and African cereals—wheat, rice, millet, and sorghum, for example—produce their
grain atop thin stalks. Basic physics suggests that plants with this design will fatally topple if the grain
gets too heavy, which means that farmers can actually be punished if they have an extra-bounteous
harvest. By contrast, potatoes and sweet potatoes grow underground, which means that yields are not
limited by the plant’s architecture. Wheat farmers in Edinburgh and rice farmers in Edo alike discovered
they could harvest four times as much dry food matter from an acre of tubers as they could from an
acre of cereals. Maize, too, was a winner. Compared to other cereals, it has an extra-thick stalk and a
different, more productive type of photosynthesis. Taken together, these immigrant crops vastly
increased the food supply in Europe, Asia, and Africa, which in turn helped increase the supply of
Europeans, Asians, and Africans. The population boom had begun.
Numbers kept rising in the nineteenth and twentieth centuries, after a German chemist, Justus von
Liebig, discovered that plant growth was limited by the supply of nitrogen. Without nitrogen, neither
plants nor the mammals that eat plants can create proteins, or for that matter the DNA and RNA that
direct their production. Pure nitrogen gas (N2) is plentiful in the air but plants are unable to absorb it,
because the two nitrogen atoms in N2 are welded so tightly together that plants cannot split them apart
for use. Instead, plants take in nitrogen only when it is combined with hydrogen, oxygen, and other
elements. To restore exhausted soil, traditional farmers grew peas, beans, lentils, and other pulses. (They
never knew why these “green manures” replenished the land. Today we know that their roots contain
special bacteria that convert useless N2 into “bio-available” nitrogen compounds.) After Liebig, European
and American growers replaced those crops with high-intensity fertilizer—nitrogen-rich guano from Peru
at first, then nitrates from mines in Chile. Yields soared. But supplies were much more limited than
farmers liked. So intense was the competition for fertilizer that a guano war erupted in 1879, engulfing
much of western South America. Almost 3,000 people died.
Two more German chemists, Fritz Haber and Carl Bosch, came to the rescue, discovering the key steps to
making synthetic fertilizer from fossil fuels. (The process involves combining nitrogen gas and hydrogen
from natural gas into ammonia, which is then used to create nitrogenous compounds usable by plants.)
Haber and Bosch are not nearly as well known as they should be; their discovery, the Haber-Bosch
process, has literally changed the chemical composition of the earth, a feat previously reserved for
microorganisms. Farmers have injected so much synthetic fertilizer into the soil that soil and
groundwater nitrogen levels have risen worldwide. Today, roughly a third of all the protein (animal and
vegetable) consumed by humankind is derived from synthetic nitrogen fertilizer. Another way of putting
this is to say that Haber and Bosch enabled Homo sapiens to extract about 2 billion people’s worth of
food from the same amount of available land.
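For readers who want the chemistry, the core reaction is standard textbook material rather than a detail from the article: nitrogen and hydrogen combine over an iron catalyst at high temperature and pressure.

```latex
% Haber-Bosch synthesis (textbook chemistry): nitrogen and hydrogen
% combine over an iron catalyst at high temperature and pressure.
N_2 + 3\,H_2 \longrightarrow 2\,NH_3
```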
The improved wheat, rice, and (to a lesser extent) maize varieties developed by plant breeders in the
1950s and 1960s are often said to have prevented another billion deaths. Antibiotics, vaccines, and
water-treatment plants also saved lives by pushing back humankind’s bacterial, viral, and fungal
enemies. With almost no surviving biological competition, humankind had ever more unhindered access
to the planetary petri dish: in the past two hundred years, the number of humans walking the planet
ballooned from 1 to 7 billion, with a few billion more expected in coming decades.
Rocketing up the growth curve, human beings “now appropriate nearly 40% . . . of potential terrestrial
productivity.” This figure dates from 1986—a famous estimate by a team of Stanford biologists. Ten years
later, a second Stanford team calculated that the “fraction of the land’s biological production that is used
or dominated” by our species had risen to as much as 50 percent. In 2000, the chemist Paul Crutzen gave
a name to our time: the “Anthropocene,” the era in which Homo sapiens became a force operating on a
planetary scale. That year, half of the world’s accessible fresh water was consumed by human beings.
Lynn Margulis, it seems safe to say, would have scoffed at these assessments of human domination over
the natural world, which, in every case I know of, do not take into account the enormous impact of the
microworld. But she would not have disputed the central idea: Homo sapiens has become a successful
species, and is growing accordingly.
If we follow Gause’s pattern, growth will continue at a delirious speed until we hit the second inflection
point. At that time we will have exhausted the resources of the global petri dish, or effectively made the
atmosphere toxic with our carbon-dioxide waste, or both. After that, human life will be, briefly, a
Hobbesian nightmare, the living overwhelmed by the dead. When the king falls, so do his minions; it is
possible that our fall might also take down most mammals and many plants. Possibly sooner, quite likely
later, in this scenario, the earth will again be a choir of bacteria, fungi, and insects, as it has been through
most of its history.
It would be foolish to expect anything else, Margulis thought. More than that, it would be unnatural.
AS PLASTIC AS CANBY
In The Phantom Tollbooth, Norton Juster’s classic, pun-filled adventure tale, the young Milo and his
faithful companions unexpectedly find themselves transported to a bleak, mysterious island.
Encountering a man in a tweed jacket and beanie, Milo asks him where they are. The man replies by
asking if they know who he is—the man is, apparently, confused on the subject. Milo and his friends
confer, then ask if he can describe himself.
“Yes, indeed,” the man replied happily. “I’m as tall as can be”—and he grew straight up until all that
could be seen of him were his shoes and stockings—“and I’m as short as can be”—and he shrank down
to the size of a pebble. “I’m as generous as can be,” he said, handing each of them a large red apple,
“and I’m as selfish as can be,” he snarled, grabbing them back again.
In short order, the companions learn that the man is as strong as can be, weak as can be, smart as can
be, stupid as can be, graceful as can be, clumsy as—you get the picture. “Is that any help to you?” he
asks. Again, Milo and his friends confer, and realize that the answer is actually quite simple: a man who is everything “as can be” must be Canby.
“Of course, yes, of course,” the man shouted. “Why didn’t I think of that? I’m as happy as can be.”
With Canby, Juster presumably meant to mock a certain kind of babyish, uncommitted man-child. But I
can’t help thinking of poor old Canby as exemplifying one of humankind’s greatest attributes: behavioral
plasticity. The term was coined in 1890 by the pioneering psychologist William James, who defined it as
“the possession of a structure weak enough to yield to an influence, but strong enough not to yield all at
once.” Behavioral plasticity, a defining feature of Homo sapiens’ big brain, means that humans can
change their habits; almost as a matter of course, people change careers, quit smoking or take up
vegetarianism, convert to new religions, and migrate to distant lands where they must learn strange
languages. This plasticity, this Canby-hood, is the hallmark of our transformation from anatomically
modern Homo sapiens to behaviorally modern Homo sapiens—and the reason, perhaps, we were able
to survive when Toba reconfigured the landscape.
Other creatures are much less flexible. Like apartment-dwelling cats that compulsively hide in the closet
when visitors arrive, they have limited capacity to welcome new phenomena and change in response.
Human beings, by contrast, are so exceptionally plastic that vast swaths of neuroscience are devoted to
trying to explain how this could come about. (Nobody knows for certain, but some researchers now
think that particular genes give their possessors a heightened, inborn awareness of their environment,
which can lead both to useless, neurotic sensitivity and greater ability to detect and adapt to new
situations.)
Plasticity in individuals is mirrored by plasticity on a societal level. The caste system in social species like
honeybees is elaborate and finely tuned but fixed, as if in amber, in the loops of their DNA. Some
leafcutter ants are said to have, next to human beings, the biggest and most complex societies on earth,
with elaborately coded behavior that reaches from disposal of the dead to complex agricultural systems.
Housing millions of individuals in inconceivably ramose subterranean networks, leafcutter colonies are
“Earth’s ultimate superorganisms,” Edward O. Wilson has written. But they are incapable of fundamental
change. The centrality and authority of the queen cannot be challenged; the tiny minority of males, used
only to inseminate queens, will never acquire new responsibilities.
Human societies are far more varied than their insect cousins, of course. But the true difference is their
plasticity. It is why humankind, a species of Canbys, has been able to move into every corner of the
earth, and to control what we find there. Our ability to change ourselves to extract resources from our
surroundings with ever-increasing efficiency is what has made Homo sapiens a successful species. It is
our greatest blessing.
DISCOUNT RATES
By 2050, demographers predict, as many as 10 billion human beings will walk the earth, 3 billion more
than today. Not only will more people exist than ever before, they will be richer than ever before. In the
last three decades hundreds of millions in China, India, and other formerly poor places have lifted
themselves from destitution—arguably the most important, and certainly the most heartening,
accomplishment of our time. Yet, like all human enterprises, this great success will pose great difficulties.
In the past, rising incomes have invariably prompted rising demand for goods and services. Billions more
jobs, homes, cars, fancy electronics—these are things the newly prosperous will want. (Why shouldn’t
they?) But the greatest challenge may be the most basic of all: feeding these extra mouths. To
agronomists, the prospect is sobering. The newly affluent will not want their ancestors’ gruel. Instead
they will ask for pork and beef and lamb. Salmon will sizzle on their outdoor grills. In winter, they will
want strawberries, like people in New York and London, and clean bibb lettuce from hydroponic gardens.
All of these, each and every one, require vastly more resources to produce than simple peasant
agriculture. Already 35 percent of the world’s grain harvest is used to feed livestock. The process is
terribly inefficient: between seven and ten kilograms of grain are required to produce one kilogram of
beef. Not only will the world’s farmers have to produce enough wheat and maize to feed 3 billion more
people, they will have to produce enough to give them all hamburgers and steaks. Given present
patterns of food consumption, economists believe, we will need to produce about 40 percent more grain
in 2050 than we do today.
How can we provide these things for all these new people? That is only part of the question. The full
question is: How can we provide them without wrecking the natural systems on which all depend?
Scientists, activists, and politicians have proposed many solutions, each from a different ideological and
moral perspective. Some argue that we must drastically throttle industrial civilization. (Stop energy-
intensive, chemical-based farming today! Eliminate fossil fuels to halt climate change!) Others claim that
only intense exploitation of scientific knowledge can save us. (Plant super-productive, genetically
modified crops now! Switch to nuclear power to halt climate change!) No matter which course is chosen,
though, it will require radical, large-scale transformations in the human enterprise—a daunting,
hideously expensive task.
Worse, the ship is too large to turn quickly. The world’s food supply cannot be decoupled rapidly from
industrial agriculture, if that is seen as the answer. Aquifers cannot be recharged with a snap of the
fingers. If the high-tech route is chosen, genetically modified crops cannot be bred and tested overnight.
Similarly, carbon-sequestration techniques and nuclear power plants cannot be deployed instantly.
Changes must be planned and executed decades in advance of the usual signals of crisis, but that’s like
asking healthy, happy sixteen-year-olds to write living wills.
Not only is the task daunting, it’s strange. In the name of nature, we are asking human beings to do
something deeply unnatural, something no other species has ever done or could ever do: constrain its
own growth (at least in some ways). Zebra mussels in the Great Lakes, brown tree snakes in Guam, water
hyacinth in African rivers, gypsy moths in the northeastern U.S., rabbits in Australia, Burmese pythons in
Florida—all these successful species have overrun their environments, heedlessly wiping out other
creatures. Like Gause’s protozoans, they are racing to find the edges of their petri dish. Not one has
voluntarily turned back. Now we are asking Homo sapiens to fence itself in.
What a peculiar thing to ask! Economists like to talk about the “discount rate,” which is their term for
preferring a bird in hand today over two in the bush tomorrow. The term sums up part of our human
nature as well. Evolving in small, constantly moving bands, we are as hard-wired to focus on the
immediate and local over the long-term and faraway as we are to prefer parklike savannas to deep dark
forests. Thus, we care more about the broken stoplight up the street today than conditions next year in
Croatia, Cambodia, or the Congo. Rightly so, evolutionists point out: Americans are far more likely to be
killed at that stoplight today than in the Congo next year. Yet here we are asking governments to focus
on potential planetary boundaries that may not be reached for decades. Given the discount rate, nothing
could be more understandable than the U.S. Congress’s failure to grapple with, say, climate change. From
this perspective, is there any reason to imagine that Homo sapiens, unlike mussels, snakes, and moths,
can exempt itself from the natural fate of all successful species?
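In its textbook form, discounting is a one-line formula: a benefit t years away is divided by (1 + r)^t, where r is the discount rate. A minimal sketch, with an arbitrary 5 percent annual rate, shows how quickly distant stakes shrink.

```python
# A minimal illustration of discounting, with an arbitrary 5 percent
# annual rate. A benefit (or harm) t years away is valued today at
# 1 / (1 + r)**t of its face value.

def present_value(future_value, rate, years):
    return future_value / (1 + rate) ** years

for years in (1, 10, 50, 100):
    pv = present_value(100.0, rate=0.05, years=years)
    print(f"$100 of benefit {years:3d} years out is worth ${pv:6.2f} today")

# At 5 percent, $100 a century from now is worth about 76 cents today,
# one reason distant planetary boundaries barely register against this
# year's broken stoplight.
```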
To biologists like Margulis, who spend their careers arguing that humans are simply part of the natural
order, the answer should be clear. All life is similar at base. All species seek without pause to make more
of themselves—that is their goal. By multiplying till we reach our maximum possible numbers, even as
we take out much of the planet, we are fulfilling our destiny.
From this vantage, the answer to the question whether we are doomed to destroy ourselves is yes. It
should be obvious.
HARA HACHI BU
When I imagine the profound social transformation necessary to avoid calamity, I think about Robinson
Crusoe, hero of Daniel Defoe’s famous novel. Defoe clearly intended his hero to be an exemplary man.
Shipwrecked on an uninhabited island off Venezuela in 1659, Crusoe is an impressive example of
behavioral plasticity. During his twenty-seven-year exile he learns to catch fish, hunt rabbits and turtles,
tame and pasture island goats, prune and support local citrus trees, and create “plantations” of barley
and rice from seeds that he salvaged from the wreck. (Defoe apparently didn’t know that citrus and
goats were not native to the Americas and thus Crusoe probably wouldn’t have found them there.)
Rescue comes at last in the form of a shipful of ragged mutineers, who plan to maroon their captain on
the supposedly empty island. Crusoe helps the captain recapture his ship and offers the defeated
mutineers a choice: trial in England or permanent banishment to the island. All choose the latter. Crusoe
has harnessed so much of the island’s productive power to human use that even a gaggle of inept
seamen can survive there in comfort.
To get Crusoe on his unlucky voyage, Defoe made him an officer on a slave ship, transporting captured
Africans to South America. Today, no writer would make a slave seller the admirable hero of a novel. But
in 1719, when Defoe published Robinson Crusoe, no readers said boo about Crusoe’s occupation,
because slavery was the norm from one end of the world to another. Rules and names differed from
place to place, but coerced labor was everywhere, building roads, serving aristocrats, and fighting wars.
Slaves teemed in the Ottoman Empire, Mughal India, and Ming China. Unfree hands were less common
in continental Europe, but Portugal, Spain, France, England, and the Netherlands happily exploited slaves
by the million in their American colonies. Few protests were heard; slavery had been part of the fabric of
life since the code of Hammurabi.
Then, in the space of a few decades in the nineteenth century, slavery, one of humankind’s most
enduring institutions, almost vanished.
The sheer implausibility of this change is staggering. In 1860, slaves were, collectively, the single most
valuable economic asset in the United States, worth an estimated $3 billion, a vast sum in those days
(equivalent, as a share of the national economy, to about $10 trillion today). Rather than investing in factories like northern entrepreneurs,
southern businessmen had sunk their capital into slaves. And from their perspective, correctly so—
masses of enchained men and women had made the region politically powerful, and gave social status to
an entire class of poor whites. Slavery was the foundation of the social order. It was, thundered John C.
Calhoun, a former senator, secretary of state, and vice president, “instead of an evil, a good—a positive
good.” Yet just a few years after Calhoun spoke, part of the United States set out to destroy this
institution, wrecking much of the national economy and killing half a million citizens along the way.
Incredibly, the turn against slavery was as universal as slavery itself. Great Britain, the world’s biggest
human trafficker, closed down its slave operations in 1807, though they were among the nation’s most
profitable industries. The Netherlands, France, Spain, and Portugal soon followed. Like stars winking out
at the approach of dawn, cultures across the globe removed themselves from the previously universal
exchange of human cargo. Slavery still exists here and there, but in no society anywhere is it formally
accepted as part of the social fabric.
Historians have provided many reasons for this extraordinary transition. But one of the most important is
that abolitionists had convinced huge numbers of ordinary people around the world that slavery was a
moral disaster. An institution fundamental to human society for millennia was swiftly dismantled by
ideas and a call to action, loudly repeated.
In the last few centuries, such profound changes have occurred repeatedly. Since the beginning of our
species, for instance, every known society has been based on the domination of women by men.
(Rumors of past matriarchal societies abound, but few archaeologists believe them.) In the long view,
women’s lack of liberty has been as central to the human enterprise as gravitation is to the celestial
order. The degree of suppression varied from time to time and place to place, but women never had an
equal voice; indeed, some evidence exists that the penalty for possession of two X chromosomes
increased with technological progress. Even as the industrial North and agricultural South warred over
the treatment of Africans, they regarded women identically: in neither half of the nation could they
attend college, have a bank account, or own property. Equally confining were women’s lives in Europe,
Asia, and Africa. Nowadays women are the majority of U.S. college students, the majority of the
workforce, and the majority of voters. Again, historians assign multiple causes to this shift in the human
condition, rapid in time, staggering in scope. But one of the most important was the power of ideas—the
voices, actions, and examples of suffragists, who through decades of ridicule and harassment pressed
their case. In recent years something similar seems to have occurred with gay rights: first a few lonely
advocates, censured and mocked; then victories in the social and legal sphere; finally, perhaps, a slow
movement to equality.
Less well known, but equally profound: the decline in violence. Foraging societies waged war less
brutally than industrial societies, but more frequently. Typically, archaeologists believe, about a quarter
of all hunters and gatherers were killed by their fellows. Violence declined somewhat as humans
gathered themselves into states and empires, but was still a constant presence. When Athens was at its
height in the fifth and fourth centuries BC, it was ever at war: against Sparta (First and Second
Peloponnesian Wars, Corinthian War); against Persia (Greco-Persian Wars, Wars of the Delian League);
against Aegina (Aeginetan War); against Macedon (Olynthian War); against Samos (Samian War); against
Chios, Rhodes, and Cos (Social War).
In this respect, classical Greece was nothing special—look at the ghastly histories of China, sub-Saharan
Africa, or Mesoamerica. Similarly, Europe’s wars were so fast and furious that historians
simply gather them into catchall titles like the Hundred Years’ War, followed by the shorter but even
more destructive Thirty Years’ War. And even as Europeans and their descendants paved the way toward
today’s concept of universal human rights by creating documents like the Bill of Rights and the
Declaration of the Rights of Man and of the Citizen, Europe remained so mired in combat that it fought
two conflicts of such massive scale and reach they became known as “world” wars.
Since the Second World War, however, rates of violent death have fallen to the lowest levels in known
history. Today, the average person is far less likely to be slain by another member of the species than
ever before—an extraordinary transformation that has occurred, almost unheralded, in the lifetime of
many of the people reading this article. As the political scientist Joshua Goldstein has written, “we are
winning the war on war.” Again, there are multiple causes. But Goldstein, probably the leading scholar in
this field, argues that the most important is the emergence of the United Nations and other
transnational bodies, an expression of the ideas of peace activists earlier in the last century.
As a relatively young species, we have an adolescent propensity to make a mess: we pollute the air we
breathe and the water we drink, and appear stalled in an age of carbon dumping and nuclear
experimentation that is putting countless species at risk, including our own. But we are making
undeniable progress nonetheless. No European in 1800 could have imagined that in 2000 Europe would
have no legal slavery, women would be able to vote, and gay people would be able to marry. No one
could have guessed a continent that had been tearing itself apart for centuries would be free of armed
conflict, even amid terrible economic times. Given this record, even Lynn Margulis might pause (maybe).
Preventing Homo sapiens from destroying itself à la Gause would require a still greater transformation—
behavioral plasticity of the highest order—because we would be pushing against biological nature itself.
The Japanese have an expression, hara hachi bu, which means, roughly speaking, “belly 80 percent full.”
Hara hachi bu is shorthand for an ancient injunction to stop eating before feeling full. Nutritionally, the
command makes a great deal of sense. When people eat, their stomachs produce peptides that signal
fullness to the nervous system. Unfortunately, the mechanism is so slow that eaters frequently perceive
satiety only after they have consumed too much—hence the all-too-common condition of feeling
bloated or sick from overeating. Japan—actually, the Japanese island of Okinawa—is the only place on
earth where large numbers of people are known to restrict their own calorie intake systematically and
routinely. Some researchers claim that hara hachi bu is responsible for Okinawans’ notoriously long life
spans. But I think of it as a metaphor for stopping before the second inflection point, voluntarily
forswearing short-term consumption to obtain a long-term benefit.
I can imagine Margulis’s response: You’re imagining our species as some sort of big-brained,
hyperrational, benefit-cost-calculating computer! A better analogy is the bacteria at our feet! Still,
Margulis would be the first to agree that removing the shackles from women and slaves has begun to
unleash the suppressed talents of two-thirds of the human race. Drastically reducing violence has
prevented the waste of countless lives and staggering amounts of resources. Is it really impossible to
believe that we would use those talents and those resources to draw back before the abyss?
Our record of success is not that long. In any case, past successes are no guarantee of the future. But it is
terrible to suppose that we could get so many other things right and get this one wrong. To have the
imagination to see our potential end, but not have the imagination to avoid it. To send humankind to the
moon but fail to pay attention to the earth. To have the potential but to be unable to use it—to be, in the
end, no different from the protozoa in the petri dish. It would be evidence that Lynn Margulis’s most
dismissive beliefs had been right after all. For all our speed and voraciousness, our changeable sparkle
and flash, we would be, at last count, not an especially interesting species.