Read an Excerpt
What Would Darwin Write Today?
Imagine we somehow plucked Charles Darwin out of his nineteenth-century home and placed him smack in the middle of modern-day Trafalgar Square, London. What would he make of it?
After the initial brutal disorientation, this dissector of human strengths and foibles would start to observe: How can everything be so clean and orderly? Where are the soot, horse dung, squalor, gruel, stench, and ailments? No fleas? And where are all the Oliver Twist–like urchins? Everyone seems so well fed—in fact, way overfed. People look familiar but also very different; where did all these tall folks come from, and why are so many large but weak? Why are there so few kids and so many old people—and why do the white-haired folk appear so healthy? He would marvel at the variety of foods, halal carts, taco trucks, schnitzel stands, and rolling organic juice bars. But he would also wonder about the signs that read PLEASE INFORM YOUR SERVER OF ANY ALLERGIES.1 Then he would focus closer: Everyone looks clean and has teeth. Children can read and have free time to play . . . but why are adults running through Trafalgar Square wearing short pants and neon shoes? Why don’t some of the old people have wrinkles? Why are many kids and elderly drawing deep breaths from inhalers?
When you know what to look for, the symptoms of rapid evolution are all around us; in just 150 years the human species has changed. We have redesigned our world and our bodies while at the same time becoming an ever more domesticated and smarter species. But taking control of our own evolution can also generate surprises: explosions in autism, allergies, obesity, and a host of other changes, not all of them positive.
Until relatively recently only a minority of humans lived in the kind of poverty depicted in the works of Darwin’s contemporary, Charles Dickens, in cities where destitution, sickness, filth, and early death were commonplace. The majority of people lived much closer to a state of nature in the brutal world Darwin described in his most famous book, On the Origin of Species by Means of Natural Selection, where nature was the ruler and driver of evolution and two key forces determined what survived and thrived on this planet: natural selection and random mutation.2
To recap Bio 101, natural selection means that you see only those species* that adapted to a particular environment well enough to reproduce generation after generation. In other words, “survival of the fittest.” If a given group of creatures does not find enough food, fend off predators and diseases, and find suitable and healthy mates, then that species goes extinct. Random mutation means that the core genetic code, the DNA that underlies our biological traits, slowly varies through chance from generation to generation.3 Usually these changes are benign and unnoticeable. Occasionally significant changes help individuals reproduce and survive better than their ancestors. But sometimes they can lead to horrible genetic diseases.
If Darwin were to write A Tale of Two Centuries, to play again on his famous contemporary, he would describe vast and unrelenting change. For nearly four billion years, nature selected what lived and died. Life forms adapted by mutating randomly so that at least a few specimens sometimes hit the jackpot and survived as environments altered, pathogens evolved, new predators emerged, and food sources changed.
The goal of this book is not to argue that Darwin was wrong; just that he is not as right anymore. Over the past century, as our species grew by billions, concentrated in cities, smartened, and domesticated itself and its surroundings, we became the fundamental driver of what lives and dies. This change is so radical that if Darwin were alive today, he would likely revise a significant part of his great works, because the basic logic of evolution has shifted away from capital-n Nature toward two new core drivers:
Darwinian logic and nature continue to define and drive the evolution of all life in places where humans do not yet impose their will—in those spaces untouched by cities, farms, parklands, and vacation homes. But these once vast tracts are now rare. Half the landmass on Earth is now covered by what humans want, not by what would naturally grow without the intervention of our species. Oceans, rivers, and lakes are depleted. In just a few centuries, we have terraformed, fertilized, fenced, seeded, and irrigated enormous sections of what was once forest, savannah, desert, and tundra to accommodate our plants, our animals, our wishes. This is unnatural selection.
In the past couple of decades, the pace of evolution ramped up fast as humans invented ways to deliberately redesign the genetic code of living organisms. We developed powerful, cheap, and rapid ways to read, copy, and edit the genetic code of bacteria, viruses, plants, animals, and humans. When we engineer cancer-prone mice and long-lived worms, we alter not just how many of a species exist but also the essential nature of the species. It is this new phase, nonrandom mutation, in which we replace random, slow evolution with rapid, deliberate, intelligent changes, that would shock Darwin.
What thrives on Earth now depends on an evolutionary seesaw. On one side sits the full weight of nature, of the traditional forces of evolution, natural selection and random mutation, leading to extraordinary diversity, continuous extinction, and speciation. On the other side sit the wishes of a single species, H. sapiens. Darwin wrote extensively on what happens when plants and animals are domesticated and redesigned by humans, but he did not make the logical leap that should these trends continue, humans would end up extending their influence to encompass the planet and themselves, matching and then exceeding the forces of natural selection and random mutation.4
While many instinctively associate the word “unnatural” with bad stuff, our transition away from Nature toward a more gentle and weakened “nature” has been spectacularly successful and beneficial to humans. Life expectancy in the UK in 1856 was 40.4 years.5 Now, because “we are the only species to have put a halt to natural selection, of its own free will,” we live almost twice as long and more than 99 percent of our babies and children survive in the developed world.6 In domesticating plants, animals, environments, and ourselves we increased not just survival but also our quality of life.
Along this journey we acquired an awesome responsibility; as we select and design what lives and dies on this planet, we drive evolution. Our ability to read, copy, and rewrite life code now accelerates faster than Moore’s law for improvements in computers, making it ever faster and cheaper to redesign flowers, develop exotic foods, build bacteria to manufacture therapeutics, and design animals that serve and entertain.
The first part of this book showcases various symptoms, and potential causes, of rapid human-driven evolution. The second section explains the various ways we have of altering life forms and how to quickly change any species. In the third section we cover what it means to read and write life code to our own specs, and what happens as we begin to edit life forms on a grand scale. Then we discuss how we might choose to evolve ourselves and some of the ethical implications of these choices. Finally we explore using our newfound powers to revive and restore the extinct, speciate ourselves, design synthetic life forms, perhaps even leave the planet.
So what does it mean to leave a part of Darwin’s theory behind and enter a new paradigm of human-driven evolution? It means we increasingly bend nature to our own desires. Many of the sick or weak are no longer relentlessly culled by natural selection. This transition, from nature’s operating and guiding life forms to our doing so, may just be the single greatest achievement and challenge, so far, for the human species. As a result we can begin to answer questions like: How do we want to design life? What do we want humans to look like in a few hundred years? Do we want other hominin species walking around? What should we do with all the synthetic life forms we are creating? While there are certainly many wrong and perilous answers to each of these questions, getting it right potentially means continuing to improve the overall human condition, leading to better health, longer life span, and greater control over our daily lives. There is already much discovery to be proud of, and the adventure of controlling and guiding life has just begun.
SYMPTOMS OF REAL-TIME EVOLUTION
Is Autism a Harbinger of Our Changing Brains?
The Morbidity and Mortality Weekly Report (MMWR) is in some ways a medical version of the Kelley Blue Book, the publication that provides the value ranges of used cars. The MMWR provides, in mind-numbing detail, just how many people got sick or died last week. It’s not exactly beach reading, and it’s usually about as exciting as watching paint dry. But within the endless columns and statistics of the MMWR, the patient and persistent can spot long-term trends and occasionally find serious short-term discontinuities.
Physicians and epidemiologists get excited by short-term discontinuities; a sudden increase in an extremely rare tumor, like Kaposi’s sarcoma, can be a harbinger of a massive infectious disease epidemic with a long incubation period, AIDS. The dozens of patients entering the hospital with this rare tumor in 1982 grew into 75,457 full-fledged AIDS cases in the United States by 1992.1
Conditions and diseases develop and spread at different rates. A rapid spike in airborne or waterborne infectious diseases like the flu or cholera is tragic but normal. A rapid spike in what was thought to be a genetic condition, like autism, is abnormal; when you see the latter, it is reasonable to think something has really changed, and not for the better.
Usually changes in the incidence of a genetically driven disease take place slowly, across generations.2 Diseases such as cystic fibrosis or sickle cell anemia result from well-characterized DNA mutations in single genes, and the inheritance pattern is well understood: If both parents carry the mutated gene and each passes a copy to a child, the child will be affected. Cystic fibrosis occurs in 1 of 3,700 newborns in the United States each year with no significant change in incidence over many years.3 Similarly, sickle cell anemia is a genetic disease. One of every 500 African Americans acquires the errant gene from both parents, and we can predict the incidence of sickle cell anemia with some reliability.4 You cannot “catch” these kinds of conditions by sharing a room with someone; you inherit them. If your sibling has cystic fibrosis or sickle cell anemia, then you have a 1 in 4 chance of also being sick.
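That 1-in-4 sibling risk falls straight out of Mendelian arithmetic: two carrier parents each pass one of their two alleles at random, and only the child who inherits the disease allele from both is affected. A minimal sketch (the function name and genotype notation are mine, for illustration):

```python
from itertools import product

def offspring_risk(parent1, parent2, disease_genotype="aa"):
    """Probability that a child of these two genotypes is affected.

    Each parent passes one of its two alleles at random, so we
    enumerate the four equally likely allele combinations --
    a simple Punnett square."""
    combos = list(product(parent1, parent2))
    affected = [c for c in combos if "".join(sorted(c)) == disease_genotype]
    return len(affected) / len(combos)

# Two carrier parents ("Aa"): the classic recessive case,
# as with cystic fibrosis or sickle cell anemia.
print(offspring_risk("Aa", "Aa"))  # 0.25 -> the 1-in-4 sibling risk
```

The same function shows why carrier-by-noncarrier pairings (`"Aa"` with `"AA"`) produce no affected children at all, which is why such diseases can hide in a population for generations.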
Autism is diagnosed in 1 percent of individuals in Asia, Europe, and North America, and 2.6 percent of South Koreans.5 We know there is a strong genetic component to autism—so much so that until recently autism was thought to be a primarily genetic disease. There is clearly an underlying genetic component to many cases of autism. If one identical twin has autism, the probability that the other is also affected is around 70 percent. Until recently, the sibling of an autistic child, even though sharing many of the same parental genes and overall home environment, had only a 1 in 20 probability of being afflicted. Meanwhile, the neighbor’s child, genetically unrelated, had only a 0.6 percent probability.6 But even though millions of dollars have been spent trying to identify “the genes” for autism, so far the picture is still murky. The hundreds of gene mutations identified in the past decade do not explain the majority of today’s cases.7 And while we searched for genes, a big epidemic was brewing.
In 2008, when the MMWR reported a 78 percent increase in autism—a noncontagious condition—occurring in fewer than eight years, alarm bells began to go off in the medical community.8 By 2010 the Centers for Disease Control and Prevention (CDC) was reporting a further 30 percent rise in autism in just two years.9 This is not the way traditional genetic diseases are supposed to act. This rate of change in autism was so shocking and unexpected that the first reaction of many MDs was that it wasn’t really that serious. Many argued, and some continue to argue, that we simply got better at diagnosing (and overdiagnosing) what was already there.10 But as case after case accumulates and overwhelms parents, school districts, and health-care systems, there is a growing sense that something is going horribly wrong, and no one really knows why.
What we do know, because of a May 2014 study that looked at more than 2 million children, is that environmental factors are driving more and more autism cases. Whereas autism used to be 80 to 90 percent explained/predicted by genetics, now genetics is only 50 percent predictive.11 We have taken a disease we mostly inherited and rapidly turned it into a disease we can trigger. Now the chance that a brother or sister of an autistic child will develop autism is 1 in 8 instead of 1 in 20.
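To see how stark the shift is, compare those sibling risks with the baseline risk for an unrelated child, using the figures quoted above (variable names are mine; the numbers are the ones in the text):

```python
# Risks quoted in the text, expressed as probabilities.
baseline = 0.006        # unrelated neighbor's child: 0.6 percent
sibling_then = 1 / 20   # sibling risk in earlier studies
sibling_now = 1 / 8     # sibling risk reported around 2014

# How many times the baseline risk does a sibling carry?
print(round(sibling_then / baseline, 1))  # 8.3
print(round(sibling_now / baseline, 1))   # 20.8
```

A sibling once carried roughly eight times the background risk; now the ratio is over twenty, even as the genetic share of the explanation has fallen.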
The rapid pace of today’s human-driven evolution may not be giving humanity time to adapt and to reach a steady state within a new environment. Autism may be just one harbinger, one symptom, of our radically changing world. Almost every aspect of human life has changed—moving from rural to urban; living in an antiseptic environment; eating very different sugars, fats, and preservatives; experiencing novel man-made stimuli; ingesting large quantities of medicines and chemicals; being sedentary; and living indoors. Given so many transformations, it would be surprising if our bodies and brains did not change as a result.
The DarWa Theory Revisited . . . and a Glimpse at a New Theory
Charles Robert Darwin was a man of extraordinary courage and integrity. As a teen he intended to be a preacher, yet what he studied and observed in nature was increasingly at odds with his intended profession. Much of his family, except his grandfather, opposed his theory of evolution.1 His wife was especially concerned he would be condemned to hell. Darwin himself wrestled with serious doubts; he never even used the word “evolution” until forty-two years after he first started writing about biology.2
Over decades Darwin accumulated examples, thought, wrestled with his faith, and refrained from publishing, despite a large collection of clear evidence that creatures evolve.3 And then came an extraordinary crisis. Darwin received a letter from an obscure specimen collector working in the far reaches of present-day Indonesia. Wracked by malaria, Alfred Russel Wallace had nevertheless crystallized his observations, after years of traveling the Amazon, Malaysia, and Indonesia, into a single letter—one that described in detail what would become the theory of evolution.
Wallace was not a “gentleman,” in a class-conscious era when only gentlemen were supposed to become scientists, gain entry into learned societies, and get published. So Wallace shyly asked Darwin whether he thought his ideas and theories were any good and, if so, might Darwin be kind enough to forward them for publication. Darwin received this single correspondence and immediately concluded there could not have been a better short abstract of his own grand theory.4
Darwin could have simply ignored the letter, dealt with a growing and serious family crisis, and immediately published what he’d been developing on his own for decades. After all, the correspondence came from a relatively unknown individual, on the other side of the planet, during a period when ships sank. And Darwin had devoted years of work and thought to his unpublished theory, having sailed to the Galápagos Islands more than twenty years earlier on the Beagle and thereafter continued his observing and collecting, as well as minutely recording, detailing, and strengthening his arguments. Instead, after a night of extreme anguish, Darwin forwarded the letter to his scientific colleagues, advocating rapid publication with full credit to Wallace. Darwin realized that this action would likely preempt his getting credit for the theory of evolution.
While Darwin had the strength of character to encourage the primacy of Wallace’s work, his friends had other ideas. Through years of discussion and correspondence, they were well aware of Darwin’s developing theory. So, unbeknownst to Darwin, they chose to place some of Darwin’s own unpublished work in the same journal, alongside the Wallace letter.
On the first of July, 1858, a small group at the Linnean Society of London, which thought it was merely attending a sleepy memorial meeting for one of its deceased presidents, first heard the theory of evolution. Almost no one realized they had just witnessed history. Wallace, still in Asia, wouldn’t find out for weeks that his theory had been aired. Nor did Darwin attend the meeting; he was burying his baby son.5
Even though Wallace and Darwin co-discovered and espoused a similar theory of evolution (as we’re calling it, “DarWa 1.0”), their view of the driving mechanisms differed. Darwin primarily stressed the importance of the individual; Wallace, the environment. Their combined work built and strengthened the foundation for what most today refer to as the theory of evolution. (Darwinism as we know it synthesizes a lot of people’s work, and has been augmented, edited, refined by many; for instance, Herbert Spencer first coined the famous phrase that we associate with Darwin’s theories: “survival of the fittest.”)6
While Darwin and Wallace got a great deal right about evolution, they remained puzzled by two mysteries: What was the mechanism by which adaptations happened, and why did the fossil record seem to show that sometimes, rather than proceeding in slow, small incremental steps, evolution happened quite rapidly? Unfortunately they never met or corresponded with the third critical actor-creator of the theory of evolution. Augustinian friar Gregor Mendel was too modest and dedicated to his monastic life to focus on fame. So his key article, on the genetics of peas, the one that founded modern genetic science, remained obscure and ignored for decades (like the work of many shy, brilliant, and unassuming scientists even today).
Just imagine if Mendel and Darwin had corresponded, shared ideas, and collaborated on the implications of genetics, genes, and heredity in evolutionary change.7 The discovery of units of heredity, later called genes, which began with Mendel in 1866, offered a solution to the first of Darwin’s mysteries: the rules of how traits get passed on. But even without modern genetics included, DarWa 1.0 fundamentally transformed our view of life and its future. Throughout the early 1800s, we thought the Earth was only 10,000 years old; there was no need for evolution, plate tectonics, or other grand schemes.8 Only in the last two centuries, a minuscule amount of the time humans have been alive and conscious on the planet, have we realized that continents and mountains move, and species emerge, disappear, and change. By the time Darwin died, the generally agreed-upon age of the Earth among scientists was 100 million years. Today there is overwhelming data, from chemistry, biology, geology, astronomy, and anatomy, evidencing a 4.54-billion-year-old Earth.9
DarWa 1.0 still provides a very accurate and detailed framework for understanding life forms and their development, successes, failures, and ambiguities. So, to explain and understand some of the changes taking place around us daily, why is a modified theory—an expanded theory—of evolution needed today, a DarWa 2.0?
First, it is impossible to overlook, and hard to overstate, the effect humans have had on the planet in just the past few centuries. A very few animals and crops, genetically selected to fulfill our needs and desires, now dominate half the world’s landmass. An area equivalent to the whole of South America is under cultivation just so we can feed ourselves and our animals. And a further 8 billion acres are used for livestock. More than 19 billion chickens and 1.4 billion cows live in highly urbanized animal environments we call “farms.”10 Despite most of us regarding these settings as rural, the animal population density often exceeds even the most crowded of our cities. The wild is becoming rural, the rural urban. Nature is not “selecting” what lives and dies in these environments. Humans are. The logic of survival, for the majority of some species, is unnatural selection. And it is the species that humans have unnaturally selected, consciously or unconsciously, that are increasingly dominating the planet.
Second, under DarWa 1.0 one would never see evolution occurring the way it’s depicted on modern T-shirts; remember those cartoons where a creature crawling out of a primordial ooze eventually begets a small mammal, then an ape, then a hunched Neanderthal, and finally a handsome human? (And that then de-evolves into a poor schlub at a computer desk.) This depiction of evolution is antithetical to natural selection and random mutation because it implies a linear, orderly, and logical progression from one model to the next. The real history and fossil record of evolution, of natural selection, looks like a complex, overlapping, messy bush—an astonishingly promiscuous, interesting, tangled, semi-chaotic web of life.11 Yes, there may be a trunk to this tree, a common ancestor, but we also observe many, many subtypes, varieties, subspecies, and new species rising and falling as environments change. There is no overall plan and logic; rather, ancestors throw the dice. Some offspring will come up sevens or elevens and be winners. Most will not. The cycle is neither predictable nor foreordained. The dice were not, until very recently, loaded.
In 1972, Paul Berg began combining genes from different organisms.12 This DNA came together not because the momma creature and daddy creature did the nasty, or because of a random mutation in a given offspring. Instead, it was the beginning of a group of lab-coated scientists deliberately taking genes from one organism and inserting them into another species.13 It was not nature randomly engineering but rather humans beginning to discover and apply intelligent design to life. Thus its logic is almost the exact opposite of DarWa 1.0’s.
As we simplify and deploy ever more powerful instruments to read life code, insert or delete genes, alter a species, and build organisms using synthetic biology and even compounds different from traditional DNA, we move further and further away from Darwin’s world and his theory of evolution and speciation.
In a sense, what has already occurred in terms of what lives and dies on this planet has already tipped so far toward unnatural selection and nonrandom mutation that it is not even really DarWa 2.0 but a new logic of evolution, one based on different principles and mechanisms. It is no longer just natural evolution but human-driven evolution.
Twenty Generations to Domesticate Humans
Exiled by Stalin, scientist Dmitry Belyaev ended up far from a lab or university, in the endlessness of Siberia. An experimentalist at heart, he made do with what he had and began breeding wild foxes. But he did not do this at random; each year he graded every fox, from particularly friendly and tame to nasty. Only those at the extremes were chosen to breed: one-fifth of the nicest, as well as one-fifth of the most violent, were segregated and bred, generation after generation.
The offspring of the nice foxes rapidly evolved floppy ears, short tails, lighter-colored fur, a less pungent odor, and bigger heads—quite weird, given that the original nice-enough-to-breed animals were picked for behavioral, not physical, traits.1 One potential explanation is that as the adrenaline levels in tamer animals dropped, melanin fell as well, which in turn altered and lightened fur color. Looks may, at least in foxes, reflect temperament, and there may be some genetic basis to stereotypes like floppy ears.
Eventually Belyaev’s “de-wilding” experiment was so successful in modifying a wild species that a subset of these creatures began acting like Labrador retrievers, so much so that they got shipped off to the United States and sold as “gentle house pets, perfect for children.” It appears you can rather quickly breed out aggression (or, as pit bull owners know, breed it in).2
As humans evolve further and further from being an “all-natural” species, we too have quickly and dramatically de-wilded ourselves. For the most part this is a great change; the concrete urban environments that now house the majority of the human species tend to be cleaner, better lit, and safer than the jungle and rural landscapes humans abandoned. We now think it normal and natural to have a tap nearby that delivers clean water and a toilet to whisk away waste. Never mind that none of this was true for 99 percent of human history. We are now so used to our completely unnatural, human-designed environments that we simply take them for granted; they have become “human rights.”
Our domestication has occurred very rapidly, mostly in the course of just over ten generations. We did not always live in large, somewhat orderly, mostly peaceful collections of millions of beings. The wild used to be a few yards from where we slept, as did the source of most food. The natural size of our ancestral tribes was around 150 individuals. When groups became much larger than that, strife ensued and subgroups splintered off, moving a little ways away, colonizing their own little swath of savannah and beginning a new chapter of hunt-and-gather as well as a new chapter of stitch-and-bitch. Survival was far from a given.3
About 7,500 generations ago, our type, Homo sapiens, began to build, create, and pillage small villages. What we refer to as “civilization” began about 500 generations ago, with the advent of agriculture. As late as 2000 BCE, the total world population was in the tens of millions, most of them broadly dispersed.4 Eventually substantial cities began to emerge in the Fertile Crescent, China, India, the Americas, and even in Europe, but cities were uncommon. In the year 1300, less than 5 percent of England was urban; throughout the Industrial Revolution, rural was the norm. Even in 1910 only 2 out of 10 people lived in cities.5 Yet by 2007, a majority of the world population had urbanized. That means a massive, global urban migration took fewer than one hundred years—about five generations. To put this in a historical context, there have been at least 125,000 generations since the first hominins began to walk around. (By the way, hominins are humans plus their extinct ancestors, whereas hominids are hominins plus great apes and their ancestors.)
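The generation counts above hang together if one assumes roughly twenty years per generation, an assumption inferred from the text’s own equation of one hundred years with about five generations. A quick back-of-envelope sketch (the constant and function are mine):

```python
# Assumed average generation length, inferred from the text's
# "one hundred years = about five generations."
YEARS_PER_GENERATION = 20

def generations(years):
    """Convert a span of years into an approximate generation count."""
    return years / YEARS_PER_GENERATION

print(generations(100))        # 5.0      -> the global urban migration
print(generations(10_000))     # 500.0    -> since the advent of agriculture
print(generations(2_500_000))  # 125000.0 -> since the first hominins
```

On that scale, humanity’s move into cities occupies the last five of at least 125,000 hominin generations: a rounding error in evolutionary time.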
We aren’t slowing down. Globally, city populations will likely double by 2030. Within the next twelve years, China alone intends to move 250 million people from rural environments into cities.6 Accommodating these new residents will require building the collective equivalent of Tokyo, Mexico City, Seoul, New York, Mumbai, Jakarta, São Paulo, Delhi, Osaka, Shanghai, Manila, and a few dozen other major cities. For reference, consider that in the 1980s, 8 out of 10 Chinese lived in the countryside. By 2025 only 3 out of 10 will be rural residents.
Unnatural environments have been very good for humans; as we domesticated ourselves and our environments, we gradually removed the obstacles to a long life span. For most of our history, for most people, days were filled with malnutrition, disease, and violence.7 A major concern was to not get eaten. Predators of all kinds were far more common until our massive and deliberate kill-off modified our environment to such an extent that we must search really carefully to find any of the once-common big animal predators. Grizzly bears no longer pose a threat in most of the United States. Saber-toothed cats are mere fossils. Now that we have taken ourselves out of the food chain, we tend to die in our beds and in hospitals, not in the jaws of another creature.8 We still worry a little about sharks, but have unfortunately managed to turn most of the species into shark-fin soup or videos for “Shark Week.” The rare shark attack triggers global news coverage.9 (Deer crashing into cars now kill eleven times more people every year than do sharks.) It’s exceedingly rare to encounter poisonous snakes in most cities, other than a few politicians.10 Although Paleo diets and lifestyles are a growing fad, sensible people might ask why anyone would want to go back to a caveman period when only 10 percent of the population made it beyond age forty.11
By far the most dangerous and common predator remains other humans; on average we are 11,000 times more likely to be killed in war today than by all the sharks in all the oceans.12 But even in this arena, despite daily mayhem and bloodshed, there is a broad and deep trend toward domestication. The number of wars and violence in general has gradually decreased almost everywhere.13 We do not expect horrible deaths as simply a matter of course. Yes, there are still horrendous incidents and ghastly regional conflicts, but the world is a far more peaceful place than it used to be, and the chances of an eighteen-year-old being required to fight in a war in his or her lifetime are at an all-time low. Even terrorism, post-2001, has all but disappeared from the United States and much of Europe.14 Which is not to say that we aren’t prone to irrational fears, extreme miscalculations of risk, and manipulation; between 2007 and 2012 about 4.6 Americans died annually from terrorist attacks on U.S. soil. About 100 times more drowned in their bathtubs. But the fact is, Americans spend $400,000,000 per year on homeland security and defense per victim (which arguably by some measures is working), while we spend $9,000 per cancer victim and $80 per victim of stroke and heart disease, and $0 on bathtub drownings.15
The nature and scale of state and religious retribution against individuals also changed dramatically. Priests no longer burn people at the stake in town squares or torture them in towers. While the United States still incarcerates a disproportionate number of people vis-à-vis other developed countries, the chances of an individual suffering torture, murder, or execution while imprisoned have dropped precipitously. Europe is even further ahead. People were routinely and publicly burned at the stake, beheaded, drawn and quartered, boiled, crushed, mutilated, and hanged as a matter of grand public spectacle and entertainment. Henry VIII alone reputedly executed 72,000 during his reign. In eighteenth-century Britain, 222 offenses led to the death penalty.16 In contrast, no one has suffered the death penalty in England since 1964.
With human domestication, individual muggings and murders also dropped precipitously. When Thomas Hobbes described life as nasty, brutish, and short in 1651, 1 in 1,000 Europeans was a victim of murder.17 Today’s murder rate in the United States is twentyfold less, at fewer than 5 in 100,000.18 Even Medellín, Colombia, one of the deadliest places one could visit in the 1980s, is now a relatively peaceful city.
On the other hand, not everywhere is safe and trending safer. In 2014, Nigerian terrorist group Boko Haram kidnapped hundreds of girls in raids. This was horrifying, but on a very different order of magnitude from when slave traders would capture and kill or enslave millions. The same year, leaders of African nations voted themselves immunity from human-rights prosecutions, even by their own African Court on Human and Peoples’ Rights, which was set up to mirror the International Criminal Court in The Hague.19 Although this move angered human-rights watchers, and was a clear step backward, it was also a signal that someone is beginning to hold formerly all-powerful and sovereign dictators accountable—and that global norms and expectations are effective enough that tyrants now fear the consequences and want to insulate themselves.
At the same time that we were taming ourselves and our leaders, we also began to tame our environment. Exposure to the elements used to be a major killer. As we developed better clothes, heating and cooling methods, and comfy homes and offices, we mostly quit truly worrying. Although we still incessantly comment on and complain about the weather, with rare exceptions, its consequences are far less deadly. Communities are far better prepared for weather events, knowing what they will likely face in the next few days. Tornadoes and hurricanes, floods and droughts, and even “snowpocalypses” are described in detail and followed by the media. We now think it normal to have advance warning, but for much of human history, weather was one more brutal and constant driver of natural selection.
Beyond extreme weather, perhaps the biggest change we’ve seen has been in overall exposure and temperature flux. For most land-based species, coping with extreme temperature and weather differentials is a fact of daily life. In the space of just about one century, however, our bodies, which had evolved to deal with this flux, are suddenly living, eating, and sleeping in unchanging climates. We are coddled by temperature-controlled buildings, houses, cars, and offices, living our lives between 68 and 75 degrees Fahrenheit and sheltered from rain, wind, and snow. This lifestyle is completely unnatural and abnormal, and truly quite pleasant.
In our domesticated state, for most humans, access to food is no longer a life-or-death issue, unless it is an excess of calories. Yes, too many people still go to bed hungry or malnourished, but starvation is far from the scourge it once was. The percentage of undernourished people in developing countries fell from 24 percent in 1990 to 15 percent in 2010.20 Way too many starving, still, but in 1900, in the oldest children’s hospital in the UK, almost 1 in 5 infants died from malnutrition and gastrointestinal diseases.21 Now fewer than 1 in 100 die in this hospital from lack of food. Until very recently, even in advanced countries, finding, gathering, and consuming enough calories was a constant and daily preoccupation. Now famine is unusual and far from the massive killer it once was.
Over a couple of centuries, we as a species have changed just about every factor that would promote rapid evolution in any other species. Even relatively minor environmental adjustments can lead to extreme diversity in birds, plants, and animals—even within the same geographical area. If you took wild sheep and corralled them for a few generations on a farm, you would observe massive change and domestication. And when you move an animal, plant, or bacterial species from mostly rural environments to 70 percent urban environments in just five generations, you would expect to see extremely rapid mutations—or extinctions—as a consequence. Some species of pigeons, such as everyday city pigeons (that is, the feral pigeon, which descended from the rock dove), went from shy cliff-dwelling creatures to the dive-bombing pests of Trafalgar Square and Piazza San Marco. Meanwhile, billions of passenger pigeons did not adapt, remaining too trusting and vulnerable to hunters, and went from literally blackening the skies to extinction in about a century.
Darwin understood the consequences and implications of human-driven unnatural selection as it was applied to the domestication of plants and animals; he called this “artificial selection.”22 While studying fancy pigeon breeding, Darwin chronicled rapid species changes. Some even argue that pigeons were more important, and more conclusive, to his theory of evolution than his iconic finches. He realized that if human breeders could manipulate a single species to such an extent in captivity, then perhaps nature could also manipulate, and change, all species in the wild. Darwin just did not follow through on the logical consequences of this thought, and did not foretell that humans would soon wield enormous power over the planet, most other species, and themselves. Had he done so, he too might have identified unnatural selection as a key driver of change.23
For humans, what would be surprising is if such massive changes, in every aspect of our daily lives, did not lead to rapid adaptation and ultimately speciation. Today’s humans interact, coexist, compete, and cooperate with hundreds or thousands of others on a regular, perhaps daily, basis—not the dozens that historically drove the evolution of all previous hominins. Had we not domesticated ourselves, had we not reduced the historic levels of violence, had we not somewhat educated most children, global urbanization would have been a cross between Lord of the Flies and Mad Max. So when you wonder where the extraordinary increases in life span, intelligence, height, and other positive traits we now take for granted came from, seemingly all of a sudden, you may wish to remember how profoundly unnatural and nonrandom we have made our environments. We have de facto domesticated ourselves just as thoroughly and completely as we have our own cats and dogs, with massive evolutionary consequences.24
Violence and the Lack Thereof
One of the most positive trends in human history is the persistent decline in overall violence. Given the daily news and general mayhem reported, this trend is hard to see, but we are—in general, and with notable exceptions—a far more peaceful and domesticated species than we once were. Italian homicide rates have steadily declined from 73 per 100,000 in 1450 to 2 per 100,000 in 2010.1 The same is true in country after country. Violence used to be a core and unremitting part of most humans’ experience. We are now non-naturally (and happily) engineering it out. Much of Europe, which over the past 32 centuries has been the most war-loving region on the planet, has given up investing in functioning, real armies.2
We are in the midst of an epidemic of nonviolence, which has the effect of casting aside one of the greatest sources of natural selection. As peace and prosperity spread, along with human rights and women’s rights, our chances of dying violently have collapsed and our chances of passing on our genes have skyrocketed. Within the United States, if one parses the fifty categories in which one might inflict violence on children and youths, not a single one increased between 2003 and 2011.3
And we are also choosing far more diverse mates. Tolerance of interracial couples as well as of people with different beliefs and religions is on the rise. This is a really recent phenomenon; until June 12, 1967, in seventeen U.S. states it was illegal to marry someone from a different race. It took a Supreme Court decision, in the most appropriately named Loving v. Virginia case, to invalidate the previous jurisprudence, including a ruling by a Virginia judge who stated, “Almighty God created the races of White, Black, Yellow, Malay, and Red, and He placed them on separate continents . . . The fact that He separated the races shows that he did not intend for the races to mix.” (The judge’s ruling, reasoning, and language were further affirmed by the Virginia Court of Appeals.)
Only in 1991 did a majority of Americans agree with the statement that interracial marriages are OK.4 And race is far from the only barrier. Even at the beginning of the 2010s, more than 86 percent of U.S. relationships were between people of the same faith. Mormons, conservative Christians, Muslims, and Sikhs are especially wary of relationships with those of other faiths.5
But the overall trend toward more openness and acceptance of others, especially among the young and educated, is overwhelming. Travel and study-abroad programs abound and social media has exploded, providing opportunities to interact with far more people than our grandparents ever dreamt of. Among the many benefits of multiculturalism, however, there may also be losses. For example, in the 1900s, about one-half of Americans were blue-eyed. By 1950, only one-third. By the end of the twentieth century, fewer than 1 in 6. Blue eyes are a recessive trait, and whereas 80 percent of people used to marry within their ethnic group, now they tend to marry based on where and how long they go to school.6 Imagine saying someday, “Once upon a time there were people with blue eyes.”
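The disappearing-blue-eyes arithmetic follows from basic Hardy-Weinberg genetics. A sketch with illustrative allele frequencies (chosen for round numbers, not measured ones) shows why mixing alone shrinks a recessive trait:

```latex
% Blue eyes require two copies of a recessive allele of frequency q,
% so under random mating the phenotype appears with frequency q^2.
% Take two equal-size groups with illustrative allele frequencies:
\[
q_1 = 0.7, \qquad q_2 = 0.1 .
\]
% Kept separate, the blue-eyed share of the combined population is
\[
\tfrac{1}{2}\left(q_1^2 + q_2^2\right) = \tfrac{1}{2}(0.49 + 0.01) = 0.25 ,
\]
% but once the two groups intermarry freely, the pooled allele
% frequency is (q_1 + q_2)/2 = 0.4, and the blue-eyed share becomes
\[
\left(\tfrac{q_1 + q_2}{2}\right)^2 = 0.4^2 = 0.16 .
\]
```

No allele is lost; the visible trait shrinks because the averaging now happens before the squaring rather than after.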
Eye color may not be the only trait we are losing in an effort to domesticate ourselves and our behavior. As we cram together in ever-expanding cities, it’s worth asking whether we would want to chemically tame or untame ourselves. We can alter trends in both directions. Want to make men more macho? Between 2000 and 2011, in 37 out of 41 countries surveyed, monthly sales of testosterone increased, and increased, and increased.7 (Walmart sells testosterone boosters for $8.98.) What does testosterone do to the body? Supposedly a lot of good things—more muscle mass, increased sex drive, greater confidence, and perhaps . . . more aggression? The underlying cause of why someone blows up in a violent rage or bites off part of their own tongue is, as baffled scientists like to say, “multifactorial.”8 In essence, it is hard to sort out causality, because there are so many variables: Was it the testosterone, steroids, booze, or a sad country song that set off the sudden rampage? Certain chemical use is correlated with violence. In peaceful Sweden, whenever police jailed someone and then sent urine samples to labs, 33.5 percent of those samples came back positive for steroids.9
Now violence is uncommon, but for most of modern human history, right up until the last few decades, we periodically caused, or were subjected to, massive displacements of large populations: Famine, disease, and war drove opportunities for natural selection. While ugly and heartless, such events also drove human genetic diversity in many closed and isolated communities. Genomes record the history of exactly who had sex with whom and when, as well as where one’s ancestors came from. As gene sequencing gets ever cheaper, we are seeing, in granular detail, the genetic makeup of people from each country, region, and tribe. For example, today’s average Maya gene admixture includes Colombian, Pima, Karitiana, Spanish, Surui, Uzbek, Irish, Japanese, and Yoruba components.10 And you can get far more personal: In 1998, Dr. Eugene Foster greatly upset Southern gentility, and possibly even the Daughters of the American Revolution, by tracing DNA from descendants of Thomas Jefferson and his black slave Sally Hemings to prove they had several children together.11
Race blending wasn’t uncommon in an era of slavery, exploration, and conquest. But what happens now as peace prevails and some human niches isolate while maintaining their traditions? Consider the area around the Arabian Peninsula: The cultural mandate that women not stray far from the family circle when seeking a husband, as a way to ensure their honor, has resulted in an extraordinary number of close-relative intermarriages. Studies estimating consanguinity in Egypt range from 20 percent to 42 percent. In other Arab countries and regions, rates of consanguinity can reach 60 percent.12
People once mostly lived in closed, isolated, rural enclaves and tribes. The normal state of affairs, for much of the peninsula’s history, was that you could and did marry close relatives. Many of these tight familial clusters were periodically disrupted by outsiders and invaders. But now that the region has enjoyed a period of relative peace for much of the past century, stability—plus extreme cultural strictures—may have accentuated three trends affecting the genetic makeup of the population. First, if you continue to marry first cousins and second cousins, and have few interactions with outsiders, you tend to get less and less gene variation. If a bad gene enters that reduced pool, it takes longer to breed out. (The converse is also true: a rare beneficial mutation can propagate more quickly through a small inbred population, perhaps enhancing survival or reproduction.)
A secondary effect of the lack of violence, chaos, and disruption on a population—as well as more modern medical care—is that a lot of children with birth defects, who likely wouldn’t have survived harsh nomadic conditions, now do survive and have kids themselves. Because of humane choices and policies, traits that might have been bred out under natural selection are reinforced and passed on.
And a tertiary effect of nonviolence: Many people manage to get wealthier. The temptation to keep growing wealth and status within the family, or at least within the tribe, may further reinforce the incentive not to marry outside that group, much less a commoner or a foreigner.
Consanguinity doubles and triples the risk of birth defects.13 These days 8 percent of all kids born in Saudi Arabia have severe genetic defects (about three times the rate in European countries).14 Type 2 diabetes affects 32 percent of adults. Hypertension hits 33 percent.15 In the United Arab Emirates, congenital anomalies account for 40.3 percent of infant mortality. In Qatar, high rates of consanguinity have led to a higher incidence of rare genetic diseases, as well as increases in “common adult diseases like cancer, mental disorders, heart diseases, gastrointestinal disorders, hypertension, and hearing deficit.”16
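That doubling and tripling of risk is what standard population genetics predicts. For a recessive disease allele of frequency q, a child’s risk of inheriting two copies rises with the inbreeding coefficient F (a textbook formula; the allele frequency below is illustrative, not measured):

```latex
% F = 1/16 for the child of first-cousin parents, F = 0 for
% unrelated parents:
\[
P(\text{affected}) = q^2 + F\,q\,(1 - q) .
\]
% Risk multiplier relative to unrelated parents, with an
% illustrative allele frequency q = 0.02:
\[
\frac{q^2 + \tfrac{1}{16}\,q\,(1 - q)}{q^2}
  = 1 + \frac{1 - q}{16\,q}
  \approx 1 + \frac{0.98}{0.32}
  \approx 4.1 .
\]
```

For any single rare allele the multiplier is large; the observed doubling or tripling of overall birth-defect rates is an average across many conditions, most of which are not simple recessives.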
Even extremely rare genetic defects can be passed on and spread quickly within a specific undisrupted societal subgroup, which is precisely the problem that afflicted the royal families of Europe in the nineteenth and twentieth centuries; hemophilia became known as the “Royal Disease” after Queen Victoria passed it on to two of her daughters, her son, and the royal houses of Spain, Germany, and Russia. (Blessed be Kate Middleton, commoner.) An extreme example of long-term effects is the inbreeding that took place for six centuries among the Hapsburgs, who eventually developed “Hapsburg lips”—extreme malformations that prevented one of them from chewing food.17 Hapsburgism may now be occurring across whole nations. So, as we engineer massive, positive change in the form of nonviolence, we may also want to revisit culture and its long-held norms.
Allergies: Another Harbinger of Our Evolving Bodies?
Think back to some of your first and most cherished memories. Perhaps some of them had to do with food. Coffee brewing, Mom singing in the kitchen, preparing something you loved: cookies, bread, pudding, a cake, or a roast. You would spy on dinner parties, hiding behind doors and staircases, just to get glimpses of the guests, hear snippets of conversations. You overheard a lot, learned a lot: dress, décor, speech, drinking and smoking habits. But can you ever recall your grandparents or parents asking each and every guest whether they had any food allergies?
At a recent dinner of very prominent scientists, one answer to Juan’s question “Any allergies?” was, “I usually never eat at anyone’s house, because I am so allergic to almost everything. The only thing I could conceivably eat is a piece of meat or chicken, cooked in a saucepan, with no butter, oil, spices, or condiments of any kind. I’ll also have a glass of water or two.”
Hospital visits related to food allergies tripled between 1998 and 2006. More than 17 million Americans suffer from food allergies. A courteous host now not only has to ask every guest about allergies but also has to be mindful of the serious consequences of not getting it right. So too does every restaurant, school cafeteria, and food cart. In kids, reported food allergies increased by 50 percent in just over a decade, from 3.4 percent (1997–99) to 5.1 percent (2009–11).1 One in ten preschoolers is now allergic to at least one thing.2 The overwhelming prevalence of allergies, plus fear of lawsuits, led Dunkin’ Donuts to simply post the following message: “Dear Valued Customers: Please be advised that any of our products may contain allergens.”3
It would be one thing if all of these allergies were directly triggered by radically new chemicals, food colorants, or preservatives (More red dye #40 in your Jell-O, my dear?). But for the most part, it’s not artificial flavorings and additives that people react to so violently. Oddly enough, seven common foods that we have been consuming for millennia account for 90 percent of food allergies: milk, eggs, nuts, fish, shellfish, soy, and wheat.4 And speaking of going nuts . . . one survey shows the incidence of adult nut allergies has remained stable from 1997 through 2008. But for kids under eighteen, allergic reactions have increased 3.5 times, from 0.6 percent to 2.1 percent of the population.5 How can kids become so sensitive to foods that have been around for ages in just one or two generations? Are kids simply exaggerating? Could hives, itching, and closed airways be simply figments of our collective imagination? Are we overdiagnosing allergies, the same as some say we are doing with autism and ADHD? Or is this “new normal” a sign of something far more profound?
Many of the world’s best scientists have been racking their brains to figure out the puzzle of allergy prevalence.6 Among today’s multitude of alternative causal hypotheses, one of the most popular explanations is the “hygiene hypothesis.”7 Its supporters argue that we have unnaturally designed and domesticated our surroundings and environments to such extremes that they are now too clean, protected, and isolated. Advocates of this hypothesis point out how kids raised on farms near animals, dirt, and microbes rarely get allergies.8 And kids living in homes with an income twice that of the poverty line are more likely to suffer food and respiratory allergies than poor kids.9
The overall explanation, says the hygiene hypothesis medical sub-tribe, is that our immune system, which developed over millennia to cope with eating raw meat off of dirt floors, has ever less to attack in our overwashed, scrubbed, sanitized, de-bacterialized environments. So our own defense mechanisms, having ever less to focus on, become ever more sensitive to the smallest perturbations. This makes sense. But before we revert to the “good old days” we should remember where we came from and what led to this new body state. . . .
We are a species that evolved consuming raw meat, uncooked and unwashed veggies, and dirty water. Even today, when the Hadza people in Tanzania catch an impala, they field dress the animal while “washing” their hands using the digested grass inside the animal’s stomach; they then invite their guests to join the feast as they “slice the stomach into pieces, and toss [pieces] into their mouths like popcorn.”10 All too often, when one travels in many countries one sees mothers taking water from severely contaminated streams or babies bathing in ponds alongside animal waste. Living with a lot of dirt and microbes has been, and for billions still remains, normal, natural, and dangerous.11
Many of those who yearn for a lost “all-natural” food system often forget that “organic” doesn’t necessarily mean safe. For much of history death came from “natural” foods and unprocessed water. It still does in parts of the developing world. Diarrhea kills close to 1.9 million kids under the age of five worldwide every year, accounting for about 19 percent of all childhood deaths.12 According to the United Nations, dirty water still kills more people than all wars.13 Only very recently have the majority of humans on Earth left extreme poverty and gained some access to clean water, sewage collection, and shrink-wrapped, preserved, and safer foods. Unnatural sanitizers and cleaning products, modern soaps, disinfectants, and chlorination have saved billions of lives.
We may have tipped the balance a little too far. Some of the very changes that make our lives cleaner, safer, and better—changes that have increased our very survival rate as a species—bring along unintended consequences. So we sneeze, itch, and wheeze ever more. Fortunately, we can readily reach for yet another unnatural, but often effective, solution—Zyrtec, Claritin, Benadryl, and other assorted antihistamines. But as we seek pharmaceutical assistance, we may also wish to consider the latest allergy theory: that allergies arise because we wiped out part of our symbiotic microbes.
Our Unnatural “All-Natural” World
Much of what we eat today, often in large quantities, isn’t exactly what one could call all-natural. And if you really are what you eat, then we are already quite a different species. Bodies that for hundreds of thousands of years ate “all-natural” have been challenged to adapt fast to tidal waves of nachos and pizza.
Dental plaque provides a small window through which to view this massive evolutionary upheaval.1 Anyone who has been to the dentist knows how tough it is to remove plaque. Bad for you, but good for science. Its toughness makes plaque a great reservoir of data for bioanthropologists. Diet affects plaque, and by comparing the plaque in ancient and modern human teeth, scientists can infer what kinds of things we ate and what lived in our mouths. In the pre-Twinkie era, both early humans and our close relatives had mouths that were quite healthy. There are almost no examples of Neanderthal cavities. Paleolithic and Mesolithic human skulls are almost devoid of cavities.
As human diets began to modernize, as we began cooking and cleaning more of our daily foodstuffs, a strange thing happened: The bacterial colonies in our mouths became far less diverse. Hunter-gatherers from seven thousand years ago had far more microbial diversity in their mouths than did Stone Age agriculturalists.2 Bacteria that had coexisted and coevolved with our bodies and diets, that had adapted, were crowded out by a new environment, and our mouths became colonized by nastier bacteria. We further repopulated our mouths with the ever more widespread use of processed sugars. The incidence of cavities exploded. We began to suffer chronic oral disease, something that became most bothersome, and sometimes even deadly, in the pre-antibiotic, pre-brushing, pre-dentist era.
These days we need to do things no self-respecting Neanderthal would have considered: brush our teeth three times a day, floss, drink fluoridated water, fill cavities, and use dentures. (It’s worth noting that these practices are far from common, or necessary, in any wild animals.)
Our changing diet wasn’t just hard on our mouths. Average male height during the ninth to eleventh centuries was just below that of modern men. But the transition into the later Middle Ages, the Enlightenment, and the Industrial Revolution was brutal. Disease, wars, serfdom, and filthy cities changed the morphology of men; by the 1700s, the average Northern European man was 2.5 inches shorter than his medieval predecessors, and heights did not recover until the twentieth century.3
Evolving diets redesign our bodies in other ways. The average U.S. male increased his weight from 166 pounds in 1962 to 191 pounds by 2002. By 2011 we had gained, on average, a further five pounds.4 If we had observed this kind of generalized transformation in the bodies of a wild species, we would be shocked. But Darwin and his theories could have predicted much of this change; indeed, he saw hints of it in his studies of animal husbandry and how a species rapidly changed and adapted under human-directed breeding. So he would not have been terribly surprised observing the changed morphology of a gaggle of modern humans as they perambulate by with their monster soft drinks in hand.
In altering our desires and diets we did not just alter our own bodies, we also guided the evolution of broad swaths of nature. Think about it: just what exactly is “all-natural” food? Consider the humble potato. When conjecturing about the remote origin of his French fry, a teenager might blurt out Ireland or Idaho.5 Both would be wrong. Anyone who answered the old Inca Empire would be right.6
There’s good reason and precedent for humans to tinker with the evolution of our foodstuffs. Before the Tiwanaku (the ancestors of the Inca), only about two hundred all-natural potato species existed, some of which were poisonous. Potatoes are related to deadly nightshade, which is why the Indians would dunk tubers in a water-and-clay mixture; the clay would bind with and absorb the poisonous solanine and tomatine.7 In pre-Inca times, when food was quite scarce, eating potentially poisonous food was worth the risk; few crops grew on those steep, terraced hillsides, particularly edible plants that survived tropical and subtropical pests and insects.
Gradually the Peruvians’ ancestors learned how to pick, clean, cook, and de-poison potatoes, and learned how to breed them to meet their needs. By the time the Spaniards showed up to rape and pillage, the original two hundred varieties of all-natural potatoes had proliferated into more than three thousand varieties. Many of these you can still sample in Cuzco; its municipal markets are full of bright-yellow, purple, pink, dried, bright-white, starchy, fibrous, and crimson potatoes to snack on—a far cry from the dirty-looking brown, dowdy tubers we tend to see in Western supermarkets.8
And just what’s natural about having potatoes grow in Europe, anyway? They are in no way a native species.9 But boy, have they been useful. The Irish and Germans avoided famine for centuries thanks to this Incan import. However, Europeans abandoned the cornucopia of Incan varieties in favor of a few standardized seed stocks—a costly error known as monoculture. If you monoculture, and end up depending primarily on the “Irish Lumper” variety of potato, a single blight can wipe out a lot of plants and a lot of people, which is precisely what happened during the Gorta Mór, the Great Hunger, in Ireland.10 Lack of potato-disease resistance, and lack of alternative species, cost the country almost a quarter of its population; a million starved and almost a million emigrated. An equivalent disaster in the United States today would mean losing 80 million citizens. (Not that we have all learned this lesson . . . Just in case you don’t get bulletins from the Dairy Cattle Reproduction Council, it turns out that inbreeding of Holsteins, Jerseys, and Brown Swiss is at an all-time high.11 And Florida orange growers are also relearning the perils of monoculture as their crop withers away, a victim of a citrus plague.)12
A fun and ironic field trip involves going to “all-natural, organic” farmers’ markets and counting the number of completely unnatural, human-designed varieties of produce. Those wonderful, multicolored, oddly shaped, different-size heirloom tomatoes? Why, surely they’re “all-natural.” Well, not exactly. Tomatoes used to be small, green, and slightly poisonous. Summer farmers pride themselves on the variety of shapes, colors, sizes, and flavors they can coax into being. But this process is human-driven, not natural selection at work. Extreme tomato engineering begat some great specimens and some horrid options as well. Next time you bite into a gorgeously red but utterly tasteless big-box-store tomato slice, you can blame the lack of SlGLK2, the genetic switch that generates a tomato’s naturally sweet taste. It’s been turned off through breeding in most commercial varieties, in favor of that alluring bright-red skin.13
And while we’re on the topic of “all-natural” plants . . . How often do you ship your loved one true wildflowers? Odds are you tend to send weird-looking, colorful, fragrant, newly bred, long-lasting flowers unnaturally designed to delight the object of your affection. As you continue wandering through the “organic” market, how about those artisanal cheeses? Might these be the result of centuries of unnatural human tinkering too? When was the last time you found “wild cheese” in the forest? Even the animals that produce the basic ingredients for cheese are unnatural; should you truly seek a partly all-natural Brie or Camembert, it best be made with aurochs’ milk (which might be a little hard since the last aurochs died in Poland in the early 1600s).14 Or how about those beautiful “sheep’s milk” cheeses? Make sure they’re from a mouflon or Orkney, as all current domesticated varieties of sheep are the product of our unnatural meddling.15 We propagate hundreds of examples of useful nonnative species; otherwise there would be no corn in Europe, no horses in America. Britain alone hosts 1,800 nonnative plant species, which provide many staples at “all-natural” food markets.
Many nonnative species have hybridized with native species, enormously increasing variety and number. We have so modified, bred, molded, shaped, and distorted the “all-natural” to our own tastes and purposes that we forget, or take for granted, that much of what we live with, eat, and desire has evolved into something quite different from all-natural. In the words of ecologist Chris Thomas, “Nothing is entirely natural any longer.”16
Our eco-engineering and unnatural practices also lead to unnatural selection throughout the world’s oceans. Ever go snorkeling in the Mediterranean—say, near the Spanish, French, or Italian coasts, or the Greek Isles? Beautiful sea, some nice rocks, but no fish, coral, or much else that’s alive. If something moves, it tends to be very, very small. The sheer scale and breadth of our harvesting, and its destructive effects, have been staggering. We selected out almost all the fish.
And if you put on a scuba tank in January and take a brisk dive off Monterey, California, it’s clear, cold, and breathtakingly beautiful; forests of kelp sway as seals whiz by. You can pick up an octopus and watch birds dart past seeking fish thirty feet under. Countless starfish and anemones wave at large fish. But you don’t see any large sardine-bait balls. Which is strange for two reasons: First, this area is one of the most protected, and earliest-protected, eco-coastlines on the planet. Second, if you’d expected to see anything at all in what was once the “sardine capital of the world,” it would be lots and lots of sardines. That’s what fueled John Steinbeck’s novel Cannery Row. But today the town, despite great urban renewal and tourism focus, plays host to broken docks, few fishing jobs, and almost no sardines.
There are two parallel evolutionary systems at work off Monterey and in the Mediterranean. A natural one in which humans, the predators, decimate their prey—in this case fish. And an unnatural one in which humans use what they have learned to sometimes consciously redesign an ecosystem, often too late, to preserve and protect. If you are a big enough species and can get away with it, the first is normal and natural; the second is human and unnatural. A lot of species come back from the brink not because they adapt and adopt but because we will it; often we find them cute and they have a celebrity advocate.
Our attempts at eco-engineering and unnatural selection can also beget very rapid evolution.17 In an attempt to prevent massive factory boats from simply dragging and dredging the bottom of the waters and killing all creatures, or netting all things in the water regardless of their usefulness, various countries have put minimum size requirements in place for fishing. The theory: Let creatures grow till they can reproduce, then harvest them. Good idea, except that when you drive evolution in such a cut-and-dried way, in a matter of just a few generations the most successful survivors and reproducers are the runts.18 As more and more runts survive and reproduce, dwarf fish become the norm. This in turn changes not just the ecosystem of the specific semi-protected species but also those of its predators, symbionts, and co-dwellers.
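The dynamic described above, minimum size limits breeding runts, can be sketched with the breeder’s equation from quantitative genetics. The numbers below (starting mean of 58 cm, legal limit of 60 cm, heritability of 0.4) are purely illustrative, not data on any real fishery:

```python
import random

random.seed(7)  # fixed seed so the toy run is repeatable

def simulate_size_limit(generations=15, pop=2000,
                        legal_size=60.0, heritability=0.4):
    """Toy model: fish at or above the legal minimum are harvested
    before they spawn, so only the smaller fish reproduce."""
    mean = 58.0           # starting mean length, cm (illustrative)
    sd = 6.0              # phenotypic spread around the mean
    history = [mean]
    for _ in range(generations):
        sizes = [random.gauss(mean, sd) for _ in range(pop)]
        # Harvest removes every fish of legal size; the "runts" remain.
        survivors = [s for s in sizes if s < legal_size]
        if not survivors:
            break
        selected_mean = sum(survivors) / len(survivors)
        # Breeder's equation: response = heritability * selection differential
        mean = mean + heritability * (selected_mean - mean)
        history.append(mean)
    return history

history = simulate_size_limit()
# Mean size drifts downward generation after generation.
```

Each generation the surviving breeders are smaller than average, and a fraction of that difference (the heritability) is passed on, so the mean ratchets down until so few fish reach legal size that selection fades out.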
We dominate so much that we can push unnatural selection (and rapid propagation) in some really unlikely environments. If you sail by the cold, foggy islands of Maine, in addition to hidden rocks and strong tides, you also have to cope with the extreme proliferation of colorful lobster buoys. Each year more appear. One would have thought that not a lobster would be left, but the opposite’s true. The past few years have seen an 80 percent increase in catch, and a halving of price. A combination of global warming, abundant food in traps, size limits, depletion of cod and other ground-feeding fish, and fisheries management has exploded the lobster population.19
One glimpse of how unnatural the world has become (and how important human-directed evolution now is) can be found by examining the patterns of species extinction. Humans are truly handy at wiping things out. Within a few thousand years of arriving in Australia, we killed every land animal, reptile, and bird weighing more than 220 pounds: fifty-five species, erased.20 The current rate of extinction is estimated to be 1,000 to 10,000 times higher than the normal rate of 1 to 5 species per year.21 Even the poor frog named in Darwin’s honor is near kaput.22 But again, while many species collapse because of our actions, we are also creating and driving an unprecedented number of new species to satisfy our hunger for food, beauty, and companionship.
Just over a decade ago some scientists predicted one-third to one-half of all land animals would go extinct due to human climate tinkering.23 Now some of these same folks witness and document a surge of new species entering the niches left by disappearing creatures.24 So while we are in the throes of a great extinction, human design and accident have also led to an enormous explosion in variety and varietals.25 Cal State’s Madhusudan Katti has documented how urban encroachment sometimes leads to the extinction of many native species, but he also found rapid adaptation and the flourishing of various species in their new urban environments; one-fifth of known bird species now also inhabit urban areas.26 Many birds develop a different song in order to cope with background urban noise; San Francisco’s white-crowned sparrows are now far noisier. Migration patterns also change. European blackcap warblers now just stay in southern England rather than flying off to Africa. Eating habits, weight, and appearance alter as birds adopt new behaviors and food becomes constantly available in outdoor cafés. Mammals also adapt; Drs. Emilie Snell-Rood and Naomi Wick took skulls from ten mammal species and minutely measured the differences in brain size between those that lived in the country and those that lived in the Twin Cities. Two of the city species, white-footed mice and meadow voles, which have short reproductive cycles, grew bigger brains than their rural cousins.27 Apparently, at least for small mammals, it takes extra brains, guts, and street smarts to survive in the city.
Consider dogs. The closer a dog is to a coyote or a wolf, the more “natural” it is.28 Curiously, one of the closest things to a wolf in people’s homes is a pet Afghan hound.29 These big, thin, hairy, energetic animals, originally bred to hunt tigers, are now shampooed, combed, ribboned, and pampered for show in Manhattan. As we get further and further away from all-natural, we deliberately cultivate a crazy quilt of breeds, each designed for our own pleasure and purposes. Dog genetics is big business. We buy our pets for very specific purposes: to hunt, to guard, to look adorable in a Chanel bag, to make us look more macho. Buyer’s choice. Successful human-driven pet designs tend to get reinforced and reaffirmed rapidly through rebreeding. Breeds or mixes that do not please us rarely re-wild, much less survive and thrive; instead, they gradually die out, mongrelize, or end up euthanized in shelters. Often what we like and desire is completely unnatural; let a dozen Chihuahuas or Lhasa Apsos loose on the African plain and watch what happens. Likely few, if any, would survive. So exactly how much about today’s average dog is the product of natural selection?30
As we engineer and groom Fido’s successors, we sometimes see how overdesign, inbreeding, and monoculture can have consequences. Some Dobermans lick their own flanks until they bleed.31 Many bull terriers never seem to stop chasing their tails, which may be due to a malfunctioning CDH2 gene (research on this trait may one day also help illuminate a source of OCD in humans).32 So we continue to tweak breeding, including tracking genes, in order to fix or create an even cuter set of Labradoodles. Dogs are a perfect example of the two new core drivers of modern evolution: unnatural selection and nonrandom mutation. Man’s best friend is a reflection of our genetic whims and desires. But of course it’s not just dogs we are changing, and it’s not just veggies, farms, and pets we alter; we are also fundamentally changing ourselves.
Fat Humans, Fat Animals: Another Symptom?
There’s rumored to be another great big trend out there, one you may have pondered as you sat in the middle seat on your last flight. Just a few folks are getting just a little fatter, fast. Globally, obesity nearly doubled between 1980 and 2014.1 One in three Americans is now clinically obese (defined as a body mass index, or BMI, over 30).2
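For readers curious about the threshold itself, BMI is simply weight in kilograms divided by the square of height in meters. A minimal sketch (the function name and the sample height and weight are illustrative, not figures from the text):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# A hypothetical person 1.75 m tall weighing 95 kg:
print(round(bmi(95, 1.75), 1))  # 31.0 -- over the 30 cutoff, so clinically obese
```

The same person at 70 kg would score about 22.9, comfortably inside the "normal" band, which shows how sensitive the classification is to weight at a fixed height.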
We are now witnessing a historic turnaround: obesity kills more people than malnutrition does.3 The number of really fat folks in the developing world exceeds the number living in rich and developed lands; obese Mexicans are now a greater percentage of the population of Mexico than obese Americans are of the United States.4 The quick-and-dirty explanation offhandedly blames supersized soft drinks and fast food. Certainly that’s a significant part of the story, given that a drink in a movie theater can equal a quart of syrup water with the equivalent of twenty-seven sugar cubes dissolved inside.5 But it is not just food that has changed.
Obesity also results from too few steps.6 The substitution of machines for muscles is a trend that started long ago with waterworks, accelerated with the Industrial Revolution, and became dominant in the last half century. In 1960, at least half of all U.S. private-sector jobs required at least moderate physical activity. By 2010, fewer than one in five did. Very few of us have mules outside to help us do our work, and most of us are nowhere near as muscle-bound as our ancestors were a century ago. In rural areas, where hard manual labor was always the norm, the rapid adoption of tractors, combines, mowers, pickup trucks, and other substitutes for backbreaking work has often led to acute obesity, even when farmers are compared with quite stout urban dwellers.7 The average U.S. male now burns 142 fewer calories per day during work hours than his grandfather did. This factor alone predicts a 16 percent greater body weight for adult males today compared to 1960.8
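The scale of that 142-calorie gap is easy to underestimate. A back-of-the-envelope sketch, using the common 3,500-kilocalories-per-pound rule of thumb (an assumption, not a figure from the text; real metabolisms partly compensate as weight changes, which is why the cited study models a steady-state 16 percent rather than unbounded gain):

```python
# Naive energy-balance arithmetic for the workday-calorie gap described above.
daily_gap_kcal = 142                     # fewer calories burned per workday (from the text)
annual_gap_kcal = daily_gap_kcal * 365   # treat every day like a workday, for simplicity
pounds_equivalent = annual_gap_kcal / 3500  # rule-of-thumb kcal per pound of body fat

print(annual_gap_kcal)                # 51830 kcal per year
print(round(pounds_equivalent, 1))    # 14.8 -- if none of the gap were offset
```

The naive answer, nearly fifteen pounds a year, overstates what actually happens, because a heavier body burns more calories at rest; the point of the sketch is only that a seemingly trivial daily difference compounds into a very large annual one.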
As a species we have been conducting a large-scale experiment, over a very few decades, in what happens to a human body when it massively increases its average weight—a trend that may have long-term evolutionary consequences. Fat mothers tend to have fatter babies.9 Older mothers tend to have fatter babies; every five-year increase in Mom’s age raises by 14 percent the chances that her newborn will become obese. Some medical practices are now seeing obesity rates of 16 percent among infants.10 Problem is, kids born obese, or who gain weight early in life, are also prone to adult obesity and far more chronic illnesses.11 And they, in turn, are likely to have fatter babies.
For millennia, a chubby baby was a good sign, but too much of a good thing tips the scales (heavy-handed pun intended). A study of 25 million children across twenty-eight countries timed how long it takes them to run one mile; on average they were a minute and a half slower than their parents were at the same age.12 Every decade, average global physical fitness falls 5 percent. (Yikes!) We are, without a doubt, eating too much and working out too little. Got it. But what if there were other human and environmental factors that might also be driving the extreme obesity epidemic?
One clue lies within recent insurance claims . . . for dogs and cats. During 2007, one provider’s health-care claims related to obesity rose 19 percent.13 Six years later, the percentage of overweight dogs had increased by an additional 37 percent. Overweight cats? More than a 90 percent increase in seven years.14 Most of these animals are not eating or drinking the same fast food we are. Many are on quite controlled, standardized diets. So why is obesity exploding in this population as well? Perhaps we simply coddle and overfeed our furry domestic friends.15 But if that’s the case, then why does a broad-scale animal obesity study show that in both males and females, across twelve species, everything, everywhere, is getting fatter?
One might understand why city rats, Dumpster-diving for fast food and sugary drinks, might begin to look like Porky Pig. But why would the same phenomenon occur in lab mice and rats? One of the key rules of scientific research is to reduce the variables: keep things standardized so you can really compare results. If lab animals are consistently treated and fed the same way, why would the odds of a male lab macaque being obese have increased by 86 percent between 1971 and 2006? Obesity odds in female lab macaques increased by 144 percent.16
Were one tempted to dismiss these findings as simply the result of changes in lab food and practices, one might ask why the obesity index for pasture-fed horses has increased 19 percent. Is the grass in these fields really different, or is there something globally systemic going on?17