The Best American Science and Nature Writing 2004, by Tim Folger
Since its inception in 1915, the Best American series has become the premier annual showcase for the country's finest short fiction and nonfiction. For each volume, a series editor reads pieces from hundreds of periodicals, then selects between fifty and a hundred outstanding works. That selection is pared down to the twenty or so very best pieces by a guest editor who is widely recognized as a leading writer in his or her field. This unique system has helped make the Best American series the most respected, and most popular, of its kind.
The Best American Science and Nature Writing 2004, edited by Steven Pinker, is another "provocative and thoroughly enjoyable [collection] from start to finish" (Publishers Weekly). Here is the best and newest on science and nature: the psychology of suicide terrorism, desperate measures in surgery, the weird world of octopuses, Sex Week at Yale, the linguistics of click languages, the worst news about cloning, and much more.
Read an Excerpt
Horace’s summary of the purpose of literature, “to delight and instruct,” is also not a bad summary of the purpose of science and nature writing. The difference is not so much that a science essay gives more weight to the second inﬁnitive as that it unites the two. The best science writing delights by instructing. A good science essay, like any good essay, must be written with structure and style, but the best science essays accomplish something else. They give readers the blissful click, the satisfying aha!, of seeing a puzzling phenomenon explained.
A good example of what I have in mind comes from my days as a graduate student. Not from an experience in graduate school but from an experience living in the kind of apartment that graduate students can afford. One day its antiquated plumbing sprang a leak, and an articulate plumber (perhaps an underemployed Ph.D., I feared) explained what caused it. Water obeys Newton's second law. Water is dense. Water is incompressible. When you shut off a tap, a large incompressible mass moving at high speed has to decelerate very quickly. This imparts a substantial force to the pipes, like a car slamming into a wall, which eventually damages the threads and causes a leak. To deal with this problem, plumbers used to install a closed vertical section of pipe, a "pipe riser," near each faucet. When the faucet is shut, the water compresses the column of air in the riser, which acts like a shock absorber. Unfortunately, Henry's Law applies: gas under pressure is absorbed by a liquid. Over time, the air in the column dissolves into the water, which fills the pipe riser, rendering it useless. So every now and then a plumber has to bleed the system and let air back into the risers, a bit of preventive maintenance the landlord had neglected. It may not be the harmony of the spheres or the grandeur in this view of life, but the plumber's disquisition captured what I treasure most in science writing: the ability to show how a seemingly capricious occurrence falls out of laws of greater generality.
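The plumber's physics can even be put on the back of an envelope. A standard way to estimate the pressure spike from suddenly stopped flow is the Joukowsky relation, ΔP = ρ·c·Δv. The sketch below is purely illustrative and not from the essay; the density, wave speed, and flow velocity are typical assumed values, not measured ones.

```python
# Rough estimate of the "water hammer" pressure spike the plumber
# described, via the Joukowsky relation: dP = rho * c * dv.
# All constants below are illustrative assumptions.

RHO_WATER = 1000.0   # density of water, kg/m^3
WAVE_SPEED = 1400.0  # pressure-wave speed in a water-filled pipe, m/s (typical)

def water_hammer_spike(flow_velocity_m_s: float) -> float:
    """Pressure rise in pascals when flow at the given speed stops abruptly."""
    return RHO_WATER * WAVE_SPEED * flow_velocity_m_s

# A faucet line flowing at ~2 m/s, shut off instantly:
spike_pa = water_hammer_spike(2.0)
print(f"Pressure spike: {spike_pa / 101325:.1f} atm")  # ~ 27.6 atm
```

A spike of tens of atmospheres, delivered every time a tap snaps shut, is the "car slamming into a wall" that eventually works the threads loose, which is why the air-cushioned riser matters.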
Good science writing has to be good writing, and another graduate school experience led me to appreciate its ﬁrst priority, clarity.
The great Harvard psychologist Gordon Allport had died years before I entered the program, but he had written an “Epistle to Thesis Writers” that was still being handed down from generation to generation of doctoral candidates. Allport tried to steer students away from the clutter and fog of professional science prose and offered as a model an essay by a ten-year-old girl, who, he wrote, merited a higher degree “if not for the accuracy of her knowledge, then at least for the clarity of her diction”:
The bird that I am going to write about is the Owl. The Owl cannot see at all by day and at night is as blind as a bat.
I do not know much about the Owl, so I will go on to the beast I am going to choose. It is the Cow. The Cow is a mammal. It has six sides: right, left, an upper and below. At the back it has a tail on which hangs a brush. With this it sends the flies away so that they do not fall into the milk. The head is for the purpose of growing horns and so that the mouth can be somewhere. The horns are to butt with, and the mouth is to moo with. Under the cow hangs the milk. It is arranged for milking.
When people milk, the milk comes through and there is never any end to the supply. How the cow does it I have not yet realized, but it makes more and more. The cow has a ﬁne sense of smell; one can smell it far away. This is the reason for the fresh air in the country.
The man cow is called an ox. It is not a mammal. The cow does not eat much, but what it eats it eats twice, so that it gets enough. When it is hungry it moos, and when it says nothing it is because its inside is all full up with grass.
In assembling this collection I looked for essays that combined the explanatory depth of the plumber with the limpid prose of the young zoologist. Explanatory depth, surprisingly, is not that easy to find. The most common specimen is the science news story. A journalist flips through the contents of Science, Nature, and the New England Journal of Medicine, finds the article with the weirdest or most alarming or most bite-sized finding, gets a quote from an author, a supporter, and a critic, and reports that the discovery has overturned everything that scientists had always believed. I understand the pressures that shape this formula: the drama of iconoclasm, the demand by editors for news rather than pedagogy, a desire to show that science is a human activity among spirited antagonists rather than a revelation of the truth by white-coated priests.
But just as presidential campaigns can be distorted by the press's obsession with minute-by-minute changes in popularity polls, an understanding of science can be poorly served by news from the front about continual revolution. Conclusions from individual experiments, especially the most surprising ones, are more ephemeral than conclusions from the reviews and syntheses that can't be squeezed into a brief report in Science. The discovery-du-jour approach can whipsaw readers between contradictory claims of uneven worth or leave them with lasting misimpressions, such as that everything that is pleasurable is deadly for one reason or another.
And contrary to the idea (commonly associated with Karl Popper) that science is a kind of skeet shooting whose goal is to put a bullet through one hypothesis after another, the best science weaves observations into an explanatory narrative. "All the Old Sciences Have Starring Roles" by Chet Raymo (whose weekly science column graced the Boston Globe until his retirement this year) makes the point succinctly. Max Tegmark's mind-expanding "Parallel Universes" shows, by example and argument, how a powerful theory can not only organize sundry data but also lead to an exhilarating new conception of reality itself. Horace Judson's "The Stuff of Genes" reflects on the far-flung implications, for science and life, of the discovery of the structure of DNA, whose golden jubilee was marked in the year these essays appeared.
Clarity and style, happily, are not in short supply in today’s science writing (though in professional journals their frequency is commensurate with galliformes’ dentition). The genre continues to attract ﬁne writers of all ages, belying the plaint that the younger generation no longer cares about language. Of all the things that go into good science writing, I am fondest of prose that airs out a stuffy hall of scholarship and conveys its insights (or its absurdities) with irreverent wit, like Gregg Easterbrook’s “We’re All Gonna Die!,” Jonathan Rauch’s “Caring for Your Introvert,” Ron Rosenbaum’s “Sex Week at Yale,” and Robert Sapolsky’s “Bugs in the Brain.” But pride of place goes to the Bird Folks (the nom de plume of Mike O’Connor) at the Bird Watcher’s General Store in Orleans, Massachusetts, whose informative weekly column strikes a tone that is opposite to the worshipful sonorities found in much nature writing (parodied by Mark Twain as “Far in the empty sky a solitary esophagus slept upon motionless wing”).
“Ask the Bird Folks” could have seen the light of day only in a quirky rural tabloid, and I think the proliferation of other unconventional outlets will be a boon to unorthodox styles, formats, and, most important, opinions. Conventional wisdom can jell prematurely when a few commentators stake out the cramped real estate in national publications, and I actually believe the old cliché that the Internet is changing intellectual life by providing limitless outlets for unconventional ideas. As it happens, most of the Web pieces on my short list were dropped at the last minute because of various exigencies (wrong year, wrong country, too much overlap). I suspect that more and more of our best science writing will be found on sites like www.edge.org, scitechdaily.com (and its sister site artsandlettersdaily.com), www.spiked-online.com, butterfliesandwheels.com, techcentralstation.com, human-nature.com, and the many blogs by science-oriented journalists.
Science is a human activity, of course, and its rewards are not just discovery and explanation. Most scientists enjoy the mundane activity of gathering their kind of data, and in “Captivated” Meredith Small shares with her readers the pleasures of primatology (while making me understand for the ﬁrst time why primates groom).
And the passionate eccentrics who call themselves scientists are good grist for gossip and character studies, such as Jennet Conant’s proﬁle, which presents yet another consequence of the DNA revolution: the appearance on the scientiﬁc stage of the inimitable James Watson.
Perhaps more than is usual in these collections, my choices are slanted toward human behavior, and their methods shade into the social sciences. In part this reﬂects my own interests in psychology, linguistics, neuroscience, and evolution. It may also show that human interest makes for the most compelling writing. But most of all, it reﬂects the fact that the study of the mind will be among the liveliest frontiers of science in the coming century.
One of these frontiers is the application of genomic analyses to the mind and its products, often in highly unpredictable ways.
Judson alludes to the recent finding that the normal version of a gene for a speech and language disorder bears the statistical fingerprints of natural selection acting in our lineage after it split off from the lineage leading to chimpanzees. In one stroke this discovery obliterates the suspicion that the evolution of language and mind is permanently beyond the reach of rigorous science. The two articles by Nicholas Wade, "In Click Languages, an Echo of the Tongues of the Ancients" and "A Prolific Genghis Khan, It Seems, Helped People the World," explain two other remarkable applications of genomics to human evolution. One confirms the idea that aggressive polygyny could have affected human evolution by altering our species' genetic makeup (with a surprise appearance by one of the great villains of history). The other may shed light on what was thought to be forever unknowable: the first language spoken by our species.
In anticipating a steady turning of science to the mind and its products I am thinking not just of fancy technologies but of an extension to human affairs of the scientific mindset itself. This does not mean reducing the human condition to genes or neurons or primate behavior, but rather seeking to ascertain whether a claim about human affairs is consistent with the facts and with everything else we know about how the world works. Today this attitude is far from universal. What would happen if newspapers imposed the following rule: any pundit who comments on a trend and blames it on some factor must adduce evidence that (a) the trend is real, (b) the factor preceded the trend, and (c) that kind of factor causes that kind of trend? On many days the op-ed page would consist of a vast empty space.
Many of my choices upend some bit of conventional wisdom about human life. In “The Bloody Crossroads of Grammar and Politics,” Geoffrey Nunberg uses a smidgen of linguistics to expose a bit of nonsense about “correct grammar” and the decline of standards that had been latched on to by writers from David Skinner in the Weekly Standard to Louis Menand in The New Yorker. In “Where Have All the Lisas Gone?” Peggy Orenstein shows that trends in baby names are not inspired by the latest celebrities, the popularity of religion, or just about any other external cause. Virginia Postrel’s “The Design of Your Life” presents a sample of the many myths about aesthetics that she dispatches in her 2003 book The Substance of Style, such as the notion that people seek beauty only when their other needs are met, that styles are foisted upon a passive public by manipulative advertisers, and that economic value resides in practical goods and services. Jeffrey Friedman’s “A War on Obesity, Not the Obese” shows that we are not getting as fat as obesity statistics would suggest and that the solution to this health problem does not consist of ﬁnding the right people to blame.
“Sex Week at Yale” shows that being an academic is no protection against holding ludicrous beliefs about human motives, such as the dogmas about love and sex that are common in the humanities and helping professions.
Many misconceptions about behavior are harmless, but in these dangerous times some could lead to catastrophe. Steve Sailer’s “The Cousin Marriage Conundrum” correctly predicts that it would be unwise to try to graft a political system onto a society without understanding how the psychology of kinship and ethnic identification plays out in the local environment. Scott Atran’s “Genesis of Suicide Terrorism” debunks the bromide, endorsed by impressive lists of Nobel Prize winners and other right-thinking people in countless signed statements, that the root causes of terrorism are poverty and ignorance. The article is no more comforting to those who analyze suicide terrorism only in moralistic terms and insist that terrorists are crazed fanatics or callous psychopaths. Moral outrage is certainly an appropriate response to any slaying of innocents, and it is worth considering the possibility that the retaliation or preemption inspired by outrage is an effective countermeasure.
But moral condemnation is just one technique of behavior modification, and the fact that it feels right is no guarantee that it will work. If our goal is to minimize innocent deaths, we may have to set aside our moral intuitions long enough to try to understand the behavior in terms of cause and effect, and that means studying the beliefs, desires, and social dynamics of terrorist groups. I suspect that people from all over the political spectrum may be disturbed by Atran’s amoral analysis, but it is a mode of thought that we may have to get used to if we want to improve human affairs.
The interface between science and morals also motivates my remaining choices. Much science journalism today is hostile to scientists in much the same way that much political journalism in the post-Watergate era is hostile to politicians. Scientists are often depicted as arrogant Fausts or cruel Mengeles or greedy proﬁteers.
One article I rejected, for instance, denounced a research program that succeeded in modifying corn to synthesize pharmaceuticals cheaply despite its promise of vast enhancements to human health and a demonstrably trivial risk to the environment. Halos are awarded only to whistleblowers in ecology or climate science who warn us about the wages of our technological lifestyle. In Europe, left-leaning greens call for a Precautionary Principle in which applications of science should be banned or restricted if there is some chance they will have harmful effects, even in the absence of scientific evidence that they do. If the policy, aptly satirized as “Never do anything for the first time,” had been applied in the past, it would have ruled out every new technology from fire to fertilizers to malaria control to oral contraception. In the United States, right-leaning bioethicists see research to improve health and well-being as a promethean grab at immortality and a soul-deadening quest to rob us of the nobility of suffering.
This hostility is a big change from the reception that scientists enjoyed a generation ago. When I was a child, my favorite literary genre was the hagiography of a famous scientist; I was taught that Sabin and Salk were the pride of the Jewish people and Banting and Best the pride of Canada. No doubt we are all better off today with a more skeptical treatment of science, but we have swung too far in the direction of timidity about the applications of science and cynicism about the motives of scientists. Austin Bunn’s “The Bittersweet Science” and Atul Gawande’s “Desperate Measures” put a human face on uncured illness and remind us why aggressive medical pioneers were once revered: they lessened pain, inﬁrmity, and needless death, the most noble goal of human striving. There is a story waiting to be told on how the moral coloring of science (and other endeavors) in different periods can be distorted by quirks of the human moral sense (a fertile new research topic in psychology). Our neural circuits for morality are overly receptive to the trappings of purity, naturalness, and custom, and they are too easily impressed by gravitas, indignation, conspicuous asceticism, and other advertisements of saintliness that may have scant correlation with actions that make people better off.
Genetics, neuroscience, and evolutionary biology will call into question other moral intuitions. Reams of nonsense have been written about cloning, genes linked with personality, and pharmaceuticals that may enhance mood, concentration, and memory. Some of the non sequiturs are so bizarre that they make me wonder whether the authors have fully assimilated what Francis Crick calls “the astonishing hypothesis” (the idea that all thought and feeling consist of physiological activity in the brain) and instead tacitly believe that human choice and individuality reside in an autonomous soul. Philip Boffey’s “Fearing the Worst Should Anyone Produce a Cloned Baby,” Daniel Dennett’s “The Mythical Threat of Genetic Determinism,” and Ronald Bailey’s “The Battle for Your Brain” are breaths of cool thinking in these overheated arenas.
I end with an indulgence. One article that particularly drew me in was, of all things, “Through the Eye of an Octopus.” What could a cognitive scientist ﬁnd so interesting about the secret life of cephalopods? It is not just that the piece reveals an astonishing spectacle in the natural world, and it’s not just that the protagonist is named Steve. The reasons are twofold, and it is not too much of a stretch to say that they illustrate another of my favorite themes in science writing: the interconnectedness of all knowledge, no matter how remote the disciplines.
My first reason for liking the article is linguistic. In one of Gary Larson’s Far Side cartoons, a bespectacled octopus at a podium addresses his conspecifics: “Fellow octopi, or octopuses . . . octopi? Dang, it’s hard to start a speech with this crowd.” Judging from an Internet search, human scientists also go both ways on this issue.
But Eric Scigliano consistently refers to his subjects as octopuses, and he has the logic of language on his side. The -us in octopus is not the Latin masculine noun ending of alumnus and fungus, which is replaced by -i in the plural. No, it is part of the Greek word pous, meaning foot, and turning it into -pi makes no sense. Nor did English import the Greek plural as an irregular form, as it did with criterion-criteria and stigma-stigmata; that would have given us octopodes.
An octopus is the creature that owns the enumerated feet, not the assembly of feet itself. The elegant algorithm that computes the properties of complex words (described in my book Words and Rules) ensures that these synecdochic compounds have regular plurals, even when they are built around irregular nouns. Hence we refer to several members of the extinct family of cats as saber-tooths (not saber-teeth). We similarly talk about lowlifes, still lifes, tenderfoots, ﬂatfoots, and, in The Lord of the Rings, Proudfoots. And by this linguistic logic, we should identify more than one octopus as octopuses.
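The logic behind those regular plurals can be sketched as a toy rule: irregular forms are stored exceptions, but a "headless" compound (one whose referent is not an instance of its head noun: a saber-tooth is a cat, not a tooth) loses access to the stored form and falls back on the regular -s rule. This is only an illustrative caricature of the idea, not the model in Words and Rules; the word lists are my own assumptions.

```python
# Toy sketch of the headless-compound idea: irregular plurals are stored
# exceptions, but headless compounds bypass them and take the regular rule.
# The word lists are illustrative, not a real lexicon.

IRREGULAR = {"tooth": "teeth", "foot": "feet", "life": "lives", "man": "men"}
HEADLESS = {"saber-tooth", "lowlife", "still life", "tenderfoot", "flatfoot", "octopus"}

def pluralize(noun: str) -> str:
    if noun not in HEADLESS:
        # A non-headless word (or compound built on one) can use a stored
        # irregular form for its head noun.
        for singular, plural in IRREGULAR.items():
            if noun == singular or noun.endswith(singular):
                return noun[: -len(singular)] + plural
    # Default: the regular rule applies.
    return noun + ("es" if noun.endswith(("s", "x", "ch", "sh")) else "s")

print(pluralize("tooth"))        # teeth
print(pluralize("saber-tooth"))  # saber-tooths
print(pluralize("octopus"))      # octopuses
```

The same fallback that yields saber-tooths and lowlifes is what makes octopuses, not octopi, the principled choice.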
The other reason I liked the article has to do with human evolution.
It’s lonely to be one of the few species with advanced powers of problem solving, and it’s scientiﬁcally frustrating too. How can we test ideas about the evolution of intelligence if it happened only once? One way is to ﬁnd smarter-than-average species from widely separated branches of the tree of life and see what else distinguishes them from their duller cousins. Studies of other smart creatures like dolphins and wolves suggest that group living is one of the traits that sets the stage for the evolution of higher intelligence.
But this does not explain why humans are so much smarter than other social species. I have always suspected that the ancestral ape that spawned our lineage must have been dealt a number of traits that made higher intelligence worth its metabolic cost. And I speculated in How the Mind Works that one of those traits is the possession of hands. Evolution does not reward cerebration for its own sake but only thoughts that can be put to use in adaptive ways, such as manipulating the world to one’s advantage. If this idea is right, intelligence increased in our ancestors partly because they were equipped with levers of inﬂuence on the world, namely the grippers found at the ends of their two arms. How pleasing to learn that intelligence also evolved in a species that has eight of them.
Copyright © 2004 by Houghton Mifflin. Introduction copyright © 2004 by Steven Pinker. Reprinted by permission of Houghton Mifflin Company.