The Information: A History, a Theory, a Flood by James Gleick
James Gleick, the author of the best sellers Chaos and Genius, now brings us a work just as astonishing and masterly: a revelatory chronicle and meditation that shows how information has become the modern era’s defining quality—the blood, the fuel, the vital principle of our world.
The story of information begins in a time profoundly unlike our own, when every thought and utterance vanishes as soon as it is born. From the invention of scripts and alphabets to the long-misunderstood talking drums of Africa, Gleick tells the story of information technologies that changed the very nature of human consciousness. He provides portraits of the key figures contributing to the inexorable development of our modern understanding of information: Charles Babbage, the idiosyncratic inventor of the first great mechanical computer; Ada Byron, the brilliant and doomed daughter of the poet, who became the first true programmer; pivotal figures like Samuel Morse and Alan Turing; and Claude Shannon, the creator of information theory itself.
And then the information age arrives. Citizens of this world become experts willy-nilly: aficionados of bits and bytes. And we sometimes feel we are drowning, swept by a deluge of signs and signals, news and images, blogs and tweets. The Information is the story of how we got here and where we are heading.
From the Hardcover edition.
“So ambitious, illuminating and sexily theoretical that it will amount to aspirational reading for many of those who have the mettle to tackle it…The Information is to the nature, history and significance of data what the beach is to sand.” –New York Times
“[A] tour de force…This is intellectual history of tremendous verve, insight, and significance. Unfailingly spirited, often poetic, Gleick recharges our astonishment over the complexity and resonance of the digital sphere and ponders our hunger for connectedness…Destined to be a science classic, best-seller Gleick’s dynamic history of information will be one of the biggest nonfiction books of the year.” –Booklist, starred review
“With his brilliant ability to synthesize mounds of details and to tell rich stories, Gleick leads us on a journey from one form of communication information to another…Gleick’s exceptional history of culture concludes that information is indeed the blood, the fuel, and the vital principle on which our world runs.” –Publishers Weekly, starred review
“Rich and fascinating.” –Washington Post
"No author is better equipped for such a wide-ranging tour than Mr. Gleick. Some writers excel at crafting a historical narrative, others at elucidating esoteric theories, still others at humanizing scientists. Mr. Gleick is a master of all these skills." –Wall Street Journal
“Gleick presses rousing tales from the history of human communication into the service of one Very Big Idea…he does what only the best science writers can: take a subject of which most of us are only peripherally aware and put it at the center of the universe.” –Time
“A wide-ranging, deeply researched and delightfully engaging history…” –Los Angeles Times
“The gifted science writer James Gleick explains how we’ve progressed from seeing information as the expression of human thought and emotion to looking at it as a commodity that can be processed, like wheat or plutonium. It’s a long, complicated, and important story, and in Gleick’s hands it’s also a mesmerizing one…As a celebration of human ingenuity, The Information is a deeply hopeful book.” –Nicholas Carr, Daily Beast
“A grand narrative if ever there was one…Gleick provides lucid expositions for readers who are up to following the science and suggestive analogies for those who are just reading for the plot. And there are anecdotes that every reader can enjoy…A prodigious intellectual survey.” –New York Times Book Review
“A highly ambitious and generally brilliant effort to tie together centuries of disparate scientific efforts to understand information as a meaningful concept…By the close of the book you cannot think of information as you might have before. It has become, quite palpably, something different than almost anything we encounter: resistant to decay and capable of perfect self-reproduction. It outlasts the organic beings who create it, and, by replication, the inorganic mediums used to store it. The Information—not unlike other science books that tackle big human quests for understanding—at times bears more than a passing resemblance to a spiritual text.” –Slate
“When collected together in this coherent historical narrative, [his observations] do feel ‘revelatory,’ as his publisher claims…Gleick is wrestling with truly profound material, and so will the reader. This is not a book you will race through on a single plane trip. It is a slow, satisfying meal.” –Columbia Journalism Review
"Accessible and engrossing."
“The author’s skills as an interpreter of science shine…for completist cybergeeks and infojunkies, the book delivers a solid summary of a dense, complex subject.”
“Extraordinary in its sweep…Gleick’s story is beautifully told, extensively sourced, and continually surprising.” –Brainiac, Boston Globe online
“Entertaining, funny and clever.” –New Scientist
“A brilliant, panoramic view of how we save and communicate knowledge...and provides thrilling portraits of the geniuses behind the inventions. Provocative and illuminating.” –People
“A commanding chronicle of the information revolution…tantalizing.” –philly.com
“An ambitious, sprawling work.” –Kirkus
“Absorbing…The Information is lyrical, patient, impeccably researched, and full of interesting digressions.” –The Boston Globe
“Tremendously enjoyable. Gleick has an eye and ear for the catchy detail and observation…offers a broad and fascinating foundation, impressive in its reach. A very good read, certainly recommended.” –The Complete Review
“A powerful and rigorous and at times very moving history of information…You can dip into The Information at just about any point and emerge with a magnificent detail.” –Time, Top 10 of Everything 2011
“Heady…This intellectual history is intoxicating—thanks to Gleick’s clear mind, magpie-styled research and explanatory verve.” –Cleveland.com
“To write a history of information…it’s beyond ambitious—it’s audacious. But James Gleick pulls it off…a gracefully written book.” –USA Today
“A book about everything…Gleick sees the world as an endlessly unfolding opportunity in which ‘creatures of the information’ might just recognize themselves.” –Shelfari
“An interesting and detailed history of how we’ve moved from an alphabet to words, writing, dictionaries, etc.” –Alpha
“Imaginatively conceived and staggeringly researched…a transformative work.” –The Phoenix.com
“[Gleick] remains a gifted writer with a passion for a subject that would easily drown many of us.” –About.com
“Expertly draws out neglected names and stories from history…Gleick’s skill as an explicator of counterintuitive concepts makes the chapters on logic, the stuff even most philosophy majors slept through in class, brim with tension.” –Oregonlive.com
“This is the page-turner you never knew you desperately wanted to read.” –The Stranger Slog
“This is an amazing book. If you have any designs on being a professor of information, you should read this book slowly and thoughtfully.” –ALA TechSource
“The most ambitious, compelling, insert-word-of-intellectual-awe-here book to read this year.” –The Atlantic
“Wide-ranging and fascinating.” –New York Journal of Books
“The most comprehensive book written, to date, about information. An amazingly erudite and yet highly readable account…amongst the most profound books written about technology.” –TechCrunch, TCTV
“Very fascinating…It will make readers see the world more intelligently than before. Essential.” –Choice, Current Reviews for Academic Libraries
“In his fascinating new history of the rise and the breadth of today’s communication age, Gleick sheds light on the many ways we impart and receive information.” –New York Times Styles
“Gleick is one of the great science writers of our age…The Information is an entertaining and instructive romp through the history of information technologies…for anyone interested in learning more about the important and ever-more-prominent role that information plays in our society, the book is not only a pleasure to read, it is well worth reading.” –American Scientist
“Grand, lucid and awe-inspiring…information is about a lot more than what human beings have to say to each other. It’s the very stuff of reality, and never have its mysteries been offered up with more elegance or aplomb.” –Salon.com best of 2011
“Magnificent…this elegant, insightful study reminds us that we have always been adrift in an incomprehensible universe.” –LA Times best books of 2011
“The Information is lyrical, patient, impeccably researched, and jammed with interesting—um, well—information.” –Boston Globe Best of 2011
“A sprawling yet fascinating book by an acclaimed American science writer, The Information ranges from biology to particle physics and explores the links between information, communications, data and meaning from earliest times to the present day.” –The Economist
Praise for James Gleick’s
“An awe-inspiring book. Reading it gave me the sensation that someone had just found the light switch.”
“Enthralling. Full of beautifully strange and strangely beautiful ideas.”
“The clearest statement I have seen of the true spirit of science.”
—Freeman J. Dyson
“A brilliant and engaging study in the paradoxes of the scientific imagination.”
Think your inbox is jammed now, your attention span overtaxed? It's only the beginning, writes pop-science writer Gleick (Isaac Newton, 2003, etc.) in this tour of information and the theory that goes along with it.
It has been a long progression toward the infoglut of today. The author chooses as a logical if unanticipated starting point the talking drums of Africa, an information technology that delivers a satisfying amount of signal in all the noise. From those drums to Morse code, and indeed to binary signaling, is a pretty short hop—and one that Gleick takes, writing along the way about such things as how Samuel Morse and his partner decided which letters were the most used in English, and therefore merited the shortest sequences of dot and dash. The author tours through the earliest information technologies—the intaglio scratches of stone and bone on prehistoric caves, the emergent ideographs of the first Chinese scripts and so on—before getting into the meatier mathematics of more recent times, which led Charles Babbage, say, to ponder the workings of the first oh-so-clunky computers. As Gleick writes, Babbage surrounded himself with fellow science nerds who agreed to write and send scientific papers to one another every six months, though if a member were delayed by a year, "it shall be taken for granted that his relatives had shut him up as insane." The discussion becomes more complex with the intersection of modern physics. In the emergence of Claude Shannon and Alan Turing's first stirrings of modern information theory, the author's skills as an interpreter of science shine. None of his discussion will be news to readers of Tim Wu's exemplary The Master Switch (2010) or of the old Coevolution Quarterly, but Gleick covers the ground in a way that no other book quite manages to do.
Gleick loves the layered detail, which might cause some to sigh, "TMI." But for completist cybergeeks and infojunkies, the book delivers a solid summary of a dense, complex subject.
- Knopf Doubleday Publishing Group
- Publication date:
- Sold by:
- Random House
- NOOK Book
- Sales rank:
- File size:
- 6 MB
Read an Excerpt
Into the Meme Pool
(You Parasitize My Brain)
When I muse about memes, I often ﬁnd myself picturing an ephemeral ﬂickering pattern of sparks leaping from brain to brain, screaming “Me, me!”
—Douglas Hofstadter (1983)
“Now through the very universality of its structures, starting with the code, the biosphere looks like the product of a unique event,” Jacques Monod wrote in 1970. “The universe was not pregnant with life, nor the biosphere with man. Our number came up in the Monte Carlo game. Is it any wonder if, like a person who has just made a million at the casino, we feel a little strange and a little unreal?”
Monod, the Parisian biologist who shared the Nobel Prize for working out the role of messenger RNA in the transfer of genetic information, was not alone in thinking of the biosphere as more than a notional place: an entity, composed of all the earth’s life-forms, simple and complex, teeming with information, replicating and evolving, coding from one level of abstraction to the next. This view of life was more abstract—more mathematical—than anything Darwin had imagined, but he would have recognized its basic principles. Natural selection directs the whole show. Now biologists, having absorbed the methods and vocabulary of communications science, went further to make their own contributions to the understanding of information itself. Monod proposed an analogy: Just as the biosphere stands above the world of nonliving matter, so an “abstract kingdom” rises above the biosphere. The denizens of this kingdom? Ideas.
Ideas have retained some of the properties of organisms. Like them, they tend to perpetuate their structure and to breed; they too can fuse, recombine, segregate their content; indeed they too can evolve, and in this evolution selection must surely play an important role.
Ideas have “spreading power,” he noted—“infectivity, as it were”—and some more than others. An example of an infectious idea might be a religious ideology that gains sway over a large group of people. The American neurophysiologist Roger Sperry had put forward a similar notion several years earlier, arguing that ideas are “just as real” as the neurons they inhabit. Ideas have power, he said.
Ideas cause ideas and help evolve new ideas. They interact with each other and with other mental forces in the same brain, in neighboring brains, and thanks to global communication, in far distant, foreign brains. And they also interact with the external surroundings to produce in toto a burstwise advance in evolution that is far beyond anything to hit the evolutionary scene yet. . . .
I shall not hazard a theory of the selection of ideas.
No need. Others were willing.
Richard Dawkins made his own connection between the evolution of genes and the evolution of ideas. His essential actor was the replicator, and it scarcely mattered whether replicators were made of nucleic acid. His rule is “All life evolves by the differential survival of replicating entities.” Wherever there is life, there must be replicators. Perhaps on other worlds replicators could arise in a silicon-based chemistry—or in no chemistry at all.
What would it mean for a replicator to exist without chemistry? “I think that a new kind of replicator has recently emerged on this planet,” he proclaimed at the end of his ﬁrst book, in 1976. “It is staring us in the face. It is still in its infancy, still drifting clumsily about in its primeval soup, but already it is achieving evolutionary change at a rate that leaves the old gene panting far behind.” That “soup” is human culture; the vector of transmission is language; and the spawning ground is the brain.
For this bodiless replicator itself, Dawkins proposed a name. He called it the meme, and it became his most memorable invention, far more inﬂuential than his selﬁsh genes or his later proselytizing against religiosity. “Memes propagate themselves in the meme pool by leaping from brain to brain via a process which, in the broad sense, can be called imitation,” he wrote. They compete with one another for limited resources: brain time or bandwidth. They compete most of all for attention. For example:
Ideas. Whether an idea arises uniquely or reappears many times, it may thrive in the meme pool or it may dwindle and vanish. The belief in God is an example Dawkins offers—an ancient idea, replicating itself not just in words but in music and art. The belief that the earth orbits the sun is no less a meme, competing with others for survival. (Truth may be a helpful quality for a meme, but it is only one among many.)
Tunes. This tune [musical notation not reproduced] has spread for centuries across several continents. This one [musical notation not reproduced], a notorious though shorter-lived invader of brains, overran an immense population many times faster.
Catchphrases. One text snippet, “What hath God wrought?” appeared early and spread rapidly in more than one medium. Another, “Read my lips,” charted a peculiar path through late twentieth-century America. “Survival of the ﬁttest” is a meme that, like other memes, mutates wildly (“survival of the fattest”; “survival of the sickest”; “survival of the fakest”; “survival of the twittest”; . . . ).
Images. In Isaac Newton’s lifetime, no more than a few thousand people had any idea what he looked like, though he was one of England’s most famous men, yet now millions of people have quite a clear idea—based on replicas of copies of rather poorly painted portraits. Even more pervasive and indelible are the smile of Mona Lisa, The Scream of Edvard Munch, and the silhouettes of various ﬁctional extraterrestrials. These are memes, living a life of their own, independent of any physical reality. “This may not be what George Washington looked like then,” a tour guide was overheard saying of the Gilbert Stuart painting at the Metropolitan Museum of Art, “but this is what he looks like now.” Exactly.
Memes emerge in brains and travel outward, establishing beachheads on paper and celluloid and silicon and anywhere else information can go. They are not to be thought of as elementary particles but as organisms. The number three is not a meme; nor is the color blue, nor any simple thought, any more than a single nucleotide can be a gene. Memes are complex units, distinct and memorable—units with staying power. Also, an object is not a meme. The hula hoop is not a meme; it is made of plastic, not of bits. When this species of toy spread worldwide in a mad epidemic in 1958, it was the product, the physical manifestation of a meme, or memes: the craving for hula hoops; the swaying, swinging, twirling skill set of hula-hooping. The hula hoop itself is a meme vehicle. So, for that matter, is each human hula hooper—a strikingly effective meme vehicle, in the sense neatly explained by the philosopher Daniel Dennett: “A wagon with spoked wheels carries not only grain or freight from place to place; it carries the brilliant idea of a wagon with spoked wheels from mind to mind.” Hula hoopers did that for the hula hoop’s memes—and in 1958 they found a new transmission vector, broadcast television, sending its messages immeasurably faster and farther than any wagon. The moving image of the hula hooper seduced new minds by hundreds, and then by thousands, and then by millions. The meme is not the dancer but the dance.
We are their vehicles and their enablers. For most of our biological history they existed ﬂeetingly; their main mode of transmission was the one called “word of mouth.” Lately, however, they have managed to adhere in solid substance: clay tablets, cave walls, paper sheets. They achieve longevity through our pens and printing presses, magnetic tapes and optical disks. They spread via broadcast towers and digital networks. Memes may be stories, recipes, skills, legends, and fashions. We copy them, one person at a time. Alternatively, in Dawkins’s meme-centered perspective, they copy themselves. At ﬁrst some of Dawkins’s readers wondered how literally to take that. Did he mean to give memes anthropomorphic desires, intentions, and goals? It was the selﬁsh gene all over again. (Typical salvo: “Genes cannot be selﬁsh or unselﬁsh, any more than atoms can be jealous, elephants abstract or biscuits teleological.” Typical rebuttal: a reminder that selﬁshness is deﬁned by the geneticist as the tendency to increase one’s chances of survival relative to its competitors.)
Dawkins’s way of speaking was not meant to suggest that memes are conscious actors, only that they are entities with interests that can be furthered by natural selection. Their interests are not our interests. “A meme,” Dennett says, “is an information packet with attitude.” When we speak of ﬁghting for a principle or dying for an idea, we may be more literal than we know. “To die for an idea; it is unquestionably noble,” H. L. Mencken wrote. “But how much nobler it would be if men died for ideas that were true!”
Tinker, tailor, soldier, sailor . . . Rhyme and rhythm help people remember bits of text. Or: rhyme and rhythm help bits of text get remembered. Rhyme and rhythm are qualities that aid a meme’s survival, just as strength and speed aid an animal’s. Patterned language has an evolutionary advantage. Rhyme, rhythm, and reason—for reason, too, is a form of pattern. I was promised on a time to have reason for my rhyme; from that time unto this season, I received nor rhyme nor reason.
Like genes, memes have effects on the wide world beyond themselves: phenotypic effects. In some cases (the meme for making ﬁre; for wearing clothes; for the resurrection of Jesus) the effects can be powerful indeed. As they broadcast their inﬂuence on the world, memes thus inﬂuence the conditions affecting their own chances of survival. The meme or memes composing Morse code had strong positive feedback effects. “I believe that, given the right conditions, replicators automatically band together to create systems, or machines, that carry them around and work to favour their continued replication,” wrote Dawkins. Some memes have evident beneﬁts for their human hosts (“look before you leap,” knowledge of CPR, belief in hand washing before cooking), but memetic success and genetic success are not the same. Memes can replicate with impressive virulence while leaving swaths of collateral damage—patent medicines and psychic surgery, astrology and satanism, racist myths, superstitions, and (a special case) computer viruses. In a way, these are the most interesting—the memes that thrive to their hosts’ detriment, such as the idea that suicide bombers will ﬁnd their reward in heaven.
When Dawkins ﬁrst ﬂoated the meme meme, Nicholas Humphrey, an evolutionary psychologist, said immediately that these entities should be considered “living structures, not just metaphorically but technically”:
When you plant a fertile meme in my mind you literally parasitize my brain, turning it into a vehicle for the meme’s propagation in just the way that a virus may parasitize the genetic mechanism of a host cell. And this isn’t just a way of talking—the meme for, say, “belief in life after death” is actually realized physically, millions of times over, as a structure in the nervous systems of individual men the world over.
Most early readers of The Selﬁsh Gene passed over memes as a fanciful afterthought, but the pioneering ethologist W. D. Hamilton, reviewing the book for Science, ventured this prediction:
Hard as this term may be to delimit—it surely must be harder than gene, which is bad enough—I suspect that it will soon be in common use by biologists and, one hopes, by philosophers, linguists, and others as well and that it may become absorbed as far as the word “gene” has been into everyday speech.
Memes could travel wordlessly even before language was born. Plain mimicry is enough to replicate knowledge—how to chip an arrowhead or start a ﬁre. Among animals, chimpanzees and gorillas are known to acquire behaviors by imitation. Some species of songbirds learn their songs, or at least song variants, after hearing them from neighboring birds (or, more recently, from ornithologists with audio players). Birds develop song repertoires and song dialects—in short, they exhibit a bird-song culture that predates human culture by eons. These special cases notwithstanding, for most of human history memes and language have gone hand in glove. (Clichés are memes.) Language serves as culture’s ﬁrst catalyst. It supersedes mere imitation, spreading knowledge by abstraction and encoding.
Perhaps the analogy with disease was inevitable. Before anyone understood anything of epidemiology, its language was applied to species of information. An emotion can be infectious, a tune catchy, a habit contagious. “From look to look, contagious through the crowd / The panic runs,” wrote the poet James Thomson in 1730. Lust, likewise, according to Milton: “Eve, whose eye darted contagious ﬁre.” But only in the new millennium, in the time of global electronic transmission, has the identiﬁcation become second nature. Ours is the age of virality: viral education, viral marketing, viral e-mail and video and networking. Researchers studying the Internet itself as a medium—crowdsourcing, collective attention, social networking, and resource allocation—employ not only the language but also the mathematical principles of epidemiology.
One of the ﬁrst to use the terms viral text and viral sentences seems to have been a reader of Dawkins named Stephen Walton of New York City, corresponding in 1981 with Douglas Hofstadter. Thinking logically—perhaps in the mode of a computer—Walton proposed simple self-replicating sentences along the lines of “Say me!” “Copy me!” and “If you copy me, I’ll grant you three wishes!” Hofstadter, then a columnist for Scientiﬁc American, found the term viral text itself to be even catchier.
Well, now, Walton’s own viral text, as you can see here before your eyes, has managed to commandeer the facilities of a very powerful host—an entire magazine and printing press and distribution service. It has leapt aboard and is now—even as you read this viral sentence—propagating itself madly throughout the ideosphere!
(In the early 1980s, a magazine with a print circulation of 700,000 still seemed like a powerful communications platform.) Hofstadter gaily declared himself infected by the meme meme.
One source of resistance—or at least unease—was the shoving of us humans toward the wings. It was bad enough to say that a person is merely a gene’s way of making more genes. Now humans are to be considered as vehicles for the propagation of memes, too. No one likes to be called a puppet. Dennett summed up the problem this way: “I don’t know about you, but I am not initially attracted by the idea of my brain as a sort of dung heap in which the larvae of other people’s ideas renew themselves, before sending out copies of themselves in an informational diaspora. . . . Who’s in charge, according to this vision—we or our memes?”
He answered his own question by reminding us that, like it or not, we are seldom “in charge” of our own minds. He might have quoted Freud; instead he quoted Mozart (or so he thought):
In the night when I cannot sleep, thoughts crowd into my mind. . . .
Whence and how do they come? I do not know and I have nothing to do
with it. Those which please me I keep in my head and hum them.
Later Dennett was informed that this well-known quotation was not Mozart’s after all. It had taken on a life of its own; it was a fairly successful meme.
For anyone taken with the idea of memes, the landscape was changing faster than Dawkins had imagined possible in 1976, when he wrote, “The computers in which memes live are human brains.” By 1989, the time of the second edition of The Selﬁsh Gene, having become an adept programmer himself, he had to amend that: “It was obviously predictable that manufactured electronic computers, too, would eventually play host to self-replicating patterns of information.” Information was passing from one computer to another “when their owners pass ﬂoppy discs around,” and he could see another phenomenon on the near horizon: computers connected in networks. “Many of them,” he wrote, “are literally wired up together in electronic mail exchange. . . . It is a perfect milieu for self-replicating programs to ﬂourish.” Indeed, the Internet was in its birth throes. Not only did it provide memes with a nutrient-rich culture medium; it also gave wings to the idea of memes. Meme itself quickly became an Internet buzzword. Awareness of memes fostered their spread.
A notorious example of a meme that could not have emerged in pre-Internet culture was the phrase “jumped the shark.” Loopy self-reference characterized every phase of its existence. To jump the shark means to pass a peak of quality or popularity and begin an irreversible decline. The phrase was thought to have been used ﬁrst in 1985 by a college student named Sean J. Connolly, in reference to a certain television series. The origin of the phrase requires a certain amount of explanation without which it could not have been initially understood. Perhaps for that reason, there is no recorded usage until 1997, when Connolly’s roommate, Jon Hein, registered the domain name jumptheshark.com and created a web site devoted to its promotion. The web site soon featured a list of frequently asked questions:
Q. Did “jump the shark” originate from this web site, or did you create the site to capitalize on the phrase?
A. This site went up December 24, 1997 and gave birth to the phrase “jump the shark.” As the site continues to grow in popularity, the term has become more commonplace. The site is the chicken, the egg, and now a Catch-22.
It spread to more traditional media in the next year; Maureen Dowd devoted a column to explaining it in The New York Times in 2001; in 2003 the same newspaper’s “On Language” columnist, William Saﬁre, called it “the popular culture’s phrase of the year”; soon after that, people were using the phrase in speech and in print without self-consciousness—no quotation marks or explanation—and eventually, inevitably, various cultural observers asked, “Has ‘jump the shark’ jumped the shark?” (“Granted, Jump the Shark is a brilliant cultural concept. . . . But now the damn thing is everywhere.”) Like any good meme, it spawned mutations. The “jumping the shark” entry in Wikipedia advised in 2009, “See also: jumping the couch; nuking the fridge.”
Is this science? In his 1983 column, Hofstadter proposed the obvious memetic label for such a discipline: memetics. The study of memes has attracted researchers from ﬁelds as far apart as computer science and microbiology. In bioinformatics, chain letters are an object of study. They are memes; they have evolutionary histories. The very purpose of a chain letter is replication; whatever else a chain letter may say, it embodies one message: Copy me. One student of chain-letter evolution, Daniel W. VanArsdale, listed many variants, in chain letters and even earlier texts: “Make seven copies of it exactly as it is written”; “Copy this in full and send to nine friends”; “And if any man shall take away from the words of the book of this prophecy, God shall take away his part out of the book of life” [Revelation 22:19]. Chain letters ﬂourished with the help of a new nineteenth-century technology: “carbonic paper,” sandwiched between sheets of writing paper in stacks. Then carbon paper made a symbiotic partnership with another technology, the typewriter. Viral outbreaks of chain letters occurred all through the early twentieth century.
“An unusual chain-letter reached Quincy during the latter part of 1933,” wrote a local Illinois historian. “So rapidly did the chain-letter fad develop symptoms of mass hysteria and spread throughout the United States, that by 1935–1936 the Post Ofﬁce Department, as well as agencies of public opinion, had to take a hand in suppressing the movement.” He provided a sample—a meme motivating its human carriers with promises and threats:
We trust in God. He supplies our needs.
Mrs. F. Streuzel . . . . . . . .Mich.
Mrs. A. Ford . . . . . . . . . .Chicago, Ill.
Mrs. K. Adkins . . . . . . . . Chicago, Ill. etc.
Copy the above names, omitting the first. Add your name last. Mail it to five persons who you wish prosperity to. The chain was started by an American Colonel and must be mailed 24 hours after receiving it. This will bring prosperity within 9 days after mailing it.
Mrs. Sanford won $3,000. Mrs. Andres won $1,000.
Mrs. Howe who broke the chain lost everything she possessed.
The chain grows a definite power over the expected word.
DO NOT BREAK THE CHAIN.
Two subsequent technologies, when their use became widespread, provided orders-of-magnitude boosts in chain-letter fecundity: photocopying (c. 1950) and e-mail (c. 1995). One team of information scientists—Charles H. Bennett from IBM in New York and Ming Li and Bin Ma from Ontario, Canada—inspired by a chance conversation on a hike in the Hong Kong mountains, began an analysis of a set of chain letters collected during the photocopier era. They had thirty-three, all variants of a single letter, with mutations in the form of misspellings, omissions, and transposed words and phrases. “These letters have passed from host to host, mutating and evolving,” they reported.
Like a gene, their average length is about 2,000 characters. Like a potent virus, the letter threatens to kill you and induces you to pass it on to your “friends and associates”—some variation of this letter has probably reached millions of people. Like an inheritable trait, it promises benefits for you and the people you pass it on to. Like genomes, chain letters undergo natural selection and sometimes parts even get transferred between coexisting “species.”
Reaching beyond these appealing metaphors, they set out to use the letters as a “test bed” for algorithms used in evolutionary biology. The algorithms were designed to take the genomes of various modern creatures and work backward, by inference and deduction, to reconstruct their phylogeny—their evolutionary trees. If these mathematical methods worked with genes, the scientists suggested, they should work with chain letters, too. In both cases the researchers were able to verify mutation rates and relatedness measures.
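The idea can be sketched in miniature. The following toy Python code—an illustration of the general distance-based approach, not the researchers’ actual algorithm or data—treats a few invented chain-letter variants as “genomes,” measures their pairwise edit distance, and merges the closest clusters first, the way simple average-linkage phylogeny methods build an evolutionary tree:

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two strings, by dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def build_tree(variants: dict) -> tuple:
    """Greedy average-linkage clustering: repeatedly merge the two closest
    clusters of letters, yielding a nested-tuple 'phylogeny'."""
    clusters = {name: name for name in variants}        # cluster label -> subtree
    texts = {name: [t] for name, t in variants.items()} # cluster label -> members
    while len(clusters) > 1:
        names = list(clusters)
        x, y = min(
            ((a, b) for i, a in enumerate(names) for b in names[i + 1:]),
            key=lambda p: sum(edit_distance(s, t)
                              for s in texts[p[0]] for t in texts[p[1]])
                          / (len(texts[p[0]]) * len(texts[p[1]])))
        clusters[x + "+" + y] = (clusters.pop(x), clusters.pop(y))
        texts[x + "+" + y] = texts.pop(x) + texts.pop(y)
    return next(iter(clusters.values()))

# Invented stand-ins for photocopied letter variants: B is a slightly
# mutated copy of A; C descends from a more distant ancestor.
letters = {
    "A": "send five copies to your friends within 24 hours",
    "B": "send five copies to yuor friends within 24 hours",
    "C": "mail nine copies to your associates before nine days pass",
}
print(build_tree(letters))  # the closest variants, A and B, merge first
```

Real phylogenetic software uses more careful distance corrections and tree-joining rules, but the logic is the same: variants separated by fewer mutations are inferred to share a more recent common ancestor.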
Still, most of the elements of culture change and blur too easily to qualify as stable replicators. They are rarely as neatly fixed as a sequence of DNA. Dawkins himself emphasized that he had never imagined founding anything like a new science of memetics. A peer-reviewed Journal of Memetics came to life in 1997—published online, naturally—and then faded away after eight years partly spent in self-conscious debate over status, mission, and terminology. Even compared with genes, memes are hard to mathematize or even to define rigorously. So the gene-meme analogy causes uneasiness and the genetics-memetics analogy even more.
Genes at least have a grounding in physical substance. Memes are abstract, intangible, and unmeasurable. Genes replicate with near-perfect fidelity, and evolution depends on that: some variation is essential, but mutations need to be rare. Memes are seldom copied exactly; their boundaries are always fuzzy, and they mutate with a wild flexibility that would be fatal in biology. The term meme could be applied to a suspicious cornucopia of entities, from small to large. For Dennett, the first four notes of Beethoven’s Fifth Symphony were “clearly” a meme, along with Homer’s Odyssey (or at least the idea of the Odyssey), the wheel, anti-Semitism, and writing. “Memes have not yet found their Watson and Crick,” said Dawkins; “they even lack their Mendel.”
Yet here they are. As the arc of information flow bends toward ever greater connectivity, memes evolve faster and spread farther. Their presence is felt if not seen in herd behavior, bank runs, informational cascades, and financial bubbles. Diets rise and fall in popularity, their very names becoming catchphrases—the South Beach Diet and the Atkins Diet, the Scarsdale Diet, the Cookie Diet and the Drinking Man’s Diet all replicating according to a dynamic about which the science of nutrition has nothing to say. Medical practice, too, experiences “surgical fads” and “iatroepidemics”—epidemics caused by fashions in treatment—like the iatroepidemic of children’s tonsillectomies that swept the United States and parts of Europe in the mid-twentieth century, with no more medical benefit than ritual circumcision. Memes were seen through car windows when yellow diamond-shaped baby on board signs appeared as if in an instant of mass panic in 1984, in the United States and then Europe and Japan, followed an instant later by a spawn of ironic mutations (baby i’m board, ex in trunk). Memes were felt when global discourse was dominated in the last year of the millennium by the belief that the world’s computers would stammer or choke when their internal clocks reached a special round number.
In the competition for space in our brains and in the culture, the effective combatants are the messages. The new, oblique, looping views of genes and memes have enriched us. They give us paradoxes to write on Möbius strips. “The human world is made of stories, not people,” writes David Mitchell. “The people the stories use to tell themselves are not to be blamed.” Margaret Atwood writes: “As with all knowledge, once you knew it, you couldn’t imagine how it was that you hadn’t known it before. Like stage magic, knowledge before you knew it took place before your very eyes, but you were looking elsewhere.” Nearing death, John Updike reflects on
A life poured into words—apparent waste
intended to preserve the thing consumed.
Fred Dretske, a philosopher of mind and knowledge, wrote in 1981: “In the beginning there was information. The word came later.” He added this explanation: “The transition was achieved by the development of organisms with the capacity for selectively exploiting this information in order to survive and perpetuate their kind.” Now we might add, thanks to Dawkins, that the transition was achieved by the information itself, surviving and perpetuating its kind and selectively exploiting organisms.
Most of the biosphere cannot see the infosphere; it is invisible, a parallel universe humming with ghostly inhabitants. But they are not ghosts to us—not anymore. We humans, alone among the earth’s organic creatures, live in both worlds at once. It is as though, having long coexisted with the unseen, we have begun to develop the needed extrasensory perception. We are aware of the many species of information. We name their types sardonically, as though to reassure ourselves that we understand: urban myths and zombie lies. We keep them alive in air-conditioned server farms. But we cannot own them. When a jingle lingers in our ears, or a fad turns fashion upside down, or a hoax dominates the global chatter for months and vanishes as swiftly as it came, who is master and who is slave?