In The Age of American Unreason Ms. Jacoby, the author of earlier books like Freethinkers: A History of American Secularism, proposes to anatomize this dismaying phenomenon, while situating it in historical context. Her book is smart, well researched and frequently cogent, particularly in looking at the causes of American anti-intellectualism, past and present…
The New York Times
Inspired by Richard Hofstadter's trenchant 1963 cultural analysis Anti-Intellectualism in American Life, Jacoby (Freethinkers: A History of American Secularism) has produced an engaging, updated and meticulously thought-out continuation of her academic idol's research. Dismayed by the average U.S. citizen's political and social apathy and the overall "crisis of memory and knowledge involving everything about the way we learn and think," Jacoby passionately argues that the nation's current cult of unreason has deadly and destructive consequences (the war in Iraq, for one) and traces the seeds of current anti-intellectualism (and its partner in crime, antirationalism) back to post-WWII society. Unafraid of pointing fingers, she singles out mass media and the resurgence of fundamentalist religion as the primary "vectors" of anti-intellectualism, while also having harsh words for pseudoscientists. Through historical research, Jacoby breaks down popular beliefs that the 1950s were a cultural wasteland and the 1960s were solely a breeding ground for liberals. Though sometimes partial to inflated prose ("America's endemic anti-intellectual tendencies have been grievously exacerbated by a new species of semiconscious anti-rationalism"), Jacoby has assembled an erudite mix of personal anecdotes, cultural history and social commentary to decry America's retreat into "junk thought." (Feb. 12) Copyright 2007 Reed Business Information
The decline of American civilization has been a favorite subject for writers throughout the last half century. Their screeds usually follow one of two models: the conservative (of which Allan Bloom's The Closing of the American Mind is the most notable example), which blames the mindlessness of modern culture on the '60s, political correctness, and the hijacking of the universities by radical feminists and multiculturalists; and the liberal (with Richard Hofstadter's 1963 Anti-Intellectualism in American Life as prototype), which tends to point a finger at religious fundamentalism, ignorance, racism, and anti-Darwinist school boards. The two sides have always been united, however, in their distrust of television and the electronic media and their belief that these technologies are rendering us ever dumb and dumber. In the words of journalist and social historian Susan Jacoby, "The media, while they may not actually be the message, inevitably reshape content to fit a form that subordinates both the spoken and the written word to visual images"; she expresses a heartfelt disgust for our current way of life, which ensures that we all spend our time "sucking at the video tit from cradle to grave."
The premise of Jacoby's The Age of American Unreason is summed up in this sentence: "During the past four decades, America's endemic anti-intellectual tendencies have been grievously intensified by a new species of semiconscious anti-rationalism, feeding on and fed by an ignorant popular culture of video images and unremitting noise that leaves no room for contemplation or logic." There is nothing particularly original in this; indeed, Jacoby is only adding her voice to a voluble chorus that includes Richard Dawkins, Neil Postman, Sam Harris, Todd Gitlin, and Al Gore (whose recent book The Assault on Reason has much in common with Jacoby's study). But Jacoby, who proved herself a competent cultural historian with Freethinkers: A History of American Secularism, has placed her material in a historical context: why, she asks, is America so peculiarly vulnerable to anti-rationalist rhetoric? Why has America and America alone, among developed nations of the Western world, made a backward swing toward fundamentalist theology and the rejection of scientific consensus? Why do we seem so very open to junk science and pseudoscience?
Part of the problem is obvious: we are scientifically ignorant, mostly because our public schools have failed us. But why have they failed us, and why have we allowed them to do so? Jacoby's explanation is both cogent and persuasive: progress has been arrested because of the triumph of local school control over the ideal of national standards, thanks to "the vastness of the continent, the Constitution's deference to states' rights, and the jealous maintenance of local prerogatives within states." "Most politicians in the founding generation," Jacoby reminds us, "were opposed to all general taxation for education, including at the state and local levels."
That battle was eventually won, but taxation for schools was to remain local rather than national. "In Europe, the subject matter of science and history lessons taught to children in all publicly supported schools has always been determined by highly educated employees of central education ministries. In America, the image of an educated elite laying down national guidelines for schools was and is a bête noire...." And the situation has been exacerbated by long-standing American prejudices and clichés: our blind idealization of the autodidact and the self-made man; the "popular equation of intellectualism with a liberalism supposedly at odds with traditional American values"; "a chronic suspicion of experts that dovetails with the folk belief in the superior wisdom of ordinary people"; and the idiotic mantra "Those who can, do; those who can't, teach."
Jacoby's analysis of the historical roots of American anti-rationalism is intelligent and well argued. When she turns to the present, however, her broad generalizations about culture and education are distinctly less plausible. Few, I think, would disagree with her claim that video has debased political discourse, since "the only kind of politics that does not lend itself to video images is any political appeal to thoughtfulness, reason, and logic" -- and that it has equally coarsened religious discourse, as religion comes across most powerfully on video when it "makes no attempt to appeal to anything but emotion, and leaves no room for doubt." But in her passionate critique of the Internet and the blogging culture, Jacoby herself eschews rational argument and falls back on her emotional responses. With no scientific data to support her opinion, she claims that reading words on the Internet is not "reading" in the true sense -- that it does not activate the deep mental processes that are required to read, for example, a daily newspaper.
Periods of intense cultural change are always disturbing, and there is no doubt that we are now in the midst of an upheaval at least as intense as those that occurred in the early modern period, with the invention of the printing press, and during the 18th century, with the rise of the daily newspaper. But whether online or off, people are still reading, and the jury is still out on whether the Internet will cause people to read and write more or less, and on whether reading a newspaper is intrinsically "better," more mentally engaging, than reading a news blog. The fact that newspapers were the primary mode of news dissemination for the last two centuries does not mean that they are necessarily the best mode, or that they are irreplaceable.
Jacoby's contention that the advent of television and the eclipse of print culture by the culture of video corrupted and eventually destroyed the aspirational middlebrow culture of the last century is equally debatable. She mourns the decline of the Book-of-the-Month Club, for instance, without mentioning the remarkable recent florescence of book clubs in general and Oprah's in particular. She looks back with nostalgia to the period when every middle-class household had copies of Will Durant's histories on its bookshelves -- but how many owners of those books actually read them? Remembering my own youth, I suspect that these volumes usually served more as props or showpieces than as well-thumbed reference books. And Jacoby fails to address the fact that book sales are high today, more books are being published than ever before, and the rise of Internet shopping has made book buying easier and cheaper than one would have thought possible only a decade ago.
Jacoby, then, is capable of comprehending the big picture when she looks at the American past -- her discussions of 18th- and 19th-century issues are especially enlightening -- but she has not succeeded in doing so in her analysis of the current cultural scene, and her frequently valuable observations are too often adulterated by the standard "what's the world coming to" rant of the older generation. --Brooke Allen
Brooke Allen is the author of Twentieth-Century Attitudes; Artistic License; and Moral Minority. She is a contributor to The New York Times Book Review, The New Criterion, The New Leader, The Hudson Review, and The Nation, among others. She was named a finalist for the 2007 Nona Balakian Citation for Excellence in Reviewing from the National Book Critics Circle.
From the Publisher
"Electric with fearless interpretation and fueled by passionate concern...brilliant, incendiary, and, one hopes, corrective." Booklist Starred Review
Read an Excerpt
The Way We Live Now: Just Us Folks
The word is everywhere, a plague spread by the President of the United States, television anchors, radio talk show hosts, preachers in megachurches, self-help gurus, and anyone else attempting to demonstrate his or her identification with ordinary, presumably wholesome American values. Only a few decades ago, Americans were addressed as people or, in the more distant past, ladies and gentlemen. Now we are all folks. Television commentators, apparently confusing themselves with the clergy, routinely declare that “our prayers go out to those folks”—whether the folks are victims of drought, hurricane, flood, child molestation, corporate layoffs, identity theft, or the war in Iraq (as long as the victims are American and not Iraqi). Irony is reserved for fiction. Philip Roth, in The Plot Against America—a dark historical reimagining of a nation in which Charles Lindbergh defeats Franklin D. Roosevelt in the 1940 presidential election—confers the title “Just Folks” on a Lindbergh program designed to de-Judaize young urban Jews by sending them off to spend their summers in wholesome rural and Christian settings.
While the word “folks” was once a colloquialism with no political meaning, there is no escaping the political meaning of the term when it is reverently invoked by public officials in twenty-first-century America. After the terrorist bombings in London on July 7, 2005, President Bush assured Americans, “I’ve been in contact with our homeland security folks and I instructed them to be in touch with local and state officials about the facts of what took place here and in London and to be extra vigilant as our folks start heading to work.” Bush went on to observe that “the contrast couldn’t be clearer, between the intentions of those of us who care deeply about human rights and human liberty, and those who’ve got such evil in their heart that they will take the lives of innocent folks.” Those evil terrorists. Our innocent folks. Even homeland security officials, who—one lives in hope—are supposed to be highly trained experts, cannot escape the folkish designation. All of the 2008 presidential contenders pepper their speeches with appeals to folks, but only John Edwards, who grew up poor in North Carolina, sounds as if he was raised around people who actually used the word in everyday conversation. Every time Hillary Rodham Clinton, brought up in a conservative Republican household in an upper-middle-class suburb of Chicago, utters the word “folks,” she sounds like a hovering parent trying to ingratiate herself with her children’s friends by using teenage slang.
The specific political use of folks as an exclusionary and inclusionary signal, designed to make the speaker sound like one of the boys or girls, is symptomatic of a debasement of public speech inseparable from a more general erosion of American cultural standards. Casual, colloquial language also conveys an implicit denial of the seriousness of whatever issue is being debated: talking about folks going off to war is the equivalent of describing rape victims as girls (unless the victims are, in fact, little girls and not grown women). Look up any important presidential speech in the history of the United States before 1980, and you will not find one patronizing appeal to folks. Imagine: We here highly resolve that these folks shall not have died in vain . . . and that government of the folks, by the folks, for the folks, shall not perish from the earth. In the 1950s, even though there were no orators of Lincoln’s eloquence on the political scene, voters still expected their leaders to employ dignified, if not necessarily erudite, speech. Adlai Stevenson may have sounded too much like an intellectual to suit the taste of average Americans, but proper grammar and respectful forms of address were mandatory for anyone seeking high office.
The gold standard of presidential oratory for adult Americans in the fifties was the memory of Roosevelt, whose patrician accent in no way detracted from his extraordinary ability to make a direct connection with ordinary people. It is impossible to read the transcripts of FDR’s famous fireside chats and not mourn the passing of a civic culture that appealed to Americans to expand their knowledge and understanding instead of pandering to the lowest common denominator. Calling for sacrifice and altruism in perilous times, Roosevelt would no more have addressed his fellow citizens as folks than he would have uttered an obscenity over the radio. At the end of 1940, attempting to prepare his countrymen for the coming of war, the president spoke in characteristic terms to the public:
"Tonight, in the presence of a world crisis, my mind goes back eight years to a night in the midst of a domestic crisis . . . I well remember that while I sat in my study in the White House, preparing to talk to the people of the United States, I had before my eyes the picture of all those Americans with whom I was talking. I saw the workmen in the mills, the mines, the factories; the girl behind the counter; the small shopkeeper; the farmer doing his spring plowing; the widows and the old men wondering about their life’s savings. I tried to convey to the great mass of the American people what the banking crisis meant to them in their daily lives.
Tonight I want to do the same thing, with the same people, in this new crisis which faces America. . . .
We must be the great arsenal of democracy. For us this is an emergency as serious as war itself. We must apply ourselves to the task with the same resolution, the same sense of urgency, the same spirit of patriotism and sacrifice as we would show were we at war. . . .
As president of the United States I call for that national effort. I call for it in the name of this nation which we love and honor and which we are privileged and proud to serve. I call upon our people with absolute confidence that our common cause will greatly succeed."
Substitute folks for people, farmer, old men, and widows, and the relationship between the abandonment of dignified public speech and the degradation of the political process becomes clear. To call for resolution and a spirit of patriotism and sacrifice is to call upon people to rise above their everyday selves and to behave as true citizens. To keep telling Americans that they are just folks is to expect nothing special—a ratification and exaltation of the quotidian that is one of the distinguishing marks of anti-intellectualism in any era.
The debasement of the nation’s speech is evident in virtually everything broadcast and podcast on radio, television, and the Internet. In this true, all-encompassing public square, homogenized language and homogenized thought reinforce each other in circular fashion. As George Orwell noted in 1946, “A man may take to drink because he feels himself a failure, and then fail all the more completely because he drinks. It is rather the same thing that is happening to the English language. It becomes ugly and inaccurate because our thoughts are foolish, but the slovenliness of our language makes it easier for us to have foolish thoughts.” In this continuous blurring of clarity and intellectual discrimination, political speech is always ahead of the curve—especially because today’s media possess the power to amplify and spread error with an efficiency that might have astonished even Orwell. Consider the near-universal substitution, by the media and politicians, of “troop” and “troops” for “soldier” and “soldiers.” As every dictionary makes plain, the word “troop” is always a collective noun; the “s” is added when referring to a particularly large military force. Yet each night on the television news, correspondents report that “X troops were killed in Iraq today.” This is more than a grammatical error; turning a soldier—an individual with whom one may identify—into an anonymous-sounding troop encourages the public to think about war and its casualties in a more abstract way. Who lays a wreath at the Tomb of the Unknown Troop? It is difficult to determine exactly how, why, or when this locution began to enter the common language. Soldiers were almost never described as troops during the Second World War, except when a large military operation (like the Allied landing on D-Day) was being discussed, and the term remained extremely uncommon throughout the Vietnam era. 
My guess is that some dimwits in the military and the media (perhaps the military media) decided, at some point in the 1980s, that the word “soldier” implied the masculine gender and that all soldiers, out of respect for the growing presence of women in the military, must henceforth be called troops. Like unremitting appeals to folks, the victory of troops over soldiers offers an impressive illustration of the relationship between fuzzy thinking and the debasement of everyday speech.
By debased speech, I do not mean bad grammar, although there is plenty of that on every street corner and talk show, or the prevalence of obscene language, so widespread as to be deprived of force and meaning at those rare times when only an epithet will do. Nor am I talking about Spanglish and so-called Black English, those favorite targets of cultural conservatives—although I share the conservatives’ belief that public schools ought to concentrate on teaching standard English. But the standard of standard American English, and the ways in which private speech now mirrors the public speech emanating from electronic and digital media, is precisely the problem. Debased speech in the public square functions as a kind of low-level toxin, imperceptibly coarsening our concept of what is and is not acceptable until someone says something so revolting—Don Imus’s notorious description of female African-American college basketball players as “nappy-headed hos” is the perfect example—that it produces a rare, and always brief, moment of public consciousness about the meaning and power of words. Predictably, the Imus affair proved to be a missed opportunity for a larger cultural conversation about the level of all American public discourse and language. People only wanted to talk about bigotry—a worthy and vital conversation, to be sure, but one that quickly degenerated into a comparative lexicon of racial and ethnic victimology. Would Imus have been fired for calling someone a faggot or a dyke? What if he had only called the women hos, without the additional racial insult of nappy-headed? And how about Muslims? Didn’t Ann Coulter denigrate them as “ragheads” (a slur of which I was blissfully unaware until an indignant multiculturalist reported it on the op-ed page of The New York Times). The awful reality is that all of these epithets, often accompanied by the F-word, are the common currency of public and private speech in today’s America. 
They are used not only because many Americans are infected by various degrees of bigotry but because nearly all Americans are afflicted by a poverty of language that cheapens humor and serious discourse alike. The hapless Imus unintentionally made this point when he defended his remarks on grounds that they had been made within a humorous context. "This is a comedy show," he said, "not a racial rant." Wrong on both counts. Nothing reveals a lack of comic inventiveness more reliably than the presence of reflexive epithets, eliciting snickers not because they exist within any intentional "context" but simply because they are crass words that someone is saying out loud.
Part of Imus’s audience was undoubtedly composed of hard-core racists and misogynists, but many more who found his rants amusing were responding in the spirit of eight-year-olds laughing at farts. Imus’s “serious” political commentary was equally pedestrian. He frequently enjoined officials who had incurred his displeasure to “just shut up,” displaying approximately the same level of sophistication as Vice President Dick Cheney when he told Senator Patrick J. Leahy on the Senate floor, “Go fuck yourself.” As the genuinely humorous Russell Baker observes, previous generations of politicians (even if they had felt free to issue the physically impossible Anglo-Saxon injunction in a public forum) would have been shamed by their lack of verbal inventiveness. In the 1890s, Speaker of the House Thomas Reed took care of one opponent by observing that “with a few more brains he could be a halfwit.” Of another politician, Reed remarked, “He never opens his mouth without subtracting from the sum of human intelligence.” Americans once heard (or rather, read) such genuinely witty remarks and tried to emulate that wit. Today we parrot the witless and halfwitted language used by politicians and radio shock jocks alike.
The mirroring process extends far beyond political language, which has always existed at a certain remove from colloquial speech. The toxin of commercially standardized speech now stocks the private vault of words and images we draw on to think about and to describe everything from the ridiculous to the sublime. One of the most frequently butchered sentences on television programs, for instance, is the incomparable Liberace's cynically funny, "I cried all the way to the bank"—a line he trotted out whenever serious critics lambasted his candelabra-lit performances as kitsch. The witty observation has been transformed into the senseless catchphrase, "I laughed all the way to the bank"—often used as a non sequitur after news stories about lottery winners. In their dual role as creators of public language and as microphones amplifying and disseminating the language many Americans already use in their daily lives, the media constitute a perpetuum mobile, the perfect example of a machine in which cause and effect can never be separated. A sports broadcaster, speaking of an athlete who just signed a multi-year, multi-million-dollar contract, says, "He laughed all the way to the bank." A child idly listening—perhaps playing a video game on a computer at the same time—absorbs the meaningless statement without thinking and repeats it, spreading it to others who might one day be interviewed on television and say, "I laughed all the way to the bank," thereby transmitting the virus to new listeners. It is all reminiscent of the exchange among Alice, the March Hare, and the Mad Hatter in Alice's Adventures in Wonderland. "Then you should say what you mean," the March Hare tells Alice. " 'I do,' Alice hastily replied; 'at least—at least I mean what I say—that's the same thing, you know.' " The Hatter chimes in, "Not the same thing a bit!
Why, you might just as well say that ‘I see what I eat’ is the same thing as ‘I eat what I see’!” In an ignorant and anti-intellectual culture, people eat mainly what they see.
 Franklin Delano Roosevelt, Fireside Chats (New York, 1995), pp. 48-49, 62-63.
 George Orwell, “Politics and the English Language,” Horizon, 76 (London: 1946); orwell.ru/library/essays/politics/english/e_polit.
 Robert Wright, “Shock Talk Without Apologies,” New York Times, April 14, 2007.
 Russell Baker, “Talking It Up,” New York Review of Books, May 11, 2006.
 Liberace first used this line in 1957, when he won a libel judgment against the British tabloid Daily Mirror, which published a column calling the entertainer a “deadly, winking, sniggering, snuggling, chromium-plated, scent-impregnated, luminous, giggling, fruit-flavored, mincing, ice-covered heap of mother love.” The British court concluded that the article had libelously implied that Liberace was a homosexual (which, of course, he was, but there was no proof).
From the Hardcover edition.