Living Free in a World of Lies, Hype & Spin
Copyright © 2002 Os Guinness
All rights reserved.
BACK TO THE MORAL STONE AGE
It was a warm California night, but the professor felt "shivery, chilled to the bone." The outrageous had outraged no one. The unthinkable was being thought all too thoughtlessly. After all her years as a veteran of California classrooms, she had heard what she never expected to hear. Her response was profound dismay: "No one in the whole class of twenty ostensibly intelligent individuals would go out on a limb and take a stand against human sacrifice."
The short story under discussion was Shirley Jackson's "The Lottery." Set in a small town somewhere in rural America, the townsfolk are gathering for some ritual obviously critical to the well-being of the crops and the community. At the center of everyone's thoughts is the lottery. The buildup to the draw gathers momentum from Jackson's skillful weaving of the down-to-earth realism of the story with the mounting suspense of the unnamed event about to unfold.
Suddenly, in a stunning denouement, everything becomes grotesquely clear. The draw is for a human sacrifice. Tessie Hutchinson, wife, mother, and neighbor, chooses the slip of paper with the black spot. Instantly she finds herself isolated in the center of a cleared space. Even her son, little Davy, has pebbles in his hand.
"Come on, come on, everyone," Old Man Warner urges the villagers—and they do.
"It isn't fair, it isn't right," Mrs. Hutchinson screams, but there is no stopping the ritual. The story ends with a sickening thud: "and then they were upon her."
When the New Yorker first published "The Lottery" in 1948, the magazine was deluged by letters in a storm of outrage. In the robustly moral climate of victorious, postwar America, the very idea of conformity to such a ritual was outrageous. Human sacrifice was unthinkable—it simply couldn't happen in America. Late-1940s America may have been far too conformist for the later 1960s generation, but the story's moral—the dangers of "going along" in blind social conformity—found a passionate response in the generation that had stood up to Hitler.
But times change. Since the publication of the story, countless high-school classes have read and discussed it, and the reactions it triggers have changed along with the students. In fact, student responses give a remarkably accurate seismograph reading of the wider shifts in society over the years.
It was these changes that Kay Haugaard, a professor in southern California, had noted in her students over more than two decades of teaching creative writing. Starting in 1970, she observed that her first students—ranging from an occasional eighteen-year-old to an occasional eighty-year-old—were still "shocked into giggles or frowns at the sound of naughty words" in either the published stories or students' work.
Slowly the writing and the responses changed. The shift started with the increased violence of the stories of Vietnam veterans that talked of killing, maiming, brutal deaths, and bizarre sexual encounters with Vietnamese prostitutes. Then, successively, it was the turn of homosexual narratives, lesbian testimonies, and varied writings on civil rights, sexual liberation, and multiculturalism. Gradually the explicitness of the anger, victimhood, and lewdness led to a noticeable coarsening of the writing and a jadedness of the responses.
Yet, during all those years, "The Lottery" was one story that had always managed to elicit a strong moral response. The tale was so well told, the moral so powerful, and the ritual so shocking that at least this story could engage the students' sense of right and wrong.
Until the 1990s, that is. One night Haugaard encountered a class that registered no moral response at all. "The end was neat!" one woman said. "It was all right. It wasn't that great," another repeated. "They just do it," yet another argued, "It's their ritual."
Haugaard's concern mounted dramatically as the unconcern deepened in the room. "I was stunned," she later wrote after questioning Beth, a stylish student in her forties, for "this was the woman who wrote so passionately about saving the whales, of concern for the rain forests, of her rescue and tender care for a stray dog."
But more was to come. Another student, Richard, put forward a psychological theory espousing the social value of a certain amount of bloodshed. "It almost seems a need," he concluded in cool, reasonable tones.
Finally, after Haugaard had broken her normal custom and expressed her own moral position forcefully, a nurse in her fifties summed up the discussion: "Well, I teach a course for our hospital personnel in multicultural understanding, and if it's a part of a person's culture, we are taught not to judge, and if it has worked for them...."
"At this point I gave up," Haugaard reported in The Chronicle of Higher Education. "No one in the whole class of more than twenty ostensibly intelligent individuals would go out on a limb and take a stand against human sacrifice."
Thou Shalt Not Judge
Kay Haugaard's account is profoundly disturbing for both moral and political reasons, but it is hardly surprising. In large parts of the West, the crisis of truth has spawned two other companion crises: one in ethics, which we will examine in this chapter, and another in character, which we will look at in the next. The sources of these crises lie in intellectual and social roots respectively.
As the reaction to "The Lottery" reveals, the late-1960s slogan painted on the wall at the Sorbonne—"It is forbidden to forbid"—now covers thoughts and analysis as well as actions. Censuring is commonly confused with censoring, and moral judgment has been paralyzed. Many discussions are now void of a backstop crime or evil that will elicit universal condemnation. Even the Holocaust is increasingly "personally deplored" but not morally condemned.
Earlier, playwright Bertolt Brecht claimed that the eleventh commandment of the modern world was "Be good to yourself." Today it has been replaced by a new candidate: "Thou shalt not judge." In such a world, what follows is simple: When nothing can be judged except judgment itself—"judgmentalism"—the barriers between the unthinkable, acceptable, and doable collapse entirely. And then, since life goes on and the sky doesn't fall, people draw the conclusion that the original concern was unfounded. Lighten up, the newly amoral say as they skip forward blithely, complicit in their own corruption.
But what is surprising, perhaps, is that the contemporary crisis of truth, ethics, and character is playing out in the midst of an extraordinary resurgence of interest in ethics around the Western world. For instance, a survey by the Hastings Center claims that the United States now offers over eleven thousand courses in applied ethics that tackle all manner of ethical problems in business, politics, medicine, science, engineering, and social work. These courses are backed by thousands of experts through hundreds of textbooks, dozens of journals, and some staggeringly generous financial grants.
Some people interpret this explosion of interest in ethics as the return to a morally robust era. Right up to the end of the nineteenth century, the most important course in an American student's college career was moral philosophy, or what we today call ethics. The course was seen as the crowning unit in the senior year, usually taught by the college president himself. As President James Monroe said of such classes, "The question to be asked at the end of an educational step is not 'What has the student learned?' but 'What has the student become?'"
At a different level, both MTV and the New York Times in the nineties took up the topic of the seven deadly sins. Surely, it is said, we are witnessing the beginning of an ethical revival.
Beneath the Surface
Is ours an age of deepened morality? Far from it. In fact, this renewed enthusiasm for ethics is hardly cause for celebration. For one thing, morality is like health—preoccupation with it is often a sign of illness, not vitality. For another thing, a closer look at the resurgence is not so reassuring.
First, part of the renewed interest is simply fashionable and transient. As one commentator put it, "In our low-fat, low-conscience culture, Sin-Lite has found shelf space alongside other low-guilt pleasures." Or in the words of MTV, "A little lust, pride, sloth, and gluttony—in moderation—are fun, and that's what keeps your heart beating."
Second, much of today's focus is on "prevention ethics" rather than on principled ethics. It is more concerned with "not being caught" (or sued or exposed in the press) than with doing right. Besides, what Oscar Wilde said cynically a century ago is uncomfortably apt in the climate of today's culture wars: "Morality is simply the attitude we adopt toward people we personally dislike."
Third, even where good ethics is taught in a good way, it is usually more social in nature than personal. That is, what matters for the politically correct is to hold the right views, not to practice them. What is seen as important are issues related to corporations, schools, courts, governments, and the treatment of the environment—not the individual's virtue and responsibility that underlie these secondary issues.
As ethicist Christina Sommers writes, "A glance at a typical anthology of a college course in ethics reveals that most of what the student will read is directed toward analyzing and criticizing policies on such issues as punishment, recombinant DNA research, abortion, and euthanasia.... Inevitably the student gets the idea that applying ethics to modern life is mainly a matter of being for or against some social policy."
Fourth, and worse still, the current ethics is often taught with a shallow view of human nature and an even more superficial view of evil in human society. For example, such topics as hypocrisy, self-deception, selfishness, and cruelty rarely come up. And the place of envy in politics, greed in the economy, lust in the fashion industry, and violence in the entertainment business is rarely probed.
Fifth, and worst of all, the present preoccupation with ethics in elite intellectual centers has an element of absurdity because they have no moral content left to teach. The fruit of the Western universities in the last two hundred years has been to destroy the possibility of any moral knowledge on which to pursue moral formation.
President Derek Bok of Harvard expressed this point guardedly: "Today's course in applied ethics does not seek to convey a set of moral truths, but tries to encourage the student to think carefully about complex moral issues.... The principal aim of the course is not to impart 'right answers' but to make the student more perceptive in detecting ethical problems when they arise."
Philosopher Dallas Willard was more blunt: Had President Bok "strolled across Harvard Yard to Emerson Hall and consulted with some of the most influential thinkers in our nation, he would have discovered that there now is no recognized moral knowledge upon which projects of fostering moral development could be based. There is now not a single moral conclusion about behavior or character traits that a teacher could base a student's grade on—not even those most dear to educators, concerning fairness and diversity."
With no moral conclusions left, all that remains is ever more clever talk about ethics—and even that is complicated and controversial. As Bertrand Russell observed far earlier in the century: "Where ethics is concerned, I hold that, so far as fundamentals are concerned, it is impossible to produce conclusive intellectual arguments.... In a fundamental question of ethics I do not think a theoretical argument is possible."
This result is shocking for a civilization that once took pride in progress and moral reform—reform by an accepted moral standard by which such evils as persecution and chattel slavery were pronounced unacceptable and wrong. Technologically Western civilization has advanced to the age of space and cyberspace, but ethically we have regressed. "We have been thrown back," Christina Sommers writes, "into a moral Stone Age; many young people are totally unaffected by thousands of years of moral experience and moral progress."
"Grey is the color of truth," McGeorge Bundy wrote in defense of the Vietnam War. Only a few decades later the grey has deepened to near-black. Today many people feel that trying to navigate the moral confusion of modern society is like driving through London or New York without a map and with all the traffic lights turned off.
Defining Deviancy Down
What lies behind this widespread moral confusion? One factor is simple: a lack of serious analysis of why we have an ethics crisis in the first place, which, in turn, reinforces the obvious shortcomings in the so-called revival of ethics. Most attention is directed at symptoms, not causes; few people dig down to the roots.
For example, few understand that the United States, because of the convictions of its founders, is a nation with a realistic view of evil embedded in its constitutional checks and balances. Yet psychologist Karl Menninger's 1973 book, Whatever Became of Sin? was not only a startling title but a sobering benchmark to gauge the slippage from the founders' position. The notion of evil, Menninger argued, had slid from being "sin" defined theologically, to being "crime" defined legally, to being "sickness" defined only in psychological categories. In Senator Daniel Patrick Moynihan's more recent analysis, Americans have "defined deviancy down." What was "deviant" fifty years ago is today just par for the course.
The wider moral confusion in the West (similar problems exist in other modern countries) can be probed best with the help of three terms—"permissive," "transgressive," and "remissive." Fyodor Dostoyevsky captured the first a hundred years ago in his famous refrain in The Brothers Karamazov: If God is dead, and there is no future life, "nothing would be immoral any longer, everything would be permitted."
The developments of a century later show that the consequences are not limited to morals. If "God is dead," all sorts of other things die too, including truth, selfhood, character, the power of words to describe reality, and for some people even reality itself. For the skeptic who pushes, there is no stopping point—at least in thought.
The second term, transgressive, found its classic expression in the 1968 Sorbonne slogan mentioned earlier, "It is forbidden to forbid." This was later popularized by basketball player Dennis Rodman as "bad as I wanna be" and exemplified by the choreographer of rock star Madonna: "Madonna told me to break every rule I could think of, and then when I was done to make up some new ones and break them."
Of course, there are limits to transgressing, which under postmodern conditions becomes the exhibitionism of transgressing—chaos for society and boredom for the transgressor. For instance, shock-rocker Marilyn Manson recently complained, "We can't go any further without starting over.... What other violence can you show? What other drug can you do? What other thing can you get pierced? It's all been done." But even short of this point, the costs of the moral vandalism are enormous, not least because the transgressors wreak their havoc in the name of specious freedom that others desire to copy.
The third term, remissive, comes up repeatedly when people try to describe the moral crisis and grope for terms that capture its overwhelming and snowballing nature. Society, they say, is "eroding," "unraveling," "fraying," "melting down," and the like. Or, as one political leader expressed it during the scandals swirling around the Clinton presidency, "You can stop a flood by putting your finger in the dike, but how do you stop a mudslide?"
Shots from the Old Artilleryman
The most powerful philosophical source of the crisis of truth is the writings of Friedrich Nietzsche in the 1880s. Weak, sickly, physically shortsighted, racked by tremendous pain, little known and less read, Nietzsche in those years shuttled between Torino in Italy, Èze near Nice in France, and Sils Maria, near the modern ski resort of St. Moritz, in Switzerland. In the few hours of the days he was able to write, he poured out a series of books so revolutionary that the twentieth century has been described as a footnote to his thought.
The self-proclaimed "immoralist," "anti-Christ," and "conqueror of God," Nietzsche ranked his Zarathustra as "a fifth Gospel" and styled himself the "teacher of mistrust of truth." He also called himself the "old artilleryman" because of his brief service in the Franco-Prussian War, and described his way of analysis as doing philosophy "with a hammer."
Clearly he never suffered from false modesty. "I know my fate," he wrote in Ecce Homo. "One day my name will be associated with the memory of ... a crisis without equal on earth, the most profound collision of conscience, a decision that was conjured up against everything that had been believed, demanded, hallowed so far. I am no man. I am dynamite."
More prosaically, Nietzsche mounted a furious assault on the traditional view of truth and ethics from two sides. From one better-known side he relativized truth through his notion of "perspectivism": "There are many kinds of eyes, and consequently there are many kinds of 'truths,' and consequently there is no truth."
And from the other less-known side he rationalized truth through his notion of the "genealogy of morals." He claimed that truth and virtue, if their family trees were traced back, were far from self-evident, straightforward, and noble, but were rooted in resentments and ignoble vices. Thus "truth," he said, was a mask for the will to power; "pity" was the poison of resentment; and "virtue" a pious form of hypocrisy whereby the "slave class" gained its revenge on the "master class," the "herd" on the "hero." In short, "faith" is servile, sniveling, and insincere, a sanctimonious mask to cover a can of emotional worms.
In Nietzsche's final writings, especially The Will to Power, power was central to everything. "This world is the will to power—and nothing besides! And you yourselves are also this will to power—and nothing besides!" To feel their intoxication, his late writings need to be read when alone, unhurried, and preferably in a majestic setting like the Alps or the Rockies. Little wonder so many have read Nietzsche and been drawn to their destruction like moths to a flame. What extreme sports are to today's thrill-seekers, Nietzsche is to the rare mind and spirit—the extreme philosopher whose most audacious thoughts were written at the very rim of his own sanity before he plunged over.
After all, Nietzsche writes in Beyond Good and Evil, the higher truths are only for heroes and are considerably dangerous: "Indeed, it may be a characteristic of existence that those who would know it completely would perish, in which case the strength of a spirit should be measured according to how much of the 'truth' one could still barely endure—or to put it more clearly, to what degree one would require it to be thinned down, shrouded, sweetened, falsified."
Traditionally, power without wisdom and virtue has been viewed as dangerous. In The Wild Ass's Skin, Honoré de Balzac observed that possessing power does not mean knowing how to use it. "A sceptre is a toy for a child, an axe for a Richelieu, and for Napoleon a lever with which to move the earth. Power leaves our natures untouched and confers greatness only on the great." Winston Churchill wrote similarly, reflecting on World War II, "Power, for the sake of lording it over fellow creatures or adding to personal pomp, is rightly judged base."
In stark contrast, today's postmodernists who follow Nietzsche view power as everything. From philosophy to literary theory to evolutionary psychology to legal studies to law courts to the airwaves and election campaigns, everything is power. All else is flummery and illusion.
Later, in chapter four, we will set out the strategies for responding to these assaults. Here it is important only to underscore how deadly Nietzsche's attacks have proved. Far from assailing one truth-claim or one virtue in the name of another (which has the effect of reinforcing the importance of truth and virtue themselves), he undermines the very notions.
Not only the possibility but the worthwhileness of truth and virtue are emptied of meaning. Whatever someone may profess, things are always other than they pretend, darker and murkier than they make out. Our proper response, we are taught, should be to view every claim with a sense of irony, interpret everything with suspicion, and pursue "truth" and "virtue" with the central agenda of unmasking and dismantling them.
Passing through the Fiery Brook
Nietzsche's philosophical assault on truth and ethics is deadly enough, but it has been joined by an equally lethal assault from the social sciences. Many claim that the discipline of the sociology of knowledge (my own discipline) demonstrates that all human knowledge is not only relative but "socially constructed" and nothing more. That is, what we know is so shaped by our social context that any claim to be true or false, right or wrong, is patently absurd. "Truth" is only a matter of human convention or social construction.
It is important to say that the best proponents of the sociology of knowledge do not see the discipline as ruling out the importance or possibility of truth. But as popularly understood—in other words, as misunderstood—that is the effect. The reason is obvious: The history of ideas proceeds by tracing a line from thinkers to thoughts to their impact on the world—"Ideas have consequences." By contrast, the sociology of knowledge does the opposite, tracing the line from social context to thoughts to thinkers—"Culture shapes ideas."
A simple example is the modern attitude toward time. Asians used to say that Westerners are "people with gods on their wrists," while Africans observed that "all Westerners have watches; no Westerners have time." Many modern expressions, such as "time is money," "buying time," "quality time," and "opportunity costs," attest to this same decisive and almost inescapable sense of modern time—simultaneously pressured and precious.
Yet this modern view of time cannot be traced to any single thinker or school of thought. Rather our so-called "clock culture" is just that, the fruit of our modern world of clocks and watches, especially as synchronized with the whole of life in the Industrial Revolution and most especially as accelerated by the "instant, total capacity" of modern information technology. In this sense our modern view of time is "socially constructed."
But does the sociology of knowledge claim that all knowledge is "socially constructed" and nothing more—in other words, not true? Emphatically not. The best sociologists of knowledge—supremely Peter L. Berger—do not presume to make this judgment. Instead they bracket the question of truth as being outside the jurisdiction of sociology and pass it on to philosophy. They point out that for sociology to judge its own truthfulness would be self-defeating—as Berger says, "It would be like trying to push a bus on which one was riding."
The sociology of knowledge, then, should not presume to make a judgment on the truth of beliefs. Rather, its province is to analyze the social context of whatever passes for knowledge and leave the question of truthfulness to others. There are more tools in the tool box of understanding than this one.
Such modesty and clear thinking, however, has not characterized those (usually not sociologists of knowledge) who brandish the term "socially constructed" and apply it to everything. In their careless hands the term is used falsely to proclaim that nothing is true, everything is only socially constructed. For instance, Blaise Pascal's brilliant observation on how truth is different on different sides of the Pyrenees becomes a blunt, bulldozing attack on all truth and even on what is meant by the Pyrenees.
Not surprisingly, when the sociology of knowledge is misunderstood and misapplied, it compounds the relativism of what is already, in Berger's description, "an intrinsically debunking discipline" that "raises the vertigo of relativity to its most furious pitch." Punning on the name of Ludwig Feuerbach (the debunking philosopher), Berger calls it the "fiery brook" through which thinking today must pass.
This dramatic language is well-justified, but skepticism won't always be wielded with the intensity of a Nietzschean "Superman." A generation that sooner or later "dumbs down" everything to bumper stickers and Hallmark cards can be counted on for user-friendly versions of Nietzscheanism. A current California bumper sticker translates Nietzsche for the Sunshine State: "There is no right or wrong—only fun or boring."
Christina Sommers tells of an undergraduate at Williams College who remarked innocently after hearing that all knowledge is socially constructed: "Although the Holocaust may not have happened, it's a perfectly reasonable conceptual hallucination."
Using Nietzsche's terms, our danger comes not just from the "supermen" but from the "last men." Losing touch with transcendence, secular people would lose a reference point with which to judge themselves and would end up confusing health with happiness and happiness with health. "One has one's little pleasure for the day and one's little pleasure for the night," Nietzsche commented in Thus Spake Zarathustra. "But one has a regard for health. 'We have invented happiness,' say the last men, and they blink."
Excerpted from Time for Truth by Os Guinness. Copyright © 2002 by Os Guinness. Excerpted by permission.
Contents
Introduction: But Not Through Me
1. Back to the Moral Stone Age
2. We're All Spinmeisters Now
3. The West Versus Itself
4. Differences Make a Difference
5. Turning the Tables
6. On Record against Ourselves
For Further Reading