"The questions he poses about the relationship between technical change and political power are pressing ones that can no longer be ignored, and identifying them is perhaps the most a nascent 'philosophy of technology' can expect to achieve at the present time."—David Dickson, New York Times Book Review
"The Whale and the Reactor is the philosopher's equivalent of superb public history. In its pages an analytically trained mind confronts some of the most pressing political issues of our day."—Ruth Schwartz Cowan, Isis
From the early days of manned space travel comes a story that exemplifies what is most fascinating about the human encounter with modern technology. Orbiting the earth aboard Friendship 7 in February 1962, astronaut John Glenn noticed something odd. His view of the planet was virtually unique in human experience; only Soviet pilots Yuri Gagarin and Gherman Titov had preceded him in orbital flight. Yet as he watched the continents and oceans moving beneath him, Glenn began to feel that he had seen it all before. Months of simulated space shots in sophisticated training machines and centrifuges had affected his ability to respond. In the words of chronicler Tom Wolfe, "The world demanded awe, because this was a voyage through the stars. But he couldn't feel it. The backdrop of the event, the stage, the environment, the true orbit ... was not the vast reaches of the universe. It was the simulators. Who could possibly understand this?" Synthetic conditions generated in the training center had begun to seem more "real" than the actual experience.
It is reasonable to suppose that a society thoroughly committed to making artificial realities would have given a great deal of thought to the nature of that commitment. One might expect, for example, that the philosophy of technology would be a topic widely discussed by scholars and technical professionals, a lively field of inquiry often chosen by students at our universities and technical institutes. One might even think that the basic issues in this field would be well defined, its central controversies well worn. However, such is not the case. At this late date in the development of our industrial/technological civilization the most accurate observation to be made about the philosophy of technology is that there really isn't one.
The basic task for a philosophy of technology is to examine critically the nature and significance of artificial aids to human activity. That is its appropriate domain of inquiry, one that sets it apart from, say, the philosophy of science. Yet if one turns to the writings of twentieth-century philosophers, one finds astonishingly little attention given to questions of that kind. The six-volume Encyclopedia of Philosophy, a recent compendium of major themes in various traditions of philosophical discourse, contains no entry under the category "technology." Neither does that work contain enough material under possible alternative headings to enable anyone to piece together an idea of what a philosophy of technology might be.
True, there are some writers who have taken up the topic. The standard bibliography in the philosophy of technology lists well over a thousand books and articles in several languages by nineteenth- and twentieth-century authors. But reading through the material listed shows, in my view, little of enduring substance. The best writing on this theme comes to us from a few powerful thinkers who have encountered the subject in the midst of much broader and more ambitious investigations: for example, Karl Marx in the development of his theory of historical materialism or Martin Heidegger as an aspect of his theory of ontology. It may be, in fact, that the philosophy of technology is best seen as a derivative of more fundamental questions. For despite the fact that nobody would deny its importance to an adequate understanding of the human condition, technology has never joined epistemology, metaphysics, esthetics, law, science, and politics as a fully respectable topic for philosophical inquiry.
Engineers have shown little interest in filling this void. Except for airy pronouncements in yearly presidential addresses at various engineering societies, typically ones that celebrate the contributions of a particular technical vocation to the betterment of humankind, engineers appear unaware of any philosophical questions their work might entail. As a way of starting a conversation with my friends in engineering, I sometimes ask, "What are the founding principles of your discipline?" The question is always greeted with puzzlement. Even when I explain what I am after, namely, a coherent account of the nature and significance of the branch of engineering in which they are involved, the question still means nothing to them. The scant few who raise important first questions about their technical professions are usually seen by their colleagues as dangerous cranks and radicals. If Socrates' suggestion that the "unexamined life is not worth living" still holds, it is news to most engineers.
Why is it that the philosophy of technology has never really gotten under way? Why has a culture so firmly based upon countless sophisticated instruments, techniques, and systems remained so steadfast in its reluctance to examine its own foundations? Much of the answer can be found in the astonishing hold the idea of "progress" has exercised on social thought during the industrial age. In the twentieth century it is usually taken for granted that the only reliable sources for improving the human condition stem from new machines, techniques, and chemicals. Even the recurring environmental and social ills that have accompanied technological advancement have rarely dented this faith. It is still a prerequisite that the person running for public office swear his or her unflinching confidence in a positive link between technical development and human well-being and affirm that the next wave of innovations will surely be our salvation.
There is, however, another reason why the philosophy of technology has never gathered much steam. According to conventional views, the human relationship to technical things is too obvious to merit serious reflection. The deceptively reasonable notion that we have inherited from much earlier and less complicated times divides the range of possible concerns about technology into two basic categories: making and use. In the first of these our attention is drawn to the matter of "how things work" and of "making things work." We tend to think that this is a fascination for certain people in certain occupations, but not for anyone else. "How things work" is the domain of inventors, technicians, engineers, repairmen, and the like who prepare artificial aids to human activity and keep them in good working order. Those not directly involved in the various spheres of "making" are thought to have little interest in or need to know about the materials, principles, or procedures found in those spheres.
What the others do care about, however, are tools and uses. This is understood to be a straightforward matter. Once things have been made, we interact with them on occasion to achieve specific purposes. One picks up a tool, uses it, and puts it down. One picks up a telephone, talks on it, and then does not use it for a time. A person gets on an airplane, flies from point A to point B, and then gets off. The proper interpretation of the meaning of technology in the mode of use seems to be nothing more complicated than an occasional, limited, and nonproblematic interaction.
The language of the notion of "use" also includes standard terms that enable us to interpret technologies in a range of moral contexts. Tools can be "used well or poorly" and for "good or bad purposes"; I can use my knife to slice a loaf of bread or to stab the next person that walks by. Because technological objects and processes have a promiscuous utility, they are taken to be fundamentally neutral as regards their moral standing.
The conventional idea of what technology is and what it means, an idea powerfully reinforced by familiar terms used in everyday language, needs to be overcome if a critical philosophy of technology is to move ahead. The crucial weakness of the conventional idea is that it disregards the many ways in which technologies provide structure for human activity. Since, according to accepted wisdom, patterns that take shape in the sphere of "making" are of interest to practitioners alone, and since the very essence of "use" is its occasional, innocuous, nonstructuring occurrence, any further questioning seems irrelevant.
If the experience of modern society shows us anything, however, it is that technologies are not merely aids to human activity, but also powerful forces acting to reshape that activity and its meaning. The introduction of a robot to an industrial workplace not only increases productivity, but often radically changes the process of production, redefining what "work" means in that setting. When a sophisticated new technique or instrument is adopted in medical practice, it transforms not only what doctors do, but also the ways people think about health, sickness, and medical care. Widespread alterations of this kind in techniques of communication, transportation, manufacturing, agriculture, and the like are largely what distinguishes our times from early periods of human history. The kinds of things we are apt to see as "mere" technological entities become much more interesting and problematic if we begin to observe how broadly they are involved in conditions of social and moral life.
It is true that recurring patterns of life's activity (whatever their origins) tend to become unconscious processes taken for granted. Thus, we do not pause to reflect upon how we speak a language as we are doing so or the motions we go through in taking a shower. There is, however, one point at which we may become aware of a pattern taking shape: the very first time we encounter it. An opportunity of that sort occurred several years ago at the conclusion of a class I was teaching. A student came to my office on the day term papers were due and told me his essay would be late. "It crashed this morning," he explained. I immediately interpreted this as a "crash" of the conceptual variety, a flimsy array of arguments and observations that eventually collapses under the weight of its own ponderous absurdity. Indeed, some of my own papers have "crashed" in exactly that manner. But this was not the kind of mishap that had befallen this particular fellow. He went on to explain that his paper had been composed on a computer terminal and that it had been stored in a time-sharing minicomputer. It sometimes happens that the machine "goes down" or "crashes," making everything that happens in and around it stop until the computer can be "brought up," that is, restored to full functioning.
As I listened to the student's explanation, I realized that he was telling me about the facts of a particular form of activity in modern life in which he and others similarly situated were already involved and that I had better get ready for. I remembered J. L. Austin's little essay "A Plea for Excuses" and noticed that the student and I were negotiating one of the boundaries of contemporary moral life: where and how one gives and accepts an excuse in a particular technology-mediated situation. He was, in effect, asking me to recognize a new world of parts and pieces and to acknowledge appropriate practices and expectations that hold in that world. From then on, a knowledge of this situation would be included in my understanding of not only "how things work" in that generation of computers, but also how we do things as a consequence, including which rules to follow when the machines break down. Shortly thereafter I got used to computers crashing, disrupting hotel reservations, banking, and other everyday transactions; eventually, my own papers began crashing in this new way.
Some of the moral negotiations that accompany technological change eventually become matters of law. In recent times, for example, a number of activities that employ computers as their operating medium have been legally defined as "crimes." Is unauthorized access to a computerized data base a criminal offense? Given the fact that electronic information is in the strictest sense intangible, under what conditions is it "property" subject to theft? The law has had to stretch and reorient its traditional categories to encompass such problems, creating whole new classes of offenses and offenders.
The ways in which technical devices tend to engender distinctive worlds of their own can be seen in a more familiar case. Picture two men traveling in the same direction along a street on a peaceful, sunny day, one of them afoot and the other driving an automobile. The pedestrian has a certain flexibility of movement: he can pause to look in a shop window, speak to passersby, and reach out to pick a flower from a sidewalk garden. The driver, although he has the potential to move much faster, is constrained by the enclosed space of the automobile, the physical dimensions of the highway, and the rules of the road. His realm is spatially structured by his intended destination, by a periphery of more-or-less irrelevant objects (scenes for occasional side glances), and by more important objects of various kinds: moving and parked cars, bicycles, pedestrians, street signs, etc., that stand in his way. Since the first rule of good driving is to avoid hitting things, the immediate environment of the motorist becomes a field of obstacles.
Imagine a situation in which the two persons are next-door neighbors. The man in the automobile observes his friend strolling along the street and wishes to say hello. He slows down, honks his horn, rolls down the window, sticks out his head, and shouts across the street. More likely than not the pedestrian will be startled or annoyed by the sound of the horn. He looks around to see what's the matter and tries to recognize who can be yelling at him across the way. "Can you come to dinner Saturday night?" the driver calls out over the street noise. "What?" the pedestrian replies, straining to understand. At that moment another car to the rear begins honking to break up the temporary traffic jam. Unable to say anything more, the driver moves on.
What we see here is an automobile collision of sorts, although not one that causes bodily injury. It is a collision between the world of the driver and that of the pedestrian. The attempt to extend a greeting and invitation, ordinarily a simple gesture, is complicated by the presence of a technological device and its standard operating conditions. The communication between the two men is shaped by an incompatibility of the form of locomotion known as walking and a much newer one, automobile driving. In cities such as Los Angeles, where the physical landscape and prevailing social habits assume everyone drives a car, the simple act of walking can be cause for alarm. The U. S. Supreme Court decided one case involving a young man who enjoyed taking long walks late at night through the streets of San Diego and was repeatedly arrested by police as a suspicious character. The Court decided in favor of the pedestrian, noting that he had not been engaged in burglary or any other illegal act. Merely traveling by foot is not yet a crime.
Knowing how automobiles are made, how they operate, and how they are used and knowing about traffic laws and urban transportation policies does little to help us understand how automobiles affect the texture of modern life. In such cases a strictly instrumental/functional understanding fails us badly. What is needed is an interpretation of the ways, both obvious and subtle, in which everyday life is transformed by the mediating role of technical devices. In hindsight the situation is clear to everyone. Individual habits, perceptions, concepts of self, ideas of space and time, social relationships, and moral and political boundaries have all been powerfully restructured in the course of modern technological development. What is fascinating about this process is that societies involved in it have quickly altered some of the fundamental terms of human life without appearing to do so. Vast transformations in the structure of our common world have been undertaken with little attention to what those alterations mean. Judgments about technology have been made on narrow grounds, paying attention to such matters as whether a new device serves a particular need, performs more efficiently than its predecessor, makes a profit, or provides a convenient service. Only later does the broader significance of the choice become clear, typically as a series of surprising "side effects" or "secondary consequences." But it seems characteristic of our culture's involvement with technology that we are seldom inclined to examine, discuss, or judge pending innovations with broad, keen awareness of what those changes mean. In the technical realm we repeatedly enter into a series of social contracts, the terms of which are revealed only after the signing.
Excerpted from The Whale and the Reactor by Langdon Winner. Copyright © 1986 by The University of Chicago. Excerpted by permission.
I. A Philosophy of Technology
1. Technologies as Forms of Life
2. Do Artifacts Have Politics?
3. Techne and Politeia
II. Technology: Reform and Revolution
4. Building the Better Mousetrap
5. Decentralization Clarified
III. Excess and Limit
7. The State of Nature Revisited
8. On Not Hitting the Tar-Baby
9. Brandy, Cigars and Human Values
10. The Whale and the Reactor
Notes
Index