The threat is not the direct one once framed by the idea of insane robots or runaway mainframes usurping human functions for their own purposes, but the gradual loss of control over hardware, software, and function through networks of interconnection and dependence. What Rochlin calls the computer trap has four parts: the lure, the snare, the costs, and the long-term consequences. The lure is obvious: the promise of ever more powerful and adaptable tools with simpler and more human-centered interfaces. The snare is what usually ensues. Once heavily invested in the use of computers to perform central tasks, organizations and individuals alike are committed to new capacities and potentials, whether they eventually find them rewarding or not. The varied costs include a dependency on the manufacturers of hardware and software—and a seemingly pathological scramble to keep up with an incredible rate of sometimes unnecessary technological change. Finally, a lack of redundancy and an incredible speed of response make human intervention or control difficult at best when (and not if) something goes wrong. As Rochlin points out, this is particularly true for those systems whose interconnections and mechanisms are so deeply concealed in the computers that no human being fully understands them.
The complete text of Trapped in the Net is available online at http://pup.princeton.edu
"In Trapped in the Net, an insightful and painstakingly documented book, [Rochlin] explores the changes already wrought by computers and networking in areas as diverse as financial markets, air travel, nuclear power plants, corporate management and the military."—Lawrence Hunter, The New York Times Book Review
"Trapped in the Net covers not only the military, but also financial markets, aviation and business. In all cases, humans working inside organizations become helpless just when the systems they use encounter the unexpected and start behaving idiotically. This is a fascinating and well-argued book. . . . The references are good, and certainly prove that Rochlin is not a lone voice with a cynical message."—Harold Thimbleby, New Scientist
"[Rochlin's] straightforward argument should be apparent to those managing and promoting increasing computerization: that greater dependence on computers implies greater disaster when they fail. . . . Rochlin ends with an exploration of the new cyberized military and continues to pinpoint the unintended consequences that computer enthusiasts rarely think about, but should."—Booklist
". . . computerization is leading us into pretty dire straits. In financial markets, warp-speed automated trading creates opportunities for fraud and moves us further away from a stable investment climate. In the office, computers promise efficiency, but bring fragmented knowledge and reduced autonomy to workers. There's worse news. Pilots in the 'glass cockpits' of modern airplanes have too much data to interpret, and nuclear power plant operators are less likely to have an intuitive feel for things going wrong 'on the floor'. Most sobering of all is the discussion of automation and the military."—Publishers Weekly
1 Introduction
Enter the Computer
Compliance and Control
The Structure of the Argument
The Structure of the Book
2 Autogamous Technology
A Brief Historical Essay
Operating Systems
The Dynamics of Growth
The Hegemony of Design
3 Networks of Connectivity: Webs of Dependence
From Anarchy to Networks
The Interconnected Office
4 Taylorism Redux?
The Search for Managerial Control
The Deskilling Controversy
Expertise Lost
Heterogeneous Systems
5 Computer Trading
Markets and Exchanges
Automating Markets
6 Jacking into the Market
The Demise of Barings PLC
Trading in Cyberspace
Global Markets
7 Expert Operators and Critical Tasks
Having the Bubble
Pilot Error
The Glass Cockpit
Air Traffic Control
Industrial and Other Operations
The Computer in the Loop
8 Smart Weapons, Smart Soldiers
Industrial War
Techno-Industrial War
The Postwar Transition
Quantity versus Quality
Trading Tooth for Tail
9 Unfriendly Fire
A "Reasonable Choice of Disaster"
The USS Stark
Tragedy over the Persian Gulf
10 The Logistics of Techno-War
The Gulf War
Redefining Effectiveness
Computers and the Transformation of War
11 C3I in Cyberspace
The Ways and Means of Modern Warfare
Moving toward Cyberspace
The Virtual Battlefield
12 Invisible Idiots
Standardization and Slack
Virtual Organizations in a Real World
This morning I got a call from a computer. The local telephone company had just repaired a defective line, and its computer was calling me to ask how satisfied I had been with the service. Somewhat amused by the role reversal, I dutifully punched the buttons of my touch-tone phone when requested, evaluating the promptness, efficiency, and quality of the work done. Only after I hung up did I realize that the reversal of roles had only been symbolic. It didn't matter whether I called the computer or it called me. In either case, I have learned to adapt my behavior to comply with the electronic menu, to conform to the specifications of a machine.
As the electronic digital computer and the networks it supports become ever more deeply embedded as constituent elements of life in modern industrial societies, stories about the frustrations and problems of dealing with computers, from automated voice mail to airline reservation systems, have become increasingly common. But even when really amusing, such stories generally deal with the roughness of the human-machine interface and the inherent inability of preprogrammed, automated systems, however clever, to deal effectively with the variety and unpredictability of human beings.
There are other, more consequential stories that are hardly ever told. When my travel agent was unable to book a flight that I wanted because her flexibility and range of choice were subordinated to the nationwide Sabre system, and the local airline office could not because it was blocked by arbitrary but firmly programmed rules, I was able to do so on my own by calling the frequent flier desk of the airline and speaking with a young woman to whom the booking computer was an accessory and an aid, not a confining and superordinating system.
When the library at my university installed a bar-coded checkout system for books, they also put into place a supermarket-like inventory control system that automatically identifies books not recently checked out and sends them to storage. But university libraries are not supermarkets, university collections are not used in the same way as public libraries, and not all scholarly fields are equally time-bound, or heavily populated. One of the unintended consequences of the new system is that many important books in the more leisurely or esoteric fields of traditional scholarship (e.g., medieval studies) were moved to a remote warehouse, where access was difficult and the chance of finding them by walking the shelves remote.
When my wife could not find a particular journal series on medieval English indexed on the library's elaborate computer system, she was told that the index was purchased from a commercial supplier, who tended to correlate the depth and detail of indexing with the laws of supply and demand. This privileges users in engineering and the sciences over scholars of literature and history, whose demand is smaller and less coherent in space or time.
These examples illustrate the long-term and indirect costs of preprogrammed automation and computerized efficiency. The more powerful a data management program, the greater the requirement that data be entered in certain specific and structured ways; what does not fit must be reshaped or discarded. The more structured the data entry, the more confining the rules and possibilities for searching. The larger and more established the database and its rules, the more difficult to modify or extend them. Eventually, the machine's rules reign.
The adoption of computers to organize, manage, integrate, and coordinate a wide variety of human activities has greatly augmented human capabilities and increased the scope and connectivity of human activities. But at what cost to resiliency and adaptability? Office networking allows considerable interactive flexibility, as do market transfer systems, electronic banking, and the Internet, but the requirements for compliance and strict adherence to standards and protocols are stringent. Just-in-time industrial systems offer great flexibility in manufacturing, but, as was discovered in the recent Kobe earthquake, they are essentially inoperable if the electronic system that coordinates and schedules the required network of tightly coupled activities is damaged or destroyed.
Enter the Computer
The argument of this book is that the complacent acceptance of the desktop "personal" computer in almost every aspect of modern life is masking the degree to which computerization and computer networking are transforming not just the activities and instruments of human affairs, but also their structure and practice. As they become familiar, indeed ubiquitous components of appliances, communication, work processes, organization, and management, computers are increasingly regarded as no more than exceedingly capable and complex tools. And humans seem always to regard what they have made as something that they can therefore control. That our history has been shaped by the form and use of our tools in ways totally unanticipated by their inventors is, as always, conveniently forgotten.
It is not surprising that the ordinary person hardly pauses to reflect on the rapidity and scope of the transformation of social structures and culture that are resulting from the widespread acceptance of the digital computer. There seems to be little doubt about the value of collecting and indexing more information than any one of us could possibly scan, let alone digest, in a lifetime; of instant and virtually unimpeded global communication; or of automating difficult and complex industrial operations. Modern societies have made a totem of their hardware, cloaking dependence by transforming what it is necessary to own into what it is merely desirable to have, and disguising the imperatives of compliance as no more than a set of operating rules.
As the computer has passed from novelty to ubiquity, one of the most identifiable characteristics of its reconstruction of the human world has been a flood of accompanying propaganda that verges on adoration. Newsstands are filled with magazines tutoring us on the latest trend in software, the most appropriate hardware, the latest modes of interconnection, while newspapers report with breathless earnestness the new levels of empowerment to be reached with the technological breakthrough of the week. Their idols, and perhaps ours, are the visionaries of Intel, of Xerox PARC and Apple, of Lotus and the Internet, who pulled the digital computer out of their technological temples and onto the desktop. And floating above it all, you find the techno-metaphysicians in search of a larger, more profound meaning to it all: the Tofflers and their "third wave" and the "informatic society," Bruce Mazlish and the "fourth discontinuity," and the wizards of the world of artificial intelligence in peripatetic pursuit of a machine "who" really thinks.
The remarkable clockwork automata of the eighteenth century were capable of inspiring a fascination bordering on awe in a mechanical age when even the human body was regarded as perhaps little more than an elaborate machine. And in a later age of electronics and information, so were the huge mainframes of the early days of computing--remote, forbidding, and, perhaps, capable of taking over human societies through their superior intelligence and calculated rationality.
The early fear of computers was focused on the idea of large, discrete, immensely powerful thinking mechanisms, run by mysterious engineers in white coats, capable of becoming autonomous decision makers if not closely watched and supervised, and possibly of acting through our centralized and hierarchical institutions to take control of human affairs. What was not anticipated was that simplified robots would be employed everywhere as standard production machines, and that smaller, cheaper, more flexible, and more adaptable electronic digital computers--more powerful and more intricately networked than their inventors could ever have imagined--would be common tools in every business and office, moving into almost every school, and familiar, if not present, in almost every home.
Large, centralized computers are now on their way to becoming specialized rarities, and it is the movement of smaller, more adaptable, and far less expensive "computers" onto desktops and into increasingly smart "machines" seemingly harmless and hardly remarked upon that is becoming the agent of transformation, altering not only the range and scope of decisions and choices but the methods and processes by which they are made.
Of even greater importance is the rapid growth of interconnecting networks of communication and information, created by the new capabilities and supported by them, which are bypassing both the structure of centralized institutions and the controls, internal and external, that developed to direct, manage, and regulate them.
What makes the process so elusive to characterize and difficult to analyze is that the conquest of human decision processes and procedures is taking place through the transformation of the means and representation of interaction rather than through the more direct and potentially adversarial processes of displacement of authority or assertion of control. Indeed, some of the harshest critics of traditional industrial societies, of the "modernist" vision of industrialization and development, are found among the enthusiasts of modern computerization and networking, arguing that individual computing is not just a useful but a necessary resource, an indispensable tool not only for dealing with the forbidding complexity of modern society but also for gaining access to the explosive growth in human knowledge.
The consequences of the increased social reliance, and, in many cases, dependence, on computerized information systems, computerized data processing, and computer-aided decision making are therefore likely to be neither direct nor obvious. There is no sign that computers are, in the sense of directly controlling our lives, "taking over" the conduct of human affairs either autonomously or as agents of human organization. Instead, they are creating patterns of reliance and dependency through which our lives will be indirectly and irrevocably reshaped.
Compliance and Control
For the greater part of this century, the search for efficiency and rational organization of space and time was the essence of modernity. Synchronicity, the rational ordering and coordination of work by planning and authority, was to be established through the planning of work and the formal centralization and hierarchical ordering both of the workplace itself and of its management and administration. The gains in efficiency from coordinated production and standardization and rationalization of the workplace were set against the presumably wasteful and disorganized artisanal and small-shop experience of the past. Efficiency losses caused by alienation, deskilling, and lack of flexibility and adaptability were by comparison judged to be small. The first mainframe computers were admirably suited to that environment and, if anything, increased the centralization of administration, management, and control.
The introduction of the small digital computer, whether as a dedicated mid-frame for business or as desktop personal computers and individual workstations, was promoted as a form of electronic liberation from the powerful and centralized computer center. One of the most persistent arguments for social benefits of the introduction of computers and computer-aided machinery revolves around the argument that the personal computer is an instrument of social and perhaps political democratization, a means for providing a flexible, adaptable work environment more like the historical assemblage of craft workers that made up early factories than the mega-production lines of mid-century.
But historical artisan and craft production took place in an environment in which information was difficult to obtain and slow to propagate. The introduction of the computer into business and industry also provided the firm with far greater capacity to gather, order, and disseminate information, almost without practical limit. The era of the free-standing personal workstation lasted only a few years before the usual arguments for efficiency, coordination, and synchronization led to their interconnection, creating networks whose connectivity and flexibility far overshadow the simple hierarchical structure of the mainframe. As a result, it is now possible to control and coordinate process and production without imposing the static and mechanized form of organization of workplace and administration that so characterized the synchronistic approach.
What the computer transformation of business and industry has done is to maintain the appearance of continuing the trend toward decentralization, to further reduce the visible hierarchy and formal structures of authoritarian control while effectively and structurally reversing it. Instead of the traditional means of formalization, fixed and orderly rules, procedures, and regulations, the modern firm uses its authority over information and network communications to put into place an embedded spider web of control that is as rigorous and demanding as the more traditional and visible hierarchy. Because of its power and flexibility, the new control mechanism can afford to encourage "empowerment" of the individual, to allow more individual discretion and freedom of action at the work site, and still retain the power to enforce the adjustments that ensure the efficiency of the system as a whole.
If democracy truly depends upon free and ready access to information and unfettered interpersonal communication, the social effects of recent developments in computer networks have indeed been democratizing, in the sense of empowering individuals to make better judgments and arrive at better decisions--at least for those with sufficient training, experience, and education to use them for social development, interaction, and personal growth rather than for conversation and entertainment. But the unit of analysis for this argument is the individual, and the social context is one in which the individual has the power to put the information, and the communication, to effective use.
If democracy is instead defined in terms of power, of the balance between individual autonomy and centralized coordination, the results are at best mixed. In the personal and traditional public political realms, the computer is a potentially useful tool, both for avoiding coercion or deception by others seeking power and for enabling better action by groups as well as individuals. But the myth of democratizing technologies developed first and foremost for the introduction of "intelligent" machines into the workplace has been applied more forcefully for the introduction of computers than for any other historical case. Promoters argue that both workers and managers will have better information, more autonomy, more flexibility, and greater control.
This is neither a unique claim nor a modern one. The historical record of the introduction of new techniques, and new technical systems, into factories, offices, and other workplaces is full of parallels that suggest that the democratizing claim is frequently made, and the democratizing effect does indeed frequently manifest itself during the early phases of introduction. But the democratizing phase is just that, a transient phase in the evolution of the introduction of new technology that eventually gives way to a more stable configuration in which workers and managers find their discretion reduced, their autonomy more constrained rather than less, their knowledge more fragmented, and their work load increased--with an added burden of acquiring and maintaining a larger and more complex body of required knowledge.
Whatever the initial distribution of power or authority, redistribution rarely diffuses any of it to those at the bottom of the organizational hierarchy. The price of increased worker autonomy has always been either to contract and forego the protective net of insurance and retirement that support regular employees, or to accept new rules that strictly bound and shape the range and character of the new domain of "autonomous" behavior. As computerization penetrates business and other organizations, those rules and bounds are increasingly imposed by technical rather than operational cadres, by middle managers more skilled in computers than in practice and by designers external to the firm.
Those who are trying to defend the boundaries of human expertise and integrative judgment in operation and process are increasingly challenged by a new breed of neo-Taylorists who seek to automate everything in sight in the name of reliability, efficiency, and progress. But because it is difficult in an era that equates rationality with calculation to defend those apparently nonfunctional parts of the work experience from which expertise is synthesized, or the types of indirect and difficult-to-quantify inputs that create and sustain cognitive integration, the outcome of introducing computers into a complex task environment cannot be fully predicted in advance and is not really well and completely understood.
When and if they are carefully and sensitively introduced, computers can greatly augment the use of human capabilities. They can perform partial integration of data, simplifying the display of information. They can master and simplify many kinds of complexity and deal with them routinely. They can also release operators, pilots, market analysts, and managers from the time-consuming chores of daily, repetitive tasks, freeing their time to deal with more challenging and pressing matters. If such uses can remain stable over time, people will eventually learn to incorporate them into their cognitive and representational frames and become comfortable, even expert, in the new working or social environment.
But the nature of computers, and of their use, makes it unlikely that the environment will remain stable, particularly as computers become more deeply embedded into the structure of operations and decision making. Computers not only create new potentialities and new options; they also allow for the creation and maintenance of new linkages and new domain boundaries, which in turn create new modes and degrees of interconnectivity and response that create new sets of problems. The complexity and demands both of immediate tasks and of their linkage to other tasks increases, as does the interaction with social and political environments that give them meaning and structure. Moreover, because computers are inherent promoters of efficiency, and because the people who run the organizations and activities in question are constantly seeking to improve it, the tendency to tighten coupling, to reduce slack and time of response, also tends to increase.
These effects are almost certain to increase the difficulty of acquiring experiential expertise by complicating and confusing the processes of cognitive integration by which difficult and complex tasks have always been managed. If present and recent past experience is any guide, one response to the increasing difficulty of human control will be attempts to insert computers even more deeply and thoroughly into the process, thus creating a reinforcing feedback loop.
At first, such trends were highly visible, and in some cases they could be arrested, reversed, or at least examined. But the more deeply the computers become embedded, the more they shape and frame the representation of the work to be performed, the more subtle and hard to identify these effects will become.
The Structure of the Argument
I began the inquiry that led to this book with a direct question: Are the computers taking over? As I pursued it, it became clear that the answer, or answers, are far more complex and indirect than I first realized. If the question is put in the classic and traditional sense of loss of human autonomy and technical determinism, the answer is certainly no. Computers are not intelligent, they are not autonomous, and they would not know how to run human affairs even if they could comprehend them.
However, in the subtle and more indirect sense of having infiltrated a wide range of human activities in ways that we have not appreciated, and with consequences we have yet to understand, the answer may well be, yes. They have already taken over in communications, in finance and banking, in the military, in the cockpit, and, increasingly, even in industry. But taking over in this context is measured in terms of growing dependency, growing vulnerability, and, more importantly, by socially constructed adaptations and responses that make it difficult to imagine the shape of modern society without them.
Whether what is occurring is or is not "revolutionary" in its implications, it is clearly of considerable social and political importance because of the cumulative, and probably irreversible, reconstructions of social organizations, social formations, and even social systems that are taking place. What I argue is that the changes are being driven by short-term goals for presumptive short-term benefits, balanced against short-term costs, with little guidance or understanding of the long-term costs and consequences.
The argument proceeds on three levels, interwoven to various degrees through the chapters that follow. The first is concerned with the direct effects of the introduction of computers into communication, information, and networking, areas where the technical progress of the past few years would simply have been impossible without the introduction of computers as constituent elements. It deals primarily with the origins and means of changes in capability and connectivity.
The second level explores the ways in which the new technical capacities and opportunities have interacted with the social and organizational environments in which they have been put to use to socially construct new organizational forms, and new types of behavior, while eroding or deconstructing others of long standing, thereby transforming structure as well as function. Just as the mechanization of work transformed its meaning and representation, moving the worker from the producer of goods or services to being the operator of production machinery, the process of computerization is causing one more step of removal, from being the operator of a machine or process to being the controller and manager of the computer that actually operates the machine or controls the process. It is, in the deepest sense, a reconfiguration of the representation of work, whose consequences are still being explored.
The third level is an exploration of longer-term systemic and societal implications, following the question of loss of expertise and experiential knowledge from the factory floor to the cockpit of a commercial airliner, and from the floor of the stock exchange to the command of military forces. The last two are particularly interesting because they stand at the verge of virtual reality, of conflict in "cyberspace," a universe defined and bounded by interactive electronics. The cyberspace the military is exploring is that of electronic warfare, surveillance, and control, in an environment where failures and errors are very consequential in the very real world. That of the stock market is one of computerized interaction and trading in concepts and symbols rather than rockets and missiles, but can be equally consequential in its own domain.
The Structure of the Book
Following a brief introductory chapter, chapters 2 and 3 review the history of personal computing and of interactive networking, with a particular emphasis on social formations and embedded purpose, leading to a consideration in chapter 4 of the implications for the emergence of new modes of organizational control, indirect and diffused instead of direct and centralized, but with overtones of Taylorism and scientific management nonetheless.
Chapters 5 and 6 explore the world of financial markets and computerized trading, which has been penetrated most rapidly by computerization and seems closer to the promise of virtual reality than any other economic sector. Chapter 7 pursues further the problem of loss of expertise, particularly in safety-critical applications such as air traffic control and airliner cockpits.
The next four chapters explore computerization in another leading-edge technical sector, militaries and military technologies. Chapter 8 sets out in brief the history of military computerization, while chapters 9 and 10 identify some of the problems that have been found in the field and relate them to similar problems that might occur in civil society if and when it becomes similarly committed to computerization of operation and control. Chapter 11 is a reflection on the movement toward virtual organizations, using as a case study the military's pursuit of war in cyberspace.
Taken together, these chapters paint a picture of computerization that is spreading rapidly in the large, that promises distributive benefits while insisting that costs are local and capturable. Computers are becoming accepted not just as useful, but as necessary, not as exotic and remote objects, but as the appliances of everyday life. What results is a structural paradox. The lack of concern over whether computers are "taking over," the acceptance and embedding of the computer as a constitutive technical element of social as well as industrial systems, seems also to imply the casual and often uncritical acceptance of structural and representational transformations that follow.
In The Second Self, Sherry Turkle provides a revealing example of her supplying children with a smart "Say it" machine that was programmed to complete a ten-word playback cycle even after the machine had been turned off. The children were not amused. As Turkle points out: "The 'Say it' bug contradicts our most basic expectations of a machine. When you turn the switch to 'Off,' machines stop. The cliché response to people's fears about computers 'taking over' is that you can always 'pull the plug.'"
But can we? As computers become more deeply embedded, as an increasing number of social functions and activities come to depend structurally as well as functionally on the power of computers, there is no way that we can pull the plug, at least not without incurring substantial costs.
This growing and irreversible commitment to the computer as an essential building block of modern society is not limited to the specialized uses and users familiar to the case studies of this book and so many others. Deeply embedded computerization is not limited to the military, or to specialized industrial enterprises or financial markets. It is already difficult to imagine American society without computerization; soon it will be nearly impossible. And when new representations of what it means to be and act in society are constructed, the computers will be built right in.
If this results in highly complex, safety-critical systems that fail in unforeseen ways; markets whose movements seem arbitrary and uncontrollable; businesses in which authority is spread everywhere but responsibility localized nowhere; military systems that demand compliance with preprogrammed assumptions to fight effectively; and organizations and systems of every type that can no longer function effectively if the computer or network goes down, we will indeed be caught in the computer trap.