Computer Ethics / Edition 4

Paperback (Print)

Overview

Written in clear, accessible prose, the Fourth Edition of Computer Ethics brings together philosophy, law, and technology. The text provides an in-depth exploration and analysis of a broad range of topics regarding the ethical implications of widespread use of computer technology. The approach is normative while also exposing the student to alternative ethical stances.


Product Details

  • ISBN-13: 9780131112414
  • Publisher: Pearson
  • Publication date: 1/5/2009
  • Series: Alternative eText Formats Series
  • Edition description: Revised
  • Edition number: 4
  • Pages: 216
  • Sales rank: 667,493
  • Product dimensions: 6.00 (w) x 8.90 (h) x 0.50 (d)

Meet the Author

Deborah G. Johnson is the Anne Shirley Carter Olsson Professor of Applied Ethics and Chair of the Department of Science, Technology, and Society at the University of Virginia. Johnson has devoted her career to understanding the connections between ethics and technology. She received the Barwise Prize from the American Philosophical Association in 2004; the Sterling Olmsted Award from the Liberal Education Division of the American Society for Engineering Education in 2001; and the ACM SIGCAS Making a Difference Award in 2000.

Keith W. Miller is a Professor of Computer Science at the University of Illinois at Springfield. His work in software engineering and computer ethics provides complementary perspectives on questions that challenge computer professionals. He is the editor-in-chief of IEEE Technology and Society, and helped develop the Software Engineering Code of Ethics and Professional Practice. He was named a University of Illinois Scholar in 2000 and received the ACM SIGCAS Outstanding Service Award in 2006.


Read an Excerpt

PREFACE

With the publication of the third edition of Computer Ethics, I am reminded of the day in 1984 when I received the page-proofs of the first edition. I had just returned home from the hospital after having given birth to my daughter. I had composed the book on an Osborne computer using a word processor—I think it was called WordStar—that has been obsolete for more than 10 years now. Today my daughter, now a teenager, is more comfortable with computers than I am. She spends a good deal of her day sitting in front of a computer screen chatting with friends, doing schoolwork, and exploring the Web. I composed this edition of the book on a laptop computer using a version of MS Word that automatically corrected my misspellings and grammar. And, of course, in writing this edition of the book, I frequently went to the Web to look for resources and check references. While I continue to be cautious in making grand pronouncements about the significance of these technological changes for the quality and character of human lives, the changes that have taken place in these 16 years are awe-inspiring.

As I began writing this edition, it was strikingly clear that my primary task was to address the technological changes that have occurred since the second edition, especially the growth and penetration of the Internet into so many domains of life. What are we to make of Web sites, cookies, data mining tools, customized online services, and e-commerce? I have addressed many of these new issues while at the same time holding on to what I continue to believe are the core issues in computer ethics: professional ethics, privacy, property, accountability, and social implications and values. Indeed, you will see that in Chapter 1, I continue to struggle with the question at the heart of the field, what is computer ethics? Are the ethical issues surrounding computers unique? What is the connection between ethics and technology?

Contemplating the connection between technology and ethics raises an interesting and important question: Does the field of computer ethics simply follow the development of computer technology? Should computer ethicists simply react to technological developments? Wouldn't it be better if the sequence were reversed so that technological development followed ethics? Historically, the field of computer ethics has been reactive to the technology. As I explain in Chapter 1, new technological developments create new possibilities and the new possibilities need to be evaluated. As in the last edition, I build on the idea in Jim Moor's seminal piece "What Is Computer Ethics?" (1985) that new technologies create policy vacuums. The task of computer ethics, he argues, is to fill these policy vacuums. In a sense, the ethical issues are the policy vacuums, and policy vacuums are created when there is a new development or use of computer technology.

On the other hand, I want to suggest that it would be better if at least some of the movement were in the other direction—technology following ethics. Suppose, that is, we lived in a world where ethicists (or anyone, for that matter) identified potentially unethical situations or arrangements or ethically better possibilities, and engineers and computer scientists went to work designing technologies to change or remedy or improve the situation. I can think of a few examples when this has occurred, but only a few. Arguably, privacy-enhancing technologies and anonymous re-mailers are cases in point. Perhaps freeware and shareware are also examples. For the most part, however, the ethical issues have followed, rather than led, the technology. Here in very broad brushstrokes is my understanding of the evolution of the field of computer ethics, especially in the United States.

HISTORICAL OVERVIEW

In the decades immediately following World War II, ethical concerns were raised about computers, though these concerns were only vaguely expressed and articulated. One of the most salient concerns was that computers threatened our notion of what it means to be human because computers could do the very thing that was considered unique to humans, rational thinking. There was much discussion of artificial intelligence. There was some fear (and fascination with the idea) that computers might take over decision making from humans. I am thinking here of the movie 2001 but the theme also ran through science fiction literature, for example, in Isaac Asimov's short stories. Somewhat later, Jim Moor picked up on this theme and wrote an analytical article, "Are There Decisions That Computers Should Never Make?" (1979).

It could be argued that those very early concerns about computers were not exactly ethical in character. For example, no one explicitly argued that it was immoral to go forward with the development of computers because of the threat to our concept of human beings. And the science fiction literature did not suggest that it was immoral to turn over decision-making power to computers. Rather, the implicit argument seemed to be that there would be terrible consequences—possible catastrophes and degradation of human life—were decision making to be turned over to computers.

These concerns did not come from an effect arising from the use of computers; they arose from the mere idea of computers. The very idea of a technology that could think or do something very close to it was threatening to our understanding of what it means to be human.

Ironically, it could be argued that this idea, the idea that computers do what humans do, has turned out to be rich in its influence on human thinking about thinking, rather than a threat. The model of human thought that computers provide has spawned the thriving new field of cognitive science and changed a number of related disciplines. (See for example, Bynum and Moor, 1999.)

In the late 1970s, the ethical issues began to be more clearly articulated in the works of Joseph Weizenbaum (1976) and Abbe Mowshowitz (1976), and it was in this period that the Privacy Protection Commission did a major study of privacy. The issues that took shape in this period had to do with the threat of big government and large-scale organizations, the related threat to privacy, and concern about the dominance of instrumental rationality. In hindsight, the concern about big government and privacy followed the technology in that in those early days, computers were being used extensively to create and maintain huge databases, databases of a variety of kinds, but especially databases of personal information. Computers were also being used for large numerical calculations. The large-scale calculations were primarily (though not exclusively) for government activities such as weapons development, space travel, and the U.S. census.

The next major technological shift was the development of small computers (microcomputers and personal computers). Attention turned, for a time at least, to the democratizing aspects of computers. Quietly, at the same time, remote access had come on the scene, first as remote access to large mainframes, later as a web of telecommunications connections between small computers.

Attention turned to software and the ethical issues surrounding it. The development and spread of microcomputers brought computer technology visibly and powerfully into the consumer marketplace. Software was recognized as something with enormous market value, and hence, all the ethical issues having to do with property arose. Should software be owned? If so, how? Would current intellectual property law provide adequate protection? Along with property rights issues came issues of liability and responsibility. In the marketplace, if consumers buy and use computers and software, they want to be able to rely on these tools and when something goes wrong, they want to know who to blame or they want to be compensated for their losses.

During this period, the market in computer games took off and it was also during this period that more attention began to focus on hackers. On the one hand, hackers were responding to the commercialization of computing. They did not like the idea of property rights in software. At the same time, those who were acquiring property rights and/or making a business of computing saw the threat posed by hackers, a threat to property rights and to system security.

In the 1990s, attention turned to the Internet. The coming together of computers, telecommunications, and media was the next major development in the technology. The development and expanded use of the Internet brought a seemingly endless set of ethical issues as the Internet came to be used in so many different ways in so many different domains of life. In effect, we are now in a process of transferring and re-creating much of the world into this new medium. At the same time, the Internet also raised all the concerns of the past. Privacy issues are exacerbated on the Internet; the democracy issue came back into play with new claims about the Internet's democratic character; property rights expanded to Web sites and global property rights became ever more important; and so on.

One other technological development that grew slowly during the 1980s and 1990s was the use of computer technology for a wide variety of visualization activities—not just computer graphics and gaming, but simulation activities including medical imaging and scientific modeling. This development expanded into the idea of virtual reality, an idea that has captivated many. Very quietly and slowly, ethical concerns have been raised about this thrust of computer technology. Unfortunately, I have been able to give only cursory attention to virtual reality issues.

In summary, during the 1960s and 1970s the dominant uses of the technology were for database creation and large-scale calculations. These uses of the technology brought correlated expressions of concern about centralization of power and big government, and threats to personal privacy. During this time, the very idea of computers seemed to threaten the idea of what it means to be human. During the 1980s, microcomputers were developed and made readily available. Remote access to large mainframe computers also became possible. Quietly, the system of telecommunication lines linking computers, that later became the Internet, was expanding and being made available beyond the "inner circle" of developers. Also, the computer/video game industry began to take off. With these developments came correlative concerns about property rights, liability issues, and the threat posed by hackers. In the 1990s, the coming together of telecommunications and computers reached a pinnacle of development and the Internet and the World Wide Web (Web) became widely available. These technological developments are still being assimilated, but they gave rise to a seemingly endless array of ethical issues as well as exacerbating those that were already there.

This is a story of computer ethical issues following technological developments. The question remains whether this pattern is as it should be. As I suggested before, reversing the order would seem to have some advantages, though scholars in the field of computer ethics do not seem to recognize the possibility of leading rather than following the technology. A central focus on the topic of design of computer technology would go a long way toward reversing this pattern. If the designers of technology were to think about the ethical and social implications of their designs before they became a reality, wouldn't the world be a better place!

CHANGES IN THE THIRD EDITION

Readers who are familiar with earlier editions of Computer Ethics will note that in this edition I have added two chapters specifically focused on the Internet, Chapter 4 "Ethics Online" and Chapter 8 "Social Implications and Social Values." The addition of this new material led to other changes in the organization of the book. First, instead of having a separate chapter on crime, abuse, and hacker ethics, I have situated the discussion of hackers and hacker ethics in the first chapter on the Internet, Chapter 4. This placement recognizes that hacking is a phenomenon made possible by the combination of computers and telecommunications lines that we now call the Internet. In 1994 when the second edition was published, the Internet had already been created, but it was far from clear that it would become what it has. Second, instead of having one chapter on the social implications of computer and information technology and another on the social implications of the Internet, I have combined material on the social implications of computer technology from the second edition with new material on the Internet. While I discuss both, the primary focus of Chapter 8 is on the social implications of the Internet and especially its social implications for democracy. I found this approach useful for focusing discussion of the relationships between technology and social change and between values and technology.

As with previous editions, there are many possible paths a reader might take through the book. The topics from chapter to chapter are interconnected, but each chapter has been written to stand essentially alone. When used as a textbook, the path students take through the book should be determined by the type of students being taught, the length of the course, and other books and materials being used in the course. For example, when teaching a class of computer science majors, it is important that the chapter on professional ethics be read early on. This sets students up to think of the issues as part of their professional responsibility. When teaching nonmajors, this chapter can comfortably be read at the end of a course, and can be presented as a way of thinking about how some of the issues discussed in the book might be addressed—control who becomes a computer professional and give computer professionals more responsibility for the effects of their work.

As in the previous editions, I have started each chapter with a set of short scenarios. The scenarios are intended to entice the reader into the topic, to implicitly make the case for the importance of the topic, and to make the topic concrete for those who are impatient with theory. The cases also provide the content for teaching skills in ethical analysis. As before, I have provided study questions and suggested further reading at the end of each chapter.

OVERVIEW

Chapter 1: Introduction: Why Computer Ethics?
In Chapter 1, I make the case for the importance of computer ethics and I explore why computer and information technology raises ethical questions when many other technologies do not. Building on Moor's idea that the task of computer ethics is to fill policy vacuums, I describe generally how computer and information technology gives rise to ethical issues. I push further, addressing how these issues can be resolved, and explore the traditionalist account, which holds that we can extend ordinary moral principles to situations created by computer technology. This discussion prepares the way for asking in what ways computer ethical issues are unique and in what ways not. As in the last edition, I argue that it is useful to think of the ethical issues surrounding computer and information technology as new species of generic moral issues. I support this idea by arguing that while ethics is always about human action, technology instruments human action and makes it possible for individuals and institutions to behave in ways they couldn't behave without technology. Traditional ethics and ethical theories have largely ignored the instrumentation of human action. Computer ethics brings this unexplored area of ethics into focus. I conclude this chapter with a brief discussion of the virtues and dangers of using analogies in analyzing computer ethical issues.

Chapter 2: Philosophical Ethics
This chapter is largely as it was in the second edition though I have added brief descriptions of virtue ethics and John Rawls' theory of justice. As before, the aim of this chapter is to show that ethics is not just a subjective and relativistic enterprise. The aim is to show that ethics and ethical analysis involves giving reasons and making arguments for one's claims and subjecting those reasons and arguments to critical evaluation. By reviewing traditional ethical theories, the chapter provides readers with a useful vocabulary and conceptual tools for thinking through ethical issues. The chapter is not intended to provide the theoretical framework from which answers to all ethical questions can be deduced. Rather, the aim of this chapter is to suggest that ethical analysis is a dialectic process.

Chapter 3: Professional Ethics
The organization of this chapter and the ideas explored are fundamentally the same as in the last edition of Computer Ethics though I have tried to clarify the ideas further, and I have updated the chapter by addressing the issue of licensing of software engineers. I have also recognized recent changes to the ACM code of ethics. The chapter begins with a discussion of how becoming a member of a profession can lead to somewhat special moral rights and responsibilities. That analysis sets the scene for defining profession and professional, and for asking whether computing is a profession. This is followed by a brief discussion of software engineering licensing. The focus of the chapter then turns to the responsibilities of computer professionals, to employers, to clients, to the public, and to co-professionals, and how they come into conflict. The chapter ends with a brief discussion of professional codes of ethics.

Chapter 4: Ethics and the Internet: Ethics Online
I begin this chapter by identifying what is morally significant and distinct about the Internet. Focusing on the Internet as a medium of communication, what seems morally significant is the many-to-many global scope of the Internet, the availability of a certain kind of anonymity, and the reproducibility of the medium. After drawing out the implications of these features, emphasizing the difficulties of accountability and trust, I move on to discuss hacking and hacker ethics. Here I have included some of the material from the previous edition. I conclude the chapter with a discussion of the problems the Internet seems to pose for controlling socially undesirable behavior and for encouraging civil behavior.

Chapter 5: Privacy
Chapter 5 is a combination of old and new material. As in the previous edition, I begin by asking how computer and information technology has changed the collection and distribution of personal information. I describe the traditional way in which the privacy issue has been framed—as necessarily involving a trade-off between individual interests in controlling information and the efficiency and improved decision making of those who can make use of the information. I argue for reframing the issue in a way that recognizes personal privacy not just as an individual good but as a social good, and I try to make clear the importance of privacy for democracy. I conclude the chapter by discussing a variety of possible approaches to improving the protection of personal privacy.

Chapter 6: Property
In the years since I wrote the first edition of Computer Ethics, the property rights issues have gotten more and more complicated. While there is dissatisfaction with current law, in fact, the law has not changed fundamentally. I have come to the conclusion that the most useful approach for an introductory text of this kind is to stay with the fundamentals. Thus, this chapter is very similar to the property rights chapter in the second edition. I begin by describing the problem that ownership of software poses. I describe copyright, trade secrecy, and patent law and the inadequacies each has for protecting computer software. Digging deeper into the problem, I explore the philosophical basis for property rights, looking first at the natural rights arguments and then at the utilitarian arguments for and against ownership. I conclude with an argument similar to the one I made in the second edition: Making an illegal copy of proprietary software is immoral because it is illegal; other kinds of behavior, by contrast, would be immoral even if they were legal. I conclude with a brief discussion of how the Internet is likely to exacerbate property rights issues.

Chapter 7: Accountability
Chapter 7 begins with scenarios that pose a wide range of accountability issues: responsibility for rape in a virtual reality game, accountability when software recommends a decision, liability of Internet Service Providers, and responsibility for the Y2K problem. A discussion of the different meaning and uses of terms such as responsibility, accountability, liability, and blame lays the groundwork for the chapter. The focus then turns to the legal environment for the buying and selling of computer and information technology where the distinction between selling a product and providing a service is pivotal. The remainder of the chapter is devoted to issues that are unique to computer and information technology, especially the diffusion of accountability, the Y2K problem, and Internet issues.

Chapter 8: Ethics and the Internet II: Social Implications and Social Values
Chapter 8 is the second chapter devoted to the Internet. Drawing on material from the second edition, I begin this chapter with a general discussion of technology and social change and identify the pitfalls in asking questions such as, Is computer and information technology causing a social revolution? Is it changing things or reinforcing the status quo? Is technology good or bad? Attention is then focused on values embedded in computer and information technology. I emphasize how value-laden technology is. The chapter turns, then, to the Internet and democracy. I examine the arguments that are made to show that the Internet is a democratic technology and I critique these arguments.

Computer and information technology and especially the Internet have been implicated in the widening gap between haves and have-nots within countries and among countries of the world. After examining this issue which has come to be known as "the digital divide," I briefly discuss the gender gap in computing, and then I turn briefly to the value of freedom of expression. I conclude this chapter by pointing to three issues that will be particularly important to watch in the future: jurisdiction, systems of trust, and insularity.

Practical Ethics
Finally it may be helpful to explain what I have and have not tried to do with regard to ethical theory. I have not aimed to provide "the" ethical theory from which all ethical beliefs should be derived. Nor have I tried to provide the only possible or adequate ethical analysis of any particular issue. There are both pedagogical and theoretical reasons for taking this stance.

Pedagogically, I believe it is essential for students and other readers to struggle with the cases, the issues, and the relevant moral concepts and theories. This is crucial to developing skill at ethical analysis and critical thinking, and to developing moral personhood. Hence, it would be somewhat counterproductive to present (what I claim to be) "the" final answers to all the ethical questions raised in this book. Instead I have left a good deal of room for further struggle with the issues. I have tried to present ethics as an ongoing, dialectic enterprise. My analyses are intended to move the discussion forward but not to end it.

Moreover, this book is an undertaking in practical ethics, and practical ethics is the middle ground where abstract ethical theories and concepts meet real-world problems and decisions. It takes an enormous amount of work to understand what theories mean for real-world situations, issues, and decisions, and in some sense, we don't understand theories until we understand what they imply about real-world situations. Practical ethics is best understood as the domain in which there is negotiation between theory and real-world situations. We draw on moral concepts and theories but we must interpret them and draw out their implications for the issues at hand. In practical ethics, we work both ways, from theory to context and from context to theory. Often a theory or several theories provide illumination on a practical matter; other times, struggle with the practical problem leads to new insight into a theory.

TERMINOLOGY

As in earlier editions, I use ethics and morality interchangeably. Many philosophers draw a distinction between the two, but I find such distinctions too far from conventional usage to be of help.

Since the publication of the previous edition, linguistic conventions around computer technology have changed. Computers have come to be understood as more than simply computational—computing—machines. Moreover, there is a wide variety of computer-related tools for which it may or may not be appropriate to use the term computer. It was difficult to decide what terminology to adopt to refer to the technology on which this book focuses—information technology (IT), communication technology, information and communication technology (ICT), and so on. Indeed, it was tempting to change the name of the book to something like Information Ethics or IT Ethics, but, in the end, I have concluded that it is best to keep the name of the book the same, if for no other reason than that it is a known product. Wherever possible I use the (cumbersome) phrase computer and information technology rather than just computer technology. This more closely reflects current linguistic practice and understanding of the technology. In fact, the focus of the book may more accurately be described as a focus on a family of technologies that deal with a very wide variety of types of information (signals, data, images, words, etc.).

ACKNOWLEDGMENTS

A number of people helped me with this edition by reading chapters and providing comments and suggestions. I particularly want to thank Keith Miller, Fran Grodzinsky, Roberta Berry, Don Gotterbarn, Andy Ward, and Amy Bruckman for their feedback on various chapters. Marc Pang Quek was a wonderful graduate research assistant and wrote several scenarios based on real cases.

I owe much to many others who have helped me in a wide variety of ways. Scholarship is a collective enterprise. I have learned from reading the ideas of others, from listening and talking to students and colleagues. I am always encouraged by those who are willing to tell me what they like and don't like about my writing.

I will always be grateful to my colleagues in the Department of Science and Technology Studies at Rensselaer Polytechnic Institute. My years there shaped my thinking and my career in profound and permanent ways. In 1998, I moved to the School of Public Policy at Georgia Institute of Technology and I have been delighted to find another robust and lively interdisciplinary environment. My new colleagues and students in public policy and in other units of the Ivan Allen College at Georgia Tech have already enriched my thinking.

In writing this edition, I drew on several of my previously published articles: "Ethics Online," published in Communications of the ACM; a chapter entitled "Democratic Values" in Duncan Langford's book Ethics and the Internet; and "Is the GII a Democratic Technology?" presented at several conferences and then published in Computers and Society. I developed Chapter 1 from an unpublished paper that I wrote for the Tangled Web conference.

As I was completing a draft of this edition, Keith Miller, Laurie King, Tracy Camp, and I received a grant from the National Science Foundation's Program on Course, Curriculum and Laboratory Improvement, to develop materials and hold workshops focused on using the Web to teach computer ethics. The first workshop took place in June of 2000. During the workshop I received extremely valuable feedback on a draft of the book. Materials developed as part of the grant are available at: www.uis.edu/~miller/dolce/ and may be useful to teachers who use this book in their courses.

DEBORAH G. JOHNSON


Table of Contents

Contents

Preface vi

Acknowledgments viii

About the Authors viii

Chapter 1 Introduction to Sociotechnical Computer Ethics

Chapter Outline 1

Scenarios 2

1.1 A Virtual Rape 2 • 1.2 Surprises About Social Networking 3 • 1.3 RFID and Caring for the Elderly 4

Introduction: Why Computer Ethics? 5

The Standard Account 7

New Possibilities, a Vacuum of Policies, Conceptual Muddles 7 • An Update to the Standard Account 10

The Sociotechnical Systems Perspective 13

Reject Technological Determinism/Think Coshaping 13 • Reject Technology as Material Objects/Think Sociotechnical Systems 15 • Reject Technology as Neutral/Think Technology Infused with Value 17

Sociotechnical Computer Ethics 18

Micro- and Macro-Level Analysis 21

Return to the “Why Computer Ethics?” Question 21

Conclusion 22 • Study Questions 23

Chapter 2 Ethics and Information Technology 24

Chapter Outline 24

Introduction: “Doing” Ethics 25

Descriptive/Normative 26 • The Dialectic Method 28 • “Ethics Is Relative” 32

Ethical Theories and Concepts 35

Utilitarianism 35 • Intrinsic and Instrumental Value 36 • Acts versus Rules 38

Critique of Utilitarianism 39 • Case Illustration 41 • Deontological Theory 42 • Case Illustration 44 • Rights 46 • Rights and Social Contract Theory 47 • Virtue Ethics 48 • Analogical Reasoning in Computer Ethics 49

Conclusion 51 • Study Questions 51

Chapter 3 Ethics in IT-Configured Societies 53

Chapter Outline 53

Scenarios 54

3.1 Google in China: “Don’t Be Evil” 54 • 3.2 Turing Doesn’t Need to Know 55 • 3.3 Turnitin Dot Com 55

Introduction: IT-Configured Societies 55

Technology as the Instrumentation of Human Action 56

Cyborgs, Robots, and Humans 58

Three Features of IT-Configured Activities 60

Global, Many-to-Many Scope 61 • Distinctive Identity Conditions 62 • Reproducibility 65

IT-Configured Domains of Life 66

Virtuality, Avatars, and Role-Playing Games 66 • Friendship and Social Networking 68 • Education and Plagiarism Detection 70

Democracy and the Internet 72

What Is Democracy? 73 • The Arguments 74 • Is the Internet a Democratic Technology? 76

Conclusion 79 • Study Questions 79

Chapter 4 Information Flow, Privacy, and Surveillance 81

Chapter Outline 81

Scenarios 82

4.1 Email Privacy and Advertising 82 • 4.2 Workplace Spying: The Lidl Case 82 • 4.3 Data Mining and e-Business 83

Introduction: Information Flow With and Without Information Technology 84

Why Care About Privacy? 86

“No Need to Worry” 87 • The Importance of Privacy 90 • Privacy as an Individual Good 90 • Privacy as Contextual Integrity 93 • Privacy as a Social Good Essential for Democracy 95 • Autonomy, Democracy, and the Panoptic Gaze 96 • Data Mining, Social Sorting, and Discrimination 98 • Crude Categories 100 • Summary of the Arguments for Privacy and Against Surveillance 101

Is Privacy Over? Strategies for Shaping Personal Information Flow 101

Fair Information Practices 102 • Transparency 104 • Opt-In versus Opt-Out 104 • Design and Computer Professionals 105 • Personal Steps for All IT Users 106 • A Note on Privacy and Globalization 107

Conclusion 107 • Study Questions 108

Chapter 5 Digital Intellectual Property 109

Chapter Outline 109

Scenarios 110

5.1 Obtaining Pirated Software Abroad 110 • 5.2 Free Software that Follows Proprietary Software 110 • 5.3 Using Public Domain Software in Proprietary Software 111

Introduction: The Complexities of Digital Property 111

Definitions 112 • Setting the Stage 113

Protecting Property Rights in Software 114

Copyright 114 • Trade Secrecy 118 • Patent Protection 119

Free and Open Source Software 122

The Philosophical Basis of Property 124

Natural Rights Arguments 124 • Critique of the Natural Rights Argument 125 • A Natural Rights Argument Against Software Ownership 127

PS Versus FOSS 128

Is it Wrong to Copy Proprietary Software? 129

Breaking Rules, No Rules, and New Rules 133

Conclusion 135 • Study Questions 136

Chapter 6 Digital Order 137

Chapter Outline 137

Scenarios 137

6.1 Bot Roast 137 • 6.2 Wiki Warfare 138 • 6.3 Yahoo and Nazi Memorabilia 139

Introduction: Law and Order on the Internet 140

Sociotechnical Order 142

Online Crime 143

Hackers and the Hacker Ethic 145

Sociotechnical Security 150

Who Is to Blame in Security Breaches? 152 • Trade-Offs in Security 153

Wikipedia: A New Order of Knowledge Production 154

Freedom of Expression and Censorship 156

John Stuart Mill and Freedom of Expression 157

Conclusion 160 • Study Questions 161

Chapter 7 Professional Ethics in Computing 162

Chapter Outline 162

Scenarios 163

7.1 Software Safety 163 • 7.2 Security in a Custom Database 164 • 7.3 Conflict of Interest 164

Introduction: Why Professional Ethics? 165

Therac-25 and Malfunction 54 165

The Paradigm of Professions 167

Characteristics of Professions 168

Sorting Out Computing and its Status as a Profession 171

Mastery of Knowledge 171 • Formal Organization 172 • Autonomy 173 • Codes of Ethics 174 • The Culture of Computing 175

Software Engineering 176

Professional Relationships 178

Employer—Employee 178 • Client—Professional 180 • Other Stakeholders—Professional 182 • Professional—Professional 183 • Conflicting Responsibilities 184

A Legal Perspective on Professionalism in Computing 185

Licensing 185 • Selling Software 186 • Selling—Buying and the Categorical Imperative 187 • Torts 188 • Negligence 188

A Final Look at the State of the Profession 190

Guns-for-Hire or Professionals 190 • Efficacy, Public Trust, and the Social Contract 191

Conclusion 192 • Study Questions 193

Websites 195

References 196

Index 198


Preface


With the publication of the third edition of Computer Ethics, I am reminded of the day in 1984 when I received the page-proofs of the first edition. I had just returned home from the hospital after having given birth to my daughter. I had composed the book on an Osborne computer using a word processor—I think it was called WordStar—that has been obsolete for more than 10 years now. Today my daughter, now a teenager, is more comfortable with computers than I am. She spends a good deal of her day sitting in front of a computer screen chatting with friends, doing schoolwork, and exploring the Web. I composed this edition of the book on a laptop computer using a version of MS Word that automatically corrected my misspellings and grammar. And, of course, in writing this edition of the book, I frequently went to the Web to look for resources and check references. While I continue to be cautious in making grand pronouncements about the significance of these technological changes for the quality and character of human lives, the changes that have taken place in these 16 years are awe-inspiring.

As I began writing this edition, it was strikingly clear that my primary task was to address the technological changes that have occurred since the second edition, especially the growth and penetration of the Internet into so many domains of life. What are we to make of Web sites, cookies, data mining tools, customized online services, and e-commerce? I have addressed many of these new issues while at the same time holding on to what I continue to believe are the core issues in computer ethics: professional ethics, privacy, property, accountability, and social implications and values. Indeed, you will see that in Chapter 1, I continue to struggle with the question at the heart of the field: What is computer ethics? Are the ethical issues surrounding computers unique? What is the connection between ethics and technology?

Contemplating the connection between technology and ethics raises an interesting and important question: Does the field of computer ethics simply follow the development of computer technology? Should computer ethicists simply react to technological developments? Wouldn't it be better if the sequence were reversed so that technological development followed ethics? Historically, the field of computer ethics has been reactive to the technology. As I explain in Chapter 1, new technological developments create new possibilities and the new possibilities need to be evaluated. As in the last edition, I build on the idea in Jim Moor's seminal piece "What Is Computer Ethics?" (1985) that new technologies create policy vacuums. The task of computer ethics, he argues, is to fill these policy vacuums. In a sense, the ethical issues are the policy vacuums, and policy vacuums are created when there is a new development or use of computer technology.

On the other hand, I want to suggest that it would be better if at least some of the movement were in the other direction—technology following ethics. Suppose, that is, we lived in a world where ethicists (or anyone, for that matter) identified potentially unethical situations or arrangements or ethically better possibilities, and engineers and computer scientists went to work designing technologies to change or remedy or improve the situation. I can think of a few examples when this has occurred, but only a few. Arguably, privacy-enhancing technologies and anonymous re-mailers are cases in point. Perhaps freeware and shareware are also examples. For the most part, however, the ethical issues have followed, rather than led, the technology. Here in very broad brushstrokes is my understanding of the evolution of the field of computer ethics, especially in the United States.

HISTORICAL OVERVIEW

In the decades immediately following World War II, ethical concerns were raised about computers, though these concerns were only vaguely expressed and articulated. One of the most salient concerns was that computers threatened our notion of what it means to be human because computers could do the very thing that was considered unique to humans, rational thinking. There was much discussion of artificial intelligence. There was some fear (and fascination with the idea) that computers might take over decision making from humans. I am thinking here of the movie 2001, but the theme also ran through science fiction literature, for example, in Isaac Asimov's short stories. Somewhat later, Jim Moor picked up on this theme and wrote an analytical article, "Are There Decisions That Computers Should Never Make?" (1979).

It could be argued that those very early concerns about computers were not exactly ethical in character. For example, no one explicitly argued that it was immoral to go forward with the development of computers because of the threat to our concept of human beings. And the science fiction literature did not suggest that it was immoral to turn over decision-making power to computers. Rather, the implicit argument seemed to be that there would be terrible consequences—possible catastrophes and degradation of human life—were decision making to be turned over to computers.

These concerns came not from any actual effect of computer use; they arose from the mere idea of computers. The very idea of a technology that could think, or do something very close to it, was threatening to our understanding of what it means to be human.

Ironically, it could be argued that this idea, the idea that computers do what humans do, has turned out to be rich in its influence on human thinking about thinking, rather than a threat. The model of human thought that computers provide has spawned the thriving new field of cognitive science and changed a number of related disciplines. (See for example, Bynum and Moor, 1999.)

In the late 1970s, the ethical issues began to be more clearly articulated in the works of Joseph Weizenbaum (1976) and Abbe Mowshowitz (1976), and it was in this period that the Privacy Protection Study Commission did a major study of privacy. The issues that took shape in this period had to do with the threat of big government and large-scale organizations, the related threat to privacy, and concern about the dominance of instrumental rationality. In hindsight, the concern about big government and privacy followed the technology in that in those early days, computers were being used extensively to create and maintain huge databases of a variety of kinds, but especially databases of personal information. Computers were also being used for large numerical calculations. The large-scale calculations were primarily (though not exclusively) for government activities such as weapons development, space travel, and the U.S. census.

The next major technological shift was the development of small computers (microcomputers and personal computers). Attention turned, for a time at least, to the democratizing aspects of computers. Quietly, at the same time, remote access had come on the scene, first as remote access to large mainframes, later as a web of telecommunications connections between small computers.

Attention turned to software and the ethical issues surrounding it. The development and spread of microcomputers brought computer technology visibly and powerfully into the consumer marketplace. Software was recognized as something with enormous market value, and hence, all the ethical issues having to do with property arose. Should software be owned? If so, how? Would current intellectual property law provide adequate protection? Along with property rights issues came issues of liability and responsibility. In the marketplace, if consumers buy and use computers and software, they want to be able to rely on these tools and when something goes wrong, they want to know who to blame or they want to be compensated for their losses.

During this period, the market in computer games took off and it was also during this period that more attention began to focus on hackers. On the one hand, hackers were responding to the commercialization of computing. They did not like the idea of property rights in software. At the same time, those who were acquiring property rights and/or making a business of computing saw the threat posed by hackers, a threat to property rights and to system security.

In the 1990s, attention turned to the Internet. The coming together of computers, telecommunications, and media was the next major development in the technology. The development and expanded use of the Internet brought a seemingly endless set of ethical issues as the Internet came to be used in so many different ways in so many different domains of life. In effect, we are now in a process of transferring and re-creating much of the world into this new medium. At the same time, the Internet also raised all the concerns of the past. Privacy issues are exacerbated on the Internet; the democracy issue came back into play with new claims about the Internet's democratic character; property rights expanded to Web sites and global property rights became ever more important; and so on.

One other technological development that grew slowly during the 1980s and 1990s was the use of computer technology for a wide variety of visualization activities—not just computer graphics and gaming, but simulation activities including medical imaging and scientific models. This development expanded into the idea of virtual reality, an idea that has captivated many. Very quietly and slowly, ethical concerns have been raised about this thrust of computer technology. Unfortunately, I have been able to give only cursory attention to virtual reality issues.

In summary, during the 1960s and 1970s the dominant uses of the technology were for database creation and large-scale calculations. These uses of the technology brought correlated expressions of concern about centralization of power and big government, and threats to personal privacy. During this time, the very idea of computers seemed to threaten the idea of what it means to be human. During the 1980s, microcomputers were developed and made readily available. Remote access to large mainframe computers also became possible. Quietly, the system of telecommunication lines linking computers, that later became the Internet, was expanding and being made available beyond the "inner circle" of developers. Also, the computer/video game industry began to take off. With these developments came correlative concerns about property rights, liability issues, and the threat posed by hackers. In the 1990s, the coming together of telecommunications and computers reached a pinnacle of development and the Internet and the World Wide Web (Web) became widely available. These technological developments are still being assimilated, but they gave rise to a seemingly endless array of ethical issues as well as exacerbating those that were already there.

This is a story of computer ethical issues following technological developments. The question remains whether this pattern is as it should be. As I suggested before, reversing the order would seem to have some advantages, though scholars in the field of computer ethics do not seem to recognize the possibility of leading rather than following the technology. A central focus on the topic of design of computer technology would go a long way toward reversing this pattern. If the designers of technology were to think about the ethical and social implications of their designs before they became a reality, wouldn't the world be a better place!

CHANGES IN THE THIRD EDITION

Readers who are familiar with earlier editions of Computer Ethics will note that in this edition I have added two chapters specifically focused on the Internet, Chapter 4 "Ethics Online" and Chapter 8 "Social Implications and Social Values." The addition of this new material led to other changes in the organization of the book. First, instead of having a separate chapter on crime, abuse, and hacker ethics, I have situated the discussion of hackers and hacker ethics in the first chapter on the Internet, Chapter 4. This placement recognizes that hacking is a phenomenon made possible by the combination of computers and telecommunications lines that we now call the Internet. In 1994 when the second edition was published, the Internet had already been created, but it was far from clear that it would become what it has. Second, instead of having one chapter on the social implications of computer and information technology and another on the social implications of the Internet, I have combined material on the social implications of computer technology from the second edition with new material on the Internet. While I discuss both, the primary focus of Chapter 8 is on the social implications of the Internet and especially its social implications for democracy. I found this approach useful for focusing discussion of the relationships between technology and social change and between values and technology.

As with previous editions, there are many possible paths a reader might take through the book. The topics from chapter to chapter are interconnected, but each chapter has been written to stand essentially alone. When used as a textbook, the path students take through the book should be determined by the type of students being taught, the length of the course, and other books and materials being used in the course. For example, when teaching a class of computer science majors, it is important that the chapter on professional ethics be read early on. This sets students up to think of the issues as part of their professional responsibility. When teaching nonmajors, this chapter can comfortably be read at the end of a course, and can be presented as a way of thinking about how some of the issues discussed in the book might be addressed—control who becomes a computer professional and give computer professionals more responsibility for the effects of their work.

As in the previous editions, I have started each chapter with a set of short scenarios. The scenarios are intended to entice the reader into the topic, to implicitly make the case for the importance of the topic, and to make the topic concrete for those who are impatient with theory. The cases also provide the content for teaching skills in ethical analysis. As before, I have provided study questions and suggested further reading at the end of each chapter.

OVERVIEW

Chapter 1: Introduction: Why Computer Ethics?
In Chapter 1, I make the case for the importance of computer ethics and I explore why computer and information technology raises ethical questions when many other technologies do not. Building on Moor's idea that the task of computer ethics is to fill policy vacuums, I describe generally how computer and information technology gives rise to ethical issues. I push further, addressing how these issues can be resolved, and explore the traditionalist account, which holds that we can extend ordinary moral principles to situations created by computer technology. This discussion prepares the way for asking in what ways computer ethical issues are unique and in what ways not. As in the last edition, I argue that it is useful to think of the ethical issues surrounding computer and information technology as new species of generic moral issues. I support this idea by arguing that while ethics is always about human action, technology instruments human action and makes it possible for individuals and institutions to behave in ways they couldn't behave without it. Traditional ethics and ethical theories have largely ignored the instrumentation of human action. Computer ethics brings this unexplored area of ethics into focus. I conclude this chapter with a brief discussion of the virtues and dangers of using analogies in analyzing computer ethical issues.

Chapter 2: Philosophical Ethics
This chapter is largely as it was in the second edition, though I have added brief descriptions of virtue ethics and John Rawls' theory of justice. As before, the aim of this chapter is to show that ethics is not just a subjective and relativistic enterprise: ethics and ethical analysis involve giving reasons and making arguments for one's claims and subjecting those reasons and arguments to critical evaluation. By reviewing traditional ethical theories, the chapter provides readers with a useful vocabulary and conceptual tools for thinking through ethical issues. The chapter is not intended to provide the theoretical framework from which answers to all ethical questions can be deduced. Rather, the aim is to suggest that ethical analysis is a dialectic process.

Chapter 3: Professional Ethics
The organization of this chapter and the ideas explored are fundamentally the same as in the last edition of Computer Ethics though I have tried to clarify the ideas further, and I have updated the chapter by addressing the issue of licensing of software engineers. I have also recognized recent changes to the ACM code of ethics. The chapter begins with a discussion of how becoming a member of a profession can lead to somewhat special moral rights and responsibilities. That analysis sets the scene for defining profession and professional, and for asking whether computing is a profession. This is followed by a brief discussion of software engineering licensing. The focus of the chapter then turns to the responsibilities of computer professionals, to employers, to clients, to the public, and to co-professionals, and how they come into conflict. The chapter ends with a brief discussion of professional codes of ethics.

Chapter 4: Ethics and the Internet: Ethics Online
I begin this chapter by identifying what is morally significant and distinct about the Internet. Focusing on the Internet as a medium of communication, what seems morally significant is the many-to-many global scope of the Internet, the availability of a certain kind of anonymity, and the reproducibility of the medium. After drawing out the implications of these features, emphasizing the difficulties of accountability and trust, I move on to discuss hacking and hacker ethics. Here I have included some of the material from the previous edition. I conclude the chapter with a discussion of the problems the Internet seems to pose for controlling socially undesirable behavior and for encouraging civil behavior.

Chapter 5: Privacy
Chapter 5 is a combination of old and new material. As in the previous edition, I begin by asking how computer and information technology has changed the collection and distribution of personal information. I describe the traditional way in which the privacy issue has been framed—as necessarily involving a trade-off between individual interests in controlling information and the efficiency and improved decision making of those who can make use of the information. I argue for reframing the issue in a way that recognizes personal privacy not just as an individual good but as a social good, and I try to make clear the importance of privacy for democracy. I conclude the chapter by discussing a variety of possible approaches to improving the protection of personal privacy.

Chapter 6: Property
In the years since I wrote the first edition of Computer Ethics, the property rights issues have gotten more and more complicated. While there is dissatisfaction with current law, in fact, the law has not changed fundamentally. I have come to the conclusion that the most useful approach for an introductory text of this kind is to stay with the fundamentals. Thus, this chapter is very similar to the property rights chapter in the second edition. I begin by describing the problem that ownership of software poses. I describe copyright, trade secrecy, and patent law and the inadequacies each has for protecting computer software. Digging deeper into the problem, I explore the philosophical basis for property rights, looking first at the natural rights arguments and then at the utilitarian arguments for and against ownership. I conclude with an argument similar to the one I made in the second edition: making an illegal copy of proprietary software is immoral because it is illegal, whereas other kinds of acts would be immoral even if they were legal. I conclude with a brief discussion of how the Internet is likely to exacerbate property rights issues.

Chapter 7: Accountability
Chapter 7 begins with scenarios that pose a wide range of accountability issues: responsibility for rape in a virtual reality game, accountability when software recommends a decision, liability of Internet Service Providers, and responsibility for the Y2K problem. A discussion of the different meanings and uses of terms such as responsibility, accountability, liability, and blame lays the groundwork for the chapter. The focus then turns to the legal environment for the buying and selling of computer and information technology, where the distinction between selling a product and providing a service is pivotal. The remainder of the chapter is devoted to issues that are unique to computer and information technology, especially the diffusion of accountability, the Y2K problem, and Internet issues.

Chapter 8: Ethics and the Internet II: Social Implications and Social Values
Chapter 8 is the second chapter devoted to the Internet. Drawing on material from the second edition, I begin this chapter with a general discussion of technology and social change and identify the pitfalls in asking questions such as, Is computer and information technology causing a social revolution? Is it changing things or reinforcing the status quo? Is technology good or bad? Attention is then focused on values embedded in computer and information technology. I emphasize how value-laden technology is. The chapter turns, then, to the Internet and democracy. I examine the arguments that are made to show that the Internet is a democratic technology and I critique these arguments.

Computer and information technology and especially the Internet have been implicated in the widening gap between haves and have-nots within countries and among countries of the world. After examining this issue which has come to be known as "the digital divide," I briefly discuss the gender gap in computing, and then I turn briefly to the value of freedom of expression. I conclude this chapter by pointing to three issues that will be particularly important to watch in the future: jurisdiction, systems of trust, and insularity.

Practical Ethics
Finally it may be helpful to explain what I have and have not tried to do with regard to ethical theory. I have not aimed to provide "the" ethical theory from which all ethical beliefs should be derived. Nor have I tried to provide the only possible or adequate ethical analysis of any particular issue. There are both pedagogical and theoretical reasons for taking this stance.

Pedagogically, I believe it is essential for students and other readers to struggle with the cases, the issues, and the relevant moral concepts and theories. This is crucial to developing skill at ethical analysis and critical thinking, and to developing moral personhood. Hence, it would be somewhat counterproductive to present (what I claim to be) "the" final answers to all the ethical questions raised in this book. Instead I have left a good deal of room for further struggle with the issues. I have tried to present ethics as an ongoing, dialectic enterprise. My analyses are intended to move the discussion forward but not to end it.

Moreover, this book is an undertaking in practical ethics, and practical ethics is the middle ground where abstract ethical theories and concepts meet real-world problems and decisions. It takes an enormous amount of work to understand what theories mean for real-world situations, issues, and decisions, and in some sense, we don't understand theories until we understand what they imply about real-world situations. Practical ethics is best understood as the domain in which there is negotiation between theory and real-world situations. We draw on moral concepts and theories but we must interpret them and draw out their implications for the issues at hand. In practical ethics, we work both ways, from theory to context and from context to theory. Often a theory or several theories provide illumination on a practical matter; other times, struggle with the practical problem leads to new insight into a theory.

TERMINOLOGY

As in earlier editions, I use ethics and morality interchangeably. Many philosophers draw a distinction between the two, but I find such distinctions too far from conventional usage to be of help.

Since the publication of the previous edition, linguistic conventions around computer technology have changed. Computers have come to be understood as more than simply computational—computing—machines. Moreover, there are a wide variety of computer-related tools for which it may or may not be appropriate to use the term computer. It was difficult to decide what terminology to adopt to refer to the technology on which this book focuses—information technology (IT), communication technology, information and communication technology (ICT), and so on. Indeed, it was tempting to change the name of the book to something like Information Ethics or IT Ethics, but, in the end, I have concluded that it is best to keep the name of the book the same, if for no other reason than that it is a known product. Wherever possible I use the (cumbersome) phrase computer and information technology rather than just computer technology. This more closely reflects current linguistic practice and understanding of the technology. In fact, the focus of the book may more accurately be described as a focus on a family of technologies that deal with a very wide variety of types of information (signals, data, images, words, etc.).

ACKNOWLEDGMENTS

A number of people helped me with this edition by reading chapters and providing comments and suggestions. I particularly want to thank Keith Miller, Fran Grodzinsky, Roberta Berry, Don Gotterbarn, Andy Ward, and Amy Bruckman for their feedback on various chapters. Marc Pang Quek was a wonderful graduate research assistant and wrote several scenarios based on real cases.

I owe much to many others who have helped me in a wide variety of ways. Scholarship is a collective enterprise. I have learned from reading the ideas of others, from listening and talking to students and colleagues. I am always encouraged by those who are willing to tell me what they like and don't like about my writing.

I will always be grateful to my colleagues in the Department of Science and Technology Studies at Rensselaer Polytechnic Institute. My years there shaped my thinking and my career in profound and permanent ways. In 1998, I moved to the School of Public Policy at Georgia Institute of Technology and I have been delighted to find another robust and lively interdisciplinary environment. My new colleagues and students in public policy and in other units of the Ivan Allen College at Georgia Tech have already enriched my thinking.

In writing this edition, I drew on several of my previously published articles: "Ethics Online," published in Communications of the ACM; a chapter entitled "Democratic Values" in Duncan Langford's book Ethics and the Internet; and "Is the GII a Democratic Technology?," presented at several conferences and then published in Computers and Society. I developed Chapter 1 from an unpublished paper that I wrote for the Tangled Web conference.

As I was completing a draft of this edition, Keith Miller, Laurie King, Tracy Camp, and I received a grant from the National Science Foundation's Program on Course, Curriculum and Laboratory Improvement, to develop materials and hold workshops focused on using the Web to teach computer ethics. The first workshop took place in June of 2000. During the workshop I received extremely valuable feedback on a draft of the book. Materials developed as part of the grant are available at: www.uis.edu/~miller/dolce/ and may be useful to teachers who use this book in their courses.

DEBORAH G. JOHNSON
