Eve of Destruction: The Coming Age of Preventive War

by Thomas M. Nichols
Overview

In an age of new threats to international security, the old rules of war are rapidly being discarded. The great powers are moving toward norms less restrictive of intervention, preemption, and preventive war. This evolution is taking place not only in the United States but also in many of the world's most powerful nations, including Russia, France, and Japan, among others. As centuries of tradition and law are overturned, will preventive warfare push the world into chaos?

Eve of Destruction is a provocative contribution to a growing international debate over the acceptance of preventive military action. In the first work to identify the trends that have led to a coming age of preventive war, Thomas M. Nichols uses historical analysis as well as interviews with military officials from around the world to trace the anticipatory use of force from the early 1990s—when the international community responded to a string of humanitarian crises in Somalia, Bosnia, and Kosovo—to current and potential actions against rogue states and terrorists. He makes a case for a bold reform of U.S. foreign policy, and of the United Nations Security Council itself, in order to avert outright anarchy.


Product Details

ISBN-13: 9780812202946
Publisher: University of Pennsylvania Press, Inc.
Publication date: 04/19/2013
Sold by: Barnes & Noble
Format: eBook
Pages: 192
File size: 339 KB

About the Author

Thomas M. Nichols is Professor of Strategy and Forrest Sherman Chair of Public Diplomacy at the United States Naval War College. His previous books include The Russian Presidency and Winning the World: Lessons for America's Future from the Cold War.

Read an Excerpt

Chapter One
A New Age of Prevention

All Members shall refrain in their international relations from the threat or use of force against the territorial integrity or political independence of any state, or in any other manner inconsistent with the Purposes of the United Nations.
—United Nations Charter, Article 2(4)

I don't feel I have to wipe everybody out, Tom. Just my enemies.
—Michael Corleone, The Godfather Part II

A new age of prevention?

The subject of preventive war is a difficult one, not least because it stirs a basic emotion in most people that it is simply wrong. Traditionally, the idea of using force based on a potential rather than actual threat has been viewed in the international community as morally offensive, akin to punishing an innocent person for a crime they might commit but have not. Discussing it in any but the most critical way seems almost to justify it, as it smacks of the gangster's lead-pipe approach to solving disputes and curbing the rise of rival dons. But it is a subject that must be explored as we enter a new age of violence and warfare. We live now in a world where many countries openly ponder whether preventive violence would serve their interests, with some (the United States, Russia, and France among them) flatly defending the right to resort to such measures. It now seems that the norms of the 20th century are no longer going to govern the states of the 21st, and it is time to consider the meaning of that change and what might be done in its wake.

Actually, preventive war is not all that new a problem. Striking at potential foes before they can pose a greater threat is a temptation as old as human conflict itself, even if the idea of doing harm to others based on speculation about their motives was rarely considered either prudent or just. (The great Prussian statesman Otto von Bismarck famously referred to preventive war as "committing suicide for fear of death.") However, it was not until the nineteenth century that the commonsense presumption against preventive violence took on the moral force of an international norm that could govern relations between states, due to a largely, and understandably, forgotten incident in American history.

In 1837, British militia in Canada destroyed an American merchant ship, the Caroline, that had been aiding anti-British rebels across the Canadian border. The ship was burned and tossed over Niagara Falls; one American was killed during the raid. The consequent dispute between Washington and London produced not only a British apology, but a more formal understanding of the limits of violence in international affairs. As U.S. Secretary of State Daniel Webster would put it some years later, henceforth the resort to violence in self-defense would be judged by whether it was motivated by a necessity that was "instant, overwhelming, and leaving no choice of means, and no moment for deliberation." This famous formulation, the yardstick by which the legitimacy of military action would be measured, became known in international legal usage as "the Caroline test."

It is arguable whether Webster's reasoning had much influence over how states acted—and it surely did little to undermine the culture of preventive thinking that helped to fuel World War I—but as late as the Nuremberg tribunals of the 1940s, "the Caroline test" was reaffirmed by international jurists, almost word for word, as the standard by which states should judge the actions of others. Thus the destruction of an insignificant ship in what one scholar has called a "comic opera affair" in the nineteenth century nonetheless led to the establishment of a principle of international life that would govern, at least in theory, the use of force for over 150 years, an era in which military action would have to be justified in the clearest terms as an act of legitimate self-protection against an opponent whose own actions foreclosed any other options.

Whether we like it or not, that era is now drawing to a close.

It is important to note at the outset that the first part of this book does not make a normative argument about the desirability or morality of the coming of an age of preventive violence. Rather, the intention here is to show that the international system, for better or worse, is already moving toward a more permissive norm regarding prevention; while partisans argue both for and against this more permissive norm, those arguments are rapidly being overtaken by events. At this point, the debate must move past the question of whether this norm is being breached, or even whether it should be, and instead explore the implications of the reality that it is already collapsing, and consider what might be done to maintain any possibility of international order in a new era of prevention.

Although it is tempting to trace these alarming changes in international norms to one or two incidents—the terrorist attacks of September 2001 chief among them—they are actually the result of the cumulative and corrosive effects of a series of frightening, even sickening, events that have been inexorably altering the way the international community thinks about security over the past two decades. Since the Cold War's end, and particularly in the first few years of the 21st century, the world has seen a parade of atrocities: in New York and Washington, of course, murder and destruction on a previously unthinkable scale; in London and Madrid, bombings of public transport; in the Middle East, beheadings broadcast on the Internet; in Russia, mass hostage-takings in a hospital, a theater, and even an elementary school (which resulted in the deaths of scores of Russian children). But these outrages did not happen in isolation; they followed a period immediately after the Cold War that saw grotesque campaigns of rape, ethnic cleansing, and even genocide in both Europe and Africa. The nuclear clock, once slowed by the Cold War's end, has been set ticking again by the steady march of the North Korean nuclear program, as well as by the clear intention of Iran's mullahs, and perhaps others, to become members of the nuclear club.

The result of all this is that the peoples and leaders of many nations seem to have reached the limit of their ability to tolerate risk in a world of bewildering and terrifying new threats. Political scientist (and later, U.S. State Department planning chief) Stephen Krasner wrote in 2005 that if a series of nuclear terrorist attacks were to strike three or four cities concurrently in the developed world, "conventional rules of sovereignty would be abandoned overnight," and preventive strikes, including "full-scale preventive wars" without even the pretense of United Nations approval, would become accepted practices. The flaw in Krasner's prediction lies only in his timing: much of what he foresees is already taking place. It has not taken a string of nuclear explosions to push many societies to the end of their patience with states and groups that seem to observe no law, custom, or basic human decency, as well as with international institutions that appear impotent at best and obstructionist at worst. If these frustrations deepen, it may well turn out that in the future, international order will be secured not by laws or institutions or even by "coalitions of the willing," but rather, in the words of a British general, by "coalitions of the exasperated."

If this comes to pass, the international system will return to a condition of anarchy and bloodshed not seen since the collapse of the League of Nations. This danger, its origins, and what might be done about it, are the subjects of this book.

Defining prevention and preemption

The most maddening problem with debates about preventive violence is that they are too often muddled by imprecise language, and particularly confusion between "prevention" and "preemption." This lack of clarity is often intentional. Everyone would prefer to strike "preemptively," while no one wants to be accused of striking "preventively." This is because preemption has the noble connotation of striking before being struck, a nimble response by bold and vigilant leaders in the face of an obvious danger. Prevention, on the other hand, has the more sinister connotation of Machiavellian plotting, of plans drawn in secret against enemies real or, more likely, imagined. There are elements of truth in both of these caricatures, but before proceeding further it is important to clarify just what "prevention" and "preemption" actually mean—or at least used to mean.

Preemption is the easier concept to define, since traditionally it requires only concrete evidence of an immediate attack. This could include any number of actions, including the massing of troops, the forming of supply lines, increasingly hostile activity such as reconnaissance across borders, and similar moves indicating impending military action within a very short time, perhaps only days or even hours. Reacting against these preparations in a kind of spoiling attack is still recognized in both law and tradition as a legitimate act of self-defense, much as domestic law would accept the right of an individual to violently stop another person who was about to do harm. As British scholar Lawrence Freedman has put it, "preemption is a . . . desperate strategy employed in the heat of the crisis." It is acceptable because there is no other choice: strike or be struck.

In recent history, the 1967 war between Israel and its neighbors is most often cited as a well-defined case of legitimate preemption, in which the Israelis launched attacks against an Egyptian-led coalition that was in the process of massing for an invasion. Indeed, the Israeli action was an instance of "anticipatory self-defense" so obvious that, as international legal expert William O'Brien wrote many years ago, it risks being "an extreme case that may seldom be approximated. Not many threatened states would have both the vulnerability of Israel and the high degree of certainty of an imminent attack threatening national existence." Nonetheless, the immediacy of enemy action and the mortal level of threat in the 1967 case continue to define what many scholars would call the "classic" case of legitimate preemption.

Prevention is a far stickier matter, not least because the attacking power has every incentive to claim a more imminent threat than may actually exist, and thus in turn claim the greater legitimacy of preemption. But prevention in its simplest terms means destroying threats before they can actually coalesce, despite the absence of any direct evidence of immediate danger. Another way to think of prevention is to call it discretionary violence, in that it is violence undertaken by choice and with deliberation, proactive rather than reactive. (Critics might argue that there is a better phrase to characterize preventive attack: "unprovoked aggression.") This is not self-defense necessitated by an evident threat, but rather violence driven by the desire to eliminate even the possibility of a future menace. It is sometimes a choice based on "now-or-never" calculations, in which the attacker comes to believe that forgoing military action in the near term means risking that the target will become too strong to deal with in the future.

A good example can be found in the first great history of war, Thucydides' account of the tragedy of the Peloponnesian War in the 5th century BC. The Spartan decision to go to war with Athens, although driven by many disputes that had arisen between the two city-states over the years, was nothing less than a preventive war, in which the Spartans decided to destroy the Athenians before Athens became so great a foe that the opportunity would slip away. Thucydides describes the Spartan reasoning in the unmistakable language of preventive war:

The power of the Athenians had advanced so unmistakably and [Sparta's] own alliance was so threatened, that they decided it could no longer be tolerated. They resolved that every effort was to be made and Athenian strength was, if possible, to be destroyed by the undertaking of this war.

The actual casus belli that provided the immediate spark to twenty-seven years of war was a confrontation between Athens and a Spartan ally over a dispute with a third state in a far-off corner of Greece that did not remotely threaten Sparta itself. But despite the reality that there was no direct threat from Athens, the Spartans felt the balance of power turning so rapidly against them that they chose war when they could still fight one—they hoped—on their own terms.

More recent examples can be found during World War II. The Japanese attack on the United States was a preventive war, designed to disable American naval power long enough for Japan to complete and consolidate its empire in the Pacific. The Japanese did not contemplate a long war with the United States, and some Japanese leaders even assumed that the Americans, once they overcame their initial anger, would decline to fight any further. Likewise, Nazi Germany's invasion of neutral Norway in 1940, out of fears that the British would get there first, was later condemned by international jurists as a preventive attack against a target that did not present an imminent threat to Germany.

By comparison, had Soviet forces decided to strike at the huge Nazi invasion force massed on the Soviet border in June 1941, or had the Americans struck the Japanese fleet en route to Pearl Harbor, these would have been preemptive, rather than preventive, actions. The Japanese, and particularly the Nazis, were almost recklessly cavalier about making clear their intentions to attack and were in the process of mobilizing their forces to do so.

Notions of preventive war did not end with the fall of the Axis. After the war, when the Soviet Union finally developed nuclear weapons, there was a short-lived debate in the United States over whether to wage preventive war against Moscow while the nuclear balance was still in the West's favor. This was not some crackpot idea hatched by paranoid extremists on the anticommunist fringe; its advocates included William Laurence, the top science writer at The New York Times, as well as such noted intellectuals as Bertrand Russell, physicist Leo Szilard, and renowned mathematician John von Neumann, who remarked in 1950: "If you say why not bomb them tomorrow, I say why not today? If you say today at 5 o'clock, I say why not one o'clock?" Some within the U.S. Air Force, in particular, were drawn to a preventive solution to the Soviet dilemma: "We're at war, damn it," said General Orvil Anderson, commandant of the U.S. Air War College in 1950. "Give me the order to do it, and I can break up Russia's five A-bomb nests in a week. And when I went up to Christ—I think I could explain to Him that I had saved civilization." Anderson was not alone in his views, but President Harry Truman dismissed him from his post in any case for his public advocacy of the idea.

In the end, the idea of preventive attacks on the Soviet nuclear program was rejected both on the practical grounds of avoiding a major war with the USSR, and because initiating an unprovoked war was culturally unacceptable to American leaders. Secretary of the Navy Francis Matthews, a strong advocate of prevention, inadvertently recognized this problem in 1950 when he said that such a war would make America "the first aggressors for peace," but that America had no choice but to embrace that responsibility. (This time Truman chastised, but did not fire, his wayward preventionist.) This aversion to striking first was evident in Robert Kennedy's objection to bombing Soviet missiles in Cuba in 1962, which he likened to a second Pearl Harbor. And yet the issue arose again little more than a year later, when communist China stood on the threshold of gaining the bomb. The expression used when President John Kennedy's advisors were debating whether to destroy communist China's nascent nuclear program is a perfectly evocative description of preventive war itself: the decision to engage in preventive violence is a decision to "strangle the baby in the cradle."

But what, really, is the difference between the acceptable concept of preemption and the previously prohibited policy of prevention? The temptation to smudge the difference between them is understandable, largely because it is difficult to tell exactly when a possible threat to national security has germinated long enough to become a real threat. Along these lines, one U.S. Air Force officer has suggested differentiating between preventive "war" and a preventive "strike," which is a "short-duration military action designed to remove an enemy's capability before it can be used against us," but this, especially from the enemy's view, is probably a distinction without a difference.

A related problem is that prevention can, in a way, be a victim of its own success. Preventive war, as foreign policy thinker Michael Mandelbaum has pointed out, "has a self-canceling quality to it. If it is successful it removes the threat that, were it to grow to menacing proportions, would clearly justify military action. It removes, in effect, the evidence that would convince people of the wisdom of waging war."

Traditionally, the difference between preemption and prevention was found in calculations about timing. Preemption responds to "imminent" threats, while prevention strikes at notional or nascent threats. But the word "imminent" itself presents a thorny definitional problem. While the concept of "imminence" has long been central to the distinction between prevention and preemption, it has also always been mired in debate about when the test of imminence has been met, and in the end the issue often comes down to a matter of subjective human perception. Like Justice Potter Stewart's well-known comment about pornography, imminence is hard to define, but most people think they know it when they see it. At the very least, harking back to Webster and the Caroline, imminence seems associated with choice: a threat is imminent when the target has no means of averting it other than to attack. There is no time for diplomacy or negotiation, no ability to appeal to third parties for intercession with the aggressor, no opportunity to neutralize the threat through nonviolent measures. In this way, the condition of "imminence" legitimizes preemption and criminalizes prevention by grounding the use of violence in the classical requirement of just war—which is the basis for much of international law and custom on the subject—that war should be a last resort.

Unfortunately, "imminence" has never been a particularly well-defined guide to action. Even now there is a significant debate about whether the term, insofar as it was ever understood in the first place, has lost its relevance in an era in which threats may come from terrorists and rogue states who may not be willing to do us the service of signaling their intentions with the obvious movements of regular, uniformed military units. This anxiety about the increasing difficulty in identifying threats, a problem magnified both by new technologies and new and unpredictable political actors on the international scene, has in turn created growing confusion and disagreement about the legitimate use of force.

From Melos to Baghdad . . . and beyond

The accelerating erosion of norms against preventive or discretionary violence is especially striking given how strongly the international community only recently professed its adherence to them. Of course, during the Cold War, as historian Melvyn Leffler has pointed out, "preventive action in the Third World was standard [American] operating procedure," an observation easily applicable to the Soviets as well. Still, the presumption against intervention was so powerful that the two strongest nations on earth felt the need to show their respect for it in principle even in those moments when they felt the need to disregard it in practice: both the Soviets and the Americans dressed their actions in veils of legitimacy regarding "fraternal assistance" and "self-defense" when in fact what they were primarily doing was imposing ideological order in their spheres of control.

But even as they crushed rebellions or tamed unruly allies, they never sank—at least publicly—to the moral poverty of the ancient Athenians at the island of Melos during the Peloponnesian War. The Melians were supposedly neutral in the grinding, endless conflict between Athens and Sparta, but the Athenians decided to seize the island anyway as a precaution against any possible support for their enemy. In an infamous declaration, they told the besieged Melians that there was no need to trifle with arguments about justice or right, and that Melos must submit to Athenian rule because it was the nature of things that "the strong do what they can and the weak suffer what they must." By contrast, actions that could just as easily have been dictated by Moscow or Washington were clothed in legal language that ironically honored the strength of the norm against discretionary uses of force even as it was being violated.

The idea that the world is shifting away from these Cold War norms toward a greater acceptance of discretionary uses of force may seem an odd claim given the amount of international fury directed at U.S. President George W. Bush's policies, which are fundamentally preventive in nature, regardless of efforts by the administration to portray them otherwise. The so-called "Bush Doctrine," enunciated in the 2002 and 2006 editions of the National Security Strategy of the United States of America, despite its use of the term "preemption," describes a strategy of prevention with such unapologetic candor that some critics have derided it as little more than a barely veiled justification for the creation of an American empire in which any state or actor resisting American hegemony would suffer Washington's wrath. The 2003 American-led invasion of Iraq—the Bush Doctrine in action—served to confirm the worst fears of the administration's critics at home and abroad, as American forces rolled into Baghdad after the expiration of the President's ultimatum that the Iraqi regime, in effect, either surrender or be destroyed. The Americans, it seemed, had arrived at Melos . . . via Baghdad.

President Bush and his advisors have at times claimed that the United States was fighting a preemptive action in Iraq against an imminent threat, a claim that not even the most generous interpretation of those words can support. Even the president himself on the eve of the war rationalized the invasion in preventive rather than preemptive terms, warning that the risks of inaction were too great and that in "one year, or five years, the power of Iraq to inflict harm on all free nations would be multiplied many times over."

Given the stunning embarrassment of finding that there were no nuclear, biological, or chemical weapons stashed away in Saddam Hussein's palaces, and the consequent miring of American forces in the midst of ongoing sectarian violence in Iraq, it would be logical to expect that the Bush administration's distortion of the concept of preemption would serve to delegitimize both preemption and prevention as viable policy choices. And yet, as two American analysts (both opponents of the Bush Doctrine) noted in 2006, despite the morass in Iraq,

[a] mounting body of evidence suggests that a significant number of states are beginning to embrace the Bush Doctrine's underlying logic of "preemption," which seems a great deal like preventive war, despite their initial hostility to the Bush Doctrine and continuing widespread opposition to the [2003] Iraq war.

This, they find, represents a trend among "numerous states and international organizations . . . to revise long-held international understandings about when force might be used" that is nothing less than "startling."

This is a puzzle that needs explaining. Are other states seizing on the American example out of opportunism, or even just self-defense? This is a central accusation of critics who have charged that for many reasons, U.S. policies will "invite imitation and emulation, and get it."

To claim, however, that the Americans—or others, for that matter—are leading a change in international norms risks confusing cause and effect. Analyses that trace these developments to U.S. policies after 2001 cannot explain a striking evolution in beliefs about the use of force on the part of several important actors in the international community over the past twenty years, changes that have been spurred more by the rapid decay of international order since the end of the Cold War than by any single state or actor.

First, traditional injunctions against interference in the internal affairs of sovereign nations have been relaxed, a change which has its origins in the humanitarian disasters of the late 20th century and which long predates the current debate over preventive war. The very concept of the "state system" itself, in which states were assumed to be unitary and rational actors, has been challenged by the emergence of so-called failed states, zones of chaos where international norms, to say nothing of international law, hold little or no sway.

Second, there has been a steep erosion of faith among both ordinary citizens and political elites in the West in the concept of deterrence. This is a problem rooted in an increasing unwillingness to trust the rationality of terrorists, rogue states, or reckless actors who seem to care little for their own safety—and who may even venerate suicide.

Finally, Western policymakers have been haunted by a stark fear that reckless or suicidal actors could gain weapons of mass destruction. The concurrent spread of ballistic missile technology is a particularly deadly part of the proliferation equation. It is instructive to recall that these are no longer exotic technologies: the atomic bomb is now over sixty years old, and the first launch of a satellite took place a half century ago. The ability of small actors to inflict immense damage on even the greatest of the great powers—and should they be armed with ballistic missiles, to inflict it unavoidably and immediately—has never been greater and has no parallel in modern history.

The September 2001 terrorist attacks against the United States added greater, even frantic, urgency to these fears. With 9/11, mass suicide terror went from being only a notional concern to a fully realized threat, and the now-demonstrated potential of mass suicide terrorism served as a powerful spur to further thinking about preventive violence. But the preconditions for major changes in the international system were in place long before the first plane ever struck the Twin Towers. As John Gaddis has written, al-Qaeda's attack on the United States was only the final blow to traditional understandings and norms regarding the use of force: "The old distinction between preemption and prevention . . . was one of the many casualties of September 11."

Traditional norms prohibiting preventive or discretionary uses of violence are largely gone, or at best are in their last years, and it is unlikely they will be restored for decades to come, if ever. Lamenting their passing is understandable but unproductive; the more pressing business now is to consider how we got here and what to do about it.

Overview

The advent of a new age of preventive violence actually required the collapse of not one, but two previous norms: one was the presumption, of course, against preventive war itself, but the other is the notion logically prior to that presumption, the inviolability of state sovereignty. The overturning of this latter norm, perhaps one of the most important, if relatively unheralded, changes in international life since the establishment of the Westphalian order over three centuries ago, is discussed in Chapter Two. Chapter Three will consider the changed security environment since the end of the Cold War, and will focus particularly on the question of deterrence—and whether anyone believes in it anymore.

Much of the international debate about how to respond to the threats of the 21st century might have been more muted, and seemed less urgent, had it not been spurred both by the shock of the 9/11 attacks in 2001 and the American unveiling of a new security strategy in 2002. Chapter Four will examine not only the international reaction to the U.S. National Security Strategy, but also how states and their leaders have reacted overall to the changes in norms and perceived threats discussed in the previous sections. If advocates of preventive war were found only on Pennsylvania Avenue or Downing Street, or if schemes of prevention were only a temporary partisan fascination among particular political circles, there would be far less reason to be concerned about the likelihood that longstanding international norms might be unraveling. But the rise of prevention is a global phenomenon; many members of the international community, great and small, are reaching the conclusion that their interests are no longer being served by previous norms regarding war, and this means that the increasing use of preventive violence is not likely to be a limited or transient problem.

It is, of course, one thing to talk about preventive war, and entirely another actually to launch one. Chapter Five will explore the circumstances surrounding the 2003 invasion of Iraq and its subsequent impact on the legitimacy, and wisdom, of preventive violence. It is possible that the death and chaos in Iraq will emerge as the best argument yet for holding on to whatever is left of the nonintervention norm. But it is also possible, even likely, that large-scale actions like Iraq will be undertaken rarely and with great caution, while smaller acts of violence, including covert operations, interdictions, targeted killings and the like—many of which will involve breaching state borders and will technically constitute acts of war—will be more prevalent in this first generation of the new age of prevention.

But how will such violence be governed, judged, or sanctioned . . . if at all? Chapter Six will consider various proposals to this end, and discuss possible courses American foreign policy might take in constructing and leading a new international order.

When the Cold War ended, its exhausted participants turned inward, finally freed from the half-century death grip they had held around each other's throats. The Americans elected a center-left president in 1992 whose campaign staff chanted the famous mantra "It's the economy, stupid," while neo-isolationists on the American right thundered about the need to "put the Denver Boot"—the device used to immobilize the cars of parking ticket scofflaws—on Air Force One and keep the president's plane at home. The defeated Soviet regime disbanded itself, and the Russian people set about the task of rebuilding their shattered economy and society. Both superpowers withdrew their forces and their material support from areas where they had formerly competed (places, as the former Soviet ambassador to the U.S. later wrote, only historians now can name) and left behind all the tensions, animosities, and conflicts that had predated their arrival decades earlier. Claims that had once been made in Washington and Moscow about how utterly critical these outposts were to Soviet or American national security seem almost comical in retrospect, but the situations they left in the wake of their mutual retreat were no laughing matter.

As the leaders in some of these now-unsupervised regions took to the business of massacring their own populations, literally hacking hundreds of thousands of them to pieces, the world looked on in confusion and horror. And it began to dawn on many in the international community that being a "state," with all the attendant rights and protections traditionally associated with that word, might require something more than a flag and a name scribbled on a map. Perhaps more important, an idea began to form that human beings have responsibilities to each other that might be even more important than the legal rights of "governments" hardly worthy of the name. How those responsibilities might be fulfilled was less clear, and remains a contentious issue to this very moment.

We turn now to the story of how these changes in beliefs came about, and how they created the possibility of saving thousands of lives, while also opening the door, for better or worse, to the new age of prevention.

Table of Contents

Preface
Chapter One. A New Age of Prevention
Chapter Two. Humanitarian Intervention, Sovereignty, and Prevention
Chapter Three. The End of Deterrence?
Chapter Four. International Perspectives on Prevention
Chapter Five. After Iraq
Chapter Six. Governing the New Age of Prevention
Afterword. Now What?

Notes
Index
Acknowledgments
