Perspectives In Community-Based Conservation
By David Western, R. Michael Wright, Shirley C. Strum, Charles Zerner
Island Press
Copyright © 1994 Island Press
All rights reserved.
The Background to Community-based Conservation
David Western and R. Michael Wright
The focus of conservation concern and debate has changed throughout history in response to new problems, concerns, and knowledge. One approach, newly emergent, is community-based conservation, or CBC. Community-based conservation arises from within the community—or at least at the community level—rather than internationally or nationally. The irony, of course, is that community-based conservation is hardly new. Communities down the millennia have developed elaborate rituals and practices to limit offtake levels, restrict access to critical resources, and distribute harvests (Croll and Parkin 1992).
Conservation in History
Traditional conservation practices revolved around sustaining food supplies such as fruiting trees or wildlife or protecting cultural symbols, whether totemic animals or religious sites. Conservation, in other words, originated in prehistory as practices that satisfied human needs, not as an altruistic concern for animals and plants. Despite the conservation practices of ancient times, the survival of the wild as early as the Paleolithic had more to do with low human population density, limited technology, and undeveloped or restricted markets than with self-imposed human restraint. When resources ran out, new lands for human habitation were always available.
Moving on in pursuit of fresh resources remained an option during the early Neolithic, even as pastoralism and shifting agriculture emerged. Movement, whether nomadic, transhumant, or wholesale relocation, enabled humans to optimize resource use and sidestep the consequences of overexploitation.
Movement did not entirely obviate the need for conservation or inhibit compassion for other forms of life. Evidence from contemporary traditional societies suggests that a holistic sense of the world was common to most cultures. Many cultures and religions (including the faiths of Hindus, Buddhists, and native Americans) still retain a strong sense of the indivisibility of humanity and nature (Kemf 1993).
Where space was lacking and prey species had evolved in isolation from humans, conservation practices often were ineffective. Evidence from oceanic islands, for example, shows a sharp rise in extinction rates with the arrival of seafaring peoples (Olson 1989). Large-mammal exterminations in the New World during the Pleistocene bear evidence of overkill by early hunters (Martin and Klein 1984). Indeed, traditional conservation practices probably evolved more to maximize and allocate harvests than to conserve supplies (see MALUKU ISLANDS). Moreover, many traditional societies, given modern weapons, overhunt their prey, as discussed in NEOTROPICAL FORESTS. Traditional conservation beliefs, in other words, are not ready-made prescriptions for today's world.
The Rise of Modern Conservation
Populations expanded and grew more sedentary during the Neolithic. Historical evidence points to localized resource depletion and abandonment of agrarian and urban centers as early as 3000 B.C. (Southwick 1976). In classical Greece, Aristotle and Plato wrote almost as persuasively as the twentieth century's Aldo Leopold about landscapes withering under the onslaught of livestock. "What now remains compared with what then existed," Plato noted, "is like the skeleton of a sick man, all the fat and soft earth having been wasted away, and only the bare framework of the land being left" (Rodes and Odell 1992).
By pharaonic times, wildlife was scarce in Lower Egypt. The ruling elite there established the first recorded wildlife reserves in order to assure themselves of quarry on hunting expeditions. A similar devastation of wildlife was repeated across the Middle East, Asia, and Europe as populations grew, settled, and transformed the natural landscape for arable farming, husbandry, and forestry. The same issues arose time and again with each cycle of settlement and resource depletion: Who owns wildlife? Who owns the forest? Who owns the land?
The aristocracy almost invariably won such disputes and denied the peasants who lived on their land or around royal hunting preserves access to wildlife (Thomas 1983). Disputes over forest land and products were particularly contentious, culminating in the rise of forestry practices in eighteenth-century Europe (Nash 1967) and the first forest conservancies, established by the British Raj in India during the mid-nineteenth century (Vedant 1986).
By the 1850s, a new conservation sensibility emerged alongside the romantic movement in Europe and the United States (Nash 1967; Thomas 1983). Humanitarian concerns for the poor, the enslaved, and the disenfranchised soon spilled over into demands for ethical treatment of animals. By 1869, expanding sensibilities led John Stuart Mill to advocate the preservation of species for their own sake, independent of their utility for humans (Thomas 1983).
The rise of a modern conservation consciousness and conscience gathered momentum in the late nineteenth century, as the wilds disappeared and rural communities became urban. Forest reserves, national parks, and hunting laws familiar to twentieth-century conservationists came into being, although nineteenth-century motives were decidedly more political and utilitarian than preservationist. The question of who owned wildlife and who had the right to shoot it, for example, intensified and became closely tied to egalitarianism in the United States and, to a lesser extent, in Europe (Tober 1981). Early national parks mostly were intended to save natural monuments and open space for recreation rather than to preserve vignettes of nature (Runte 1979).
Sustainable use nevertheless was the best way to preserve nature, according to U.S. President Theodore Roosevelt's chief forester, Gifford Pinchot. Pinchot, the self-proclaimed founder of American conservation, advocated efficiency and prudence in the profitable and sustainable use of natural resources. Conservation, in this new doctrine, was "the application of common sense to the common problems for the common good" (Shabecoff 1993). Stripped of its rhetoric, Pinchot's sustainable-use policy signaled President Roosevelt's intention to restrain big businesses' abuse of public lands.
The sustainable-use doctrine also lent legitimacy to efforts to conserve land for the public good. The movement gained an aura of scientific respectability in later years, when mathematical population models were used to calculate maximum sustained yields for natural-resource harvests (Holt and Talbot 1978). But the very pragmatism of Pinchot's wise-use conservation proved abhorrent to the spiritualists and romantics led by preservationist John Muir. The first salvo signaling a deep rift in the conservation movement was about to be fired.
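The maximum-sustained-yield idea mentioned above can be made concrete with the textbook calculation that later gave sustainable use its scientific veneer: under logistic population growth, yield is highest when the population is held at half its carrying capacity. This sketch is illustrative only; the model, parameter values, and function name are standard textbook material, not drawn from the sources the authors cite.

```python
# Sketch of the classic maximum-sustained-yield (MSY) calculation under
# logistic growth: dN/dt = r * N * (1 - N/K). The harvest that holds a
# population steady at size N equals its growth rate at that size.

def sustainable_yield(n, r, k):
    """Steady-state harvest for a logistic population held at size n."""
    return r * n * (1 - n / k)

# Hypothetical parameters: intrinsic growth rate r and carrying capacity K.
r, k = 0.08, 10_000

# Yield peaks at half the carrying capacity, giving MSY = r * K / 4.
n_msy = k / 2
msy = sustainable_yield(n_msy, r, k)
print(n_msy, msy)  # -> 5000.0 200.0
```

The fragility Holt and Talbot (1978) criticize is visible even in this toy model: the peak is computed from r and K, both of which are uncertain in real populations, so a harvest set exactly at the calculated MSY leaves no margin for error.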
The Diversification of Conservation
The standoff first arose over plans to dam and flood Hetch Hetchy Valley within Yosemite National Park to provide water for San Francisco. Roosevelt and Pinchot came down on the side of exploitation and Muir on the side of preservation. The gap between pragmatists and preservationists widened after World War II, when the archdruid of modern preservationism, David Brower, assumed the directorship of Muir's Sierra Club and opposed dams in Dinosaur National Monument and the Grand Canyon (Shabecoff 1993). In later years, the split widened further when the animal rights and deep ecology movements surfaced and began to champion the interests of species and nature on ethical and moral grounds (Nash 1989).
The preservationists had reason to be skeptical. Impressive as early conservation successes had been in the United States, powerful commercial counterforces waged war on the preservationists. These forces were behind the introduction of laws and policies that encouraged, mandated, and often subsidized the private exploitation of public water, land, timber, minerals, and fisheries (Wilkinson 1992). The underlying goals, which foreshadowed similar resource policies elsewhere, were to boost the United States' national economy, encourage settlement, and strengthen international trade. Once the forces of utilization were unleashed, however, they ran on, blind to ecological limits and environmental destruction. In many other cases, society's ability to sustainably manage living resources ranging from wild species in the Peruvian rain forest (see AMAZON) to trochus shells in Indonesia (see MALUKU ISLANDS) also has proved illusory (Talbot 1993).
Preservationists scored victories in 1908, with the introduction of the wildlife refuge system in the United States, and with the establishment of a series of game reserves and parks in Africa at much the same time. In the developing world, conservation by and large became the state's responsibility, both during and after the colonial era.
State policies and legislation both regulating the use of natural resources and protecting nature continued apace, however, throughout the early part of the twentieth century as population and commerce burgeoned. The rationale echoed that of Britain's Indian conservancies and Roosevelt's national forests: commercialism and local interests were said to cause environmental destruction inimical to the state. Using this well-honed argument, governments intervened time and again to secure land and resources in the larger interest of society. State land ownership and conservation became unquestioned norms, whether or not they were called for or worked.
Renewable-resource use and preservation have served the environment well, but neither approach has proved sufficient. Both often have fared badly in the face of population growth, poverty, and commercialism. At one extreme, international forces such as trade and economic incentives undermine conservation efforts. At the other, government indifference and incompetence—often intensified by commercial greed, nepotism, corruption, and local hostility—have swelled the tide of destruction. Finally, both utilization and preservation policies falter wherever land tenure and access rights are ill defined. The problem is most acute in areas where national policies deprive local communities of the right to use the resources on their own land. The resulting us-versus-them rush to harvest is the root of resource depletion.
The weaknesses in Pinchot's and Muir's philosophies raise the question of whether prevailing policies, which isolate the interests of local communities from those of the state, are the only or even the best ways to go about conservation. A countertrend, based on the belief that local participation in decisions and benefits could reduce hostility toward conservation efforts, began to emerge in the late 1960s and 1970s (see AMBOSELI). The resulting first small steps in the direction of community participation in conservation were hastened by several developments.
Prelude to Community-based Conservation
The first development involved mounting threats to the environment in the face of careless technology, consumerism, and the population explosion. Rachel Carson's Silent Spring (1962) and the Ehrlichs' The Population Bomb (1968) alerted the public to these threats. Earth Day 1970 made environment a household word in much of the world, and the surrounding issues later gained political recognition through the United Nations Conference on the Human Environment, held in Stockholm in 1972. Recognition paid off: International conservation conventions mushroomed in the years that followed.
Despite some progress, conservation efforts still revolved around saving high-profile species and habitats. This was to change in the next decade, once the oil crises instilled conservation in Western consciousness and conservationists broadened their horizons to encompass biodiversity and biological processes (IUCN, UNEP, and WWF 1980). Conservation's expanded horizons stretched far beyond parks onto rural lands, where the ultimate threat to biodiversity lay. Just how conservation was to be tackled in rural areas was an issue that remained disturbingly vague, invoking the aspirations of future generations while ignoring the problems of the rural poor (Western 1984).
The second precipitating factor involved grass-roots development. The centrally planned, capital-intensive aid projects begun in the 1950s and based on both altruism and self-interest had done little to alleviate poverty and income disparity in the developing world, despite the grandiose dams, irrigation projects, power stations, roads, and industrial developments that resulted. Integrated rural development (IRD) projects became fashionable but, again, failed with disconcerting regularity. The causes included continued centralization of planning and overly ambitious projects. The grass-roots approach, in contrast, focused on participation and local aspirations (Chambers 1983). Nonetheless, small-scale projects based on resource use did emerge during this period, laying a foundation of experience for community-based conservation.
The grass-roots approach recognized rural communities' dependence on sustainable use of natural resources such as soil, water, grazing land, forest products, and wildlife. This recognition conceded the case long made by the Pinchot school. What had been missing in Pinchot's approach, according to rural sociologists, was a local say and stake in resource use. Free to define their own priorities, local communities, in theory, would develop at their own pace and in their own way. They would learn their own lessons and build up their own skills in everything from health care and education to water management and communal forestry (Uphoff 1985).
Grass-roots development was not an unqualified success. The 1970s oil crisis, in particular, put severe economic strain on developing countries. Recently, however, the grass-roots approach has matured and come to play an ever larger role in development programs around the world (Durning 1989; Hirschmann 1993).
The third precipitating factor involved the human rights and indigenous peoples movements. Both drew attention to disenfranchised rural communities such as the Yanomami in Brazil and the Aboriginals of Australia (Berger 1979; Miller 1993). Internationally, developing countries' claims of North-South inequality led to demands for a new world economic order based on redistribution of wealth. Radical grass-roots organizations promoted populist movements as an alternative to government assistance (Hellinger, Hellinger, and O'Regan 1988). As a result, groups that linked social justice for ethnic minorities with environmental health became increasingly vocal.
Environmentalism and Democracy
The upshot of these convergent developments was a heightened sensibility about the environment and the interests of local people. A shift away from the elitism that had dogged the largely urban and Western preservation movement finally was under way. As much as anything, the shift acknowledged the fact that the fate of most of the earth's biological diversity lay in the hands of poor people in the Third World. Conservation and development no longer were John Muir's irreconcilable forces on either side of the divide. In a startling turnaround from the protectionism of earlier conventions, the theme of the Third World Parks Congress of 1982 was CONSERVATION FOR SUSTAINABLE DEVELOPMENT. The published proceedings drew on a handful of case studies to show how protected areas could contribute to human welfare and increase security in the process (McNeely and Miller 1984). The emphasis was still decidedly on buffering parks, but the move from preservation to multiple use of protected areas was clearly under way.
Excerpted from Natural Connections by David Western, R. Michael Wright, Shirley C. Strum, Charles Zerner. Copyright © 1994 Island Press. Excerpted by permission of ISLAND PRESS.