Publisher: University of California Press
Edition: First Edition
The Invisible Keystone of the Modern World
All animals, including human beings, consume food for energy, and every human acutely recognizes the imperative to eat or perish. This form of energy is not invisible. However similar we may be to other animals in our need for food, humans uniquely acquired fire, which brought light, warmth, and protection from predators. Of equal importance, fire cooked food, and its advantages separated our evolutionary pathway from that of our primate cousins.
Wood fires, combined later with beasts of burden and a little water and wind, powered human society for thousands of years. In the 1500s, the enormous energy from coal began to supplant the earlier sources in England. Later, oil, gas, and uranium joined coal as the big-four primary energy sources or fuels. In the late 1800s, a new form of water power, hydroelectricity, joined the big-four fuels, and these five now supply most energy in the world, outside the unique role occupied by food.
Based on these energy sources, people leaped from the agrarian to the modern, industrial world, and the material benefits of the big-four fuels lie beyond dispute and beyond calculation. Despite the keystone centrality of energy to modern human life, most people think little about it. These forms of energy shrink to invisibility, which makes us vulnerable to the problems they pose. Exploring the pathways to fire, food, and subsequently the big four brings the keystone of modern life into focus.
THE FIRST ENERGY TRANSITION: HOMO EMBRACES FIRE
Evolutionary processes — long before the appearance of primates — established food as the energy foundation for all animals, but humans are different from other animals in their reliance on cooked food. Although many animals, including nonhuman primates, prefer cooked food to raw, only Homo fully mastered the use of fire. Darwin speculated that learning to use fire ranked with language as one of the most important traits determining human evolutionary success. Chimpanzees may be able to understand the behavior of fire and thus avoid wildfires without panic, but they don't regularly make use of it. Only humans fully integrated fire into their normal daily behavior.
The use of fire for warmth, light, protection, and cooking, however, does not lie far in the antiquity of evolution. In 2012, analysis of microscopic remains of plant material, bones, and minerals in a cave in South Africa showed that fire was being used there regularly about one million years ago, and the materials were unlikely to have originated in any way other than regular use of fire by Homo erectus, a species that appeared between 1.9 and 1.5 million years ago. Other firm evidence for fire dates to about 780,000 years ago at Gesher Benot Ya'aqov in Israel, before the evolution of Homo sapiens.
Archaeological evidence persuasively shows that early hominins used fire regularly, and anatomical findings suggest that hominins began to use fire about the time that Homo habilis disappeared and Homo erectus appeared. Significant reductions in the size of teeth and the volume of the gut suggest that habilis possibly, and erectus almost certainly, relied on cooked food. Cooked food is easier to digest, and organisms extract more energy from it than from raw food. In addition, reliance on cooked food requires considerably less time devoted to eating and chewing.
Homo erectus possessed distinct traits consistent with survival by the use of fire in addition to its smaller gut and teeth. This hominin had lost the ability to move about on all four limbs and to climb trees adroitly. It slept on the ground, and to avoid predators it may have used fire for protection as well as warmth and light. The finding of regular use of fire by Homo erectus in South Africa one million years ago supports these inferences.
If Homo erectus, an evolutionary predecessor of Homo sapiens, had mastered fire, then in all likelihood the use of fire was an integral part of human life from before the time that modern humans evolved. Now only Homo sapiens regularly and necessarily uses fire, and no people live without it. If this reasoning is correct, then mastery of fire became "natural," and traits supporting the mastery of fire lie in the human genome. Only Homo, the primate genus that completely embraced fire, colonized the entire globe in ever increasing numbers. The embrace of fire was evolutionarily very successful, and, as some have quipped, perhaps Homo sapiens should be named Homo incendius.
THE SECOND ENERGY TRANSITION: HOMO SAPIENS LEARNS TO FARM
Until about 10,000 years ago, Homo erectus and then Homo sapiens survived and expanded to all continents except Antarctica. Populations grew slowly and sometimes contracted as climates changed. Human life relied on a foundation of food to run bodies and fire to heat, light, cook, and protect against predators. Survival of the species required no further advance in the mastery of energy, but a few scattered settlements built a new energy economy by domesticating plants and animals for agriculture, a change that vastly increased the availability of food and thus of energy. Farming and animal husbandry may have originated with the improvement of climate after the last ice age, and they enabled settled living as opposed to nomadism, hunting, and gathering. Settled living in turn enabled the rise of cities, written languages, social divisions, and vastly faster development of new or more refined materials like ceramics, metal tools, and jewelry.
Anthropologists named this change the Neolithic Transition, but this book uses the term Second Energy Transition. No comparable name demarcates hominins before and after fire, but here that change is called the First Energy Transition. The embrace of fire and agriculture underlay a lifestyle that persisted in nearly all human cultures from about 10,000 years ago to 1600. By that time, some hunter-gatherer cultures still survived on wild food and fire alone, but most people derived most of their food energy from domesticated plants and animals and "extra" energy from wood fires. Some people supplemented food and fire with windmills and waterwheels to harvest small amounts of energy from wind and falling water.
This was the agrarian economy in which most people tilled the soil and a much smaller proportion served as merchants, artisans, scholars, priests, soldiers, government servants, and rulers. Civilizations rose and fell in Asia, Europe, Africa, and the Americas, and these various cultures steadily increased both technical prowess and academic learning. A hallmark of all agrarian economies, however, was that they drew energy supplies solely from the yearly input of solar energy. Photosynthesis made "biomass," which provided food, feed for animals, fiber for clothing, and woody materials for fire, tools, and shelter. Wind and falling water came indirectly from the heat of the sun.
The historian Alfred Crosby named Homo sapiens "children of the sun." They were much more energy-rich than they had been as hunters and gatherers, but their material wealth was constrained by the annual input of solar energy harvested by plants, windmills, and waterwheels. Greater amounts of stable food energy fueled population growth that could not have occurred based on the food supplies available from hunting and gathering.
In the minds of classical economists like Adam Smith, David Ricardo, and Thomas Malthus, the creation of wealth depended on three elements: labor, capital, and land. Land, however, really represented energy, because photosynthesis for food, feed, and fiber depended on the amount of land controlled.
Classical economists, especially Malthus, were highly pessimistic about the improvement of material living conditions above subsistence levels. For Malthus, a small minority, through provident behavior, might aspire to a more comfortable material standard of living, but the vast majority of humanity must live with much less. As Malthus famously said, the geometric potential for population to increase would always in the end outpace the ability of land to provide more food and other goods. If population levels dropped, then the bulk of humanity might temporarily have a richer life, but the proclivity to reproduce would in the end bring population levels back up to the maximum that land could support. At that point, mortality would balance fertility, and inevitably, Malthus argued, most people would lead an impoverished life of bare subsistence.
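Malthus's geometric-versus-arithmetic argument can be sketched numerically. The function and figures below are purely illustrative assumptions, not numbers from the excerpt: a population multiplying by a fixed ratio each generation eventually overtakes a food supply that grows only by a fixed increment, however generous that increment.

```python
# Illustrative sketch of Malthus's argument (hypothetical numbers):
# geometric population growth versus arithmetic food-supply growth.

def generations_until_shortfall(pop0, food0, pop_ratio, food_step):
    """Return the first generation at which population exceeds
    the food supply (measured in 'persons fed')."""
    pop, food = pop0, food0
    gen = 0
    while pop <= food:
        pop *= pop_ratio      # geometric: multiply by a ratio each generation
        food += food_step     # arithmetic: add a constant each generation
        gen += 1
    return gen

# Even if food starts tenfold larger and gains a large fixed step,
# a doubling population overtakes it in a handful of generations.
print(generations_until_shortfall(pop0=100, food0=1000, pop_ratio=2, food_step=500))  # → 6
```

With these assumed numbers, the crossover arrives after only six generations, which is the quantitative core of Malthus's pessimism.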
THE THIRD ENERGY TRANSITION: HOMO SAPIENS CREATES THE MODERN WORLD
People living in "developed" countries think of themselves as "modern," based on democracy, nation-states, individualism, economic systems to organize capital investments for growth, science, industry built with new technology, and the idea of progress. Sometimes modernity distinguishes itself from predecessors with negatives: not feudal, not an absolute monarchy, not agrarian, not rural, and not superstitious. In a modern society, most people live in cities and do not farm, the biggest contrast with agrarian societies.
A modern person's material life has far more "stuff" and "conveniences" than even royalty and the wealthiest premodern societies commanded. What medieval monarch in Europe, for example, could enjoy a hot shower with clean water by turning a valve, a ride to another continent in a comfortable jet, painless surgery to heal an injured joint, and instantaneous communication with his far-flung armies?
Material abundance characterized the "modern world" as much as did the standard components: nation-states, democracy, large business organizations, and scientific enlightenment. A philosopher living in Britain, France, or the United States in 1800 could point to great changes in politics, new scientific knowledge, and new ways of organizing economic activity, all in a nation-state that transcended individual leaders and governments.
Yet the vast majority of people in these three countries remained mostly rural and lived very much like their ancestors of 1,000 or even 6,000 years earlier. They farmed with human and livestock muscle power. If they traveled at all, it was on foot, horseback, or wind-driven ship. Their housing and water supply had changed but little. At night, the world darkened except for the feeble light of candles. They had a few more iron, bronze, or brass tools and ornaments. Maybe their clothes included textiles woven in the newly mechanized mills of Lancashire, but probably they wore homemade clothes. A person from 2000 suddenly launched backward to 1800 would be hard pressed to feel that he or she was still in the modern world, even if democracy, freedom from royal tyranny, and scientific knowledge animated public conversations.
The transition from premodern to modern life, in short, rested heavily on material shifts in living circumstances. Without the huge shifts in material life, most of which occurred after 1800, life in the 2000s would have continued to look amazingly like that of over 200 years ago, which in turn looked not all that different from 8,000 years ago. Mastery of energy sources and technology created the Third Energy Transition with major consequences, but all too often the centrality of energy remains underappreciated and ignored.
The economic historian E. A. Wrigley, in his studies of the English industrial revolution, rectified the oversights about energy. He had a vastly richer set of concepts from the physical and biological sciences on which to draw compared to Smith, Ricardo, and Malthus. After the mid-1800s, the physical concept of energy, defined as the ability to do work, became a fundamental part of science, and scientists could measure it quite precisely in units like joules, kilowatt-hours, and calories.
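The units just mentioned relate to one another by fixed conversion factors. The sketch below uses the standard physical definitions (not material from the book): one kilowatt-hour is 3.6 million joules, and one food Calorie (kilocalorie) is 4,184 joules.

```python
# Standard conversion factors among the energy units named above.
J_PER_KWH = 3_600_000   # 1 kilowatt sustained for 3600 seconds
J_PER_KCAL = 4_184      # 1 food Calorie (kcal), thermochemical definition

def kwh_to_joules(kwh):
    return kwh * J_PER_KWH

def kcal_to_joules(kcal):
    return kcal * J_PER_KCAL

# A 2,000-kcal daily diet expressed in the other units:
daily_joules = kcal_to_joules(2000)     # 8,368,000 J
daily_kwh = daily_joules / J_PER_KWH    # ≈ 2.3 kWh
```

Expressed this way, a day's food energy is roughly what a 100-watt bulb consumes in a day, which illustrates why precise common units made energy comparable across food, fuel, and electricity.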
Wrigley drew from biology and ecology to embrace the concept of ecosystems with energy flows and material cycling. British ecologists in the 1920s and 1930s had borrowed from economic thinking to integrate ideas of producers, consumers, ecosystems, efficiency, and energy flow into biology. Wrigley returned the favor by bringing the refined concepts of ecologists back into economics.
He noted that agrarian civilization rested on "organic energy" supplied entirely by the annual flux of solar energy into the biosphere. People harvested this energy directly as food and feed produced by photosynthesis and indirectly from livestock that fed on plants. Firewood plus other plant and animal products supplied fire for light and heat, which had many uses. People also harvested smaller amounts from wind and water power, both driven by solar energy.
Increasing use of coal in place of firewood started in England in the 1500s and ultimately underwrote a new energy economy and vastly expanded the industrial revolution. These events moved first and fastest in England and were virtually complete by 1850. Wrigley named the new regime the "mineral energy economy," which eclipsed the older agrarian "organic energy economy." Agriculture and animal husbandry didn't cease, of course; they remained the primary source of food and feed for almost all of an increasingly large human population. Firewood remained important in economies not yet industrialized.
Wrigley reconceptualized the industrial revolution, which for him rested on the immense supplies of energy that coal provided compared to that supplied by the organic energy economy. It's not that other factors and changes weren't also important as causes or consequences of the Industrial Revolution. To ignore the liberation of human life from the constraint of the annual flux of solar energy, however, was to miss the main point.
Wrigley was one of a long string of historians who attempted to make sense of the industrial revolution, which was so easily visible after 1850. Arnold Toynbee, for example, whose 1884 essay The Industrial Revolution generally received the most credit for the term, celebrated the increased ability to make things for an easier life. But he lamented the unevenness with which the benefits were shared among different classes of people. Political reform, argued Toynbee, must spread the benefits more evenly.
Karl Marx also postulated that human beings had reached a new stage of development in which dearth of material goods should no longer plague human life. Like Toynbee, Marx argued for more equal sharing, but he argued that this would require a revolution driven by the working class to sweep away the capitalist class. William Stanley Jevons took quite a different tack from Toynbee and Marx: he worried about the future supplies of coal and the possibilities for continued expansion of the British economy. For Jevons, the issue was how to keep the good times rolling in the face of projected future rises in coal prices.
Economists and historians focused on the multiple dimensions of the industrial revolution. How did labor and capital form in factories? How did growth of national and per capita income cause, or result from, the industrial revolution? When and why did labor move out of agriculture into cities and factory work? What inventions of new machines drove the productivity of labor upward? What role did coal play? Why and when did the changes in England spread to other regions and countries? What consequences followed?
All of these perspectives are valuable, but they reflect the invisibility of energy that characterized the 1900s. Yes, energy involved ideas, costs, politics, social impacts, and technology, but in the 1900s scholarship too often took energy for granted. Wrigley, in contrast, focused on energy as a sine qua non in the modern world. In the 2000s, climate change, health effects, geopolitical tensions, and the difficulties of procuring fossil fuels demand a focus on the pivotal role played by energy.
The Third Energy Transition developed between the 1500s and the 1950s. It began with the sustained increase in the use of coal in England in the 1500s. Increasing use of coal continued from the 1700s through the 1900s, supplemented with petroleum, natural gas, and hydropower. The last fuel of the Third Energy Transition arrived in the 1940s and 1950s, when three countries started to use the heat of uranium fission, first for explosives and then to make electricity. Controlled fission made uranium, plutonium, and thorium into actual or potential fuels for producing heat.
Like the fossil fuels (coal, petroleum, gas), uranium is a mineral fuel, mined from the earth. Heat from all four of these mineral fuels frees humanity from the constraint of annual fluxes of solar energy to the earth. Each of the four fuels is also "energy dense"; that is, each can provide high amounts of heat per kilogram of mass compared to, for example, firewood, solar, and wind energy (for more details, see appendix 2). In addition, like the other fuels, uranium creates benefits as well as problems (chapter 7). It thus shares many important characteristics with coal, petroleum, and gas, which makes it fit easily into an assessment of the Third Energy Transition.
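The idea of energy density can be made concrete with rough figures. The values below are approximate, commonly cited textbook numbers chosen for illustration, not the book's own data (its appendix 2 gives the author's figures); the contrast in scale, not the exact values, is the point.

```python
# Approximate heat content per kilogram (illustrative textbook values,
# not figures from this book). MJ = megajoules.
ENERGY_DENSITY_MJ_PER_KG = {
    "air-dry firewood": 16,
    "bituminous coal": 27,
    "crude oil": 42,
    "natural gas": 54,
    "uranium-235, complete fission": 80_000_000,
}

wood = ENERGY_DENSITY_MJ_PER_KG["air-dry firewood"]
oil_vs_wood = ENERGY_DENSITY_MJ_PER_KG["crude oil"] / wood                        # ~2.6x
uranium_vs_wood = ENERGY_DENSITY_MJ_PER_KG["uranium-235, complete fission"] / wood  # millions-fold
```

On these rough numbers, the fossil fuels beat firewood by small multiples, while fission fuel sits millions of times higher, which is why all four mineral fuels could break the ceiling set by the annual solar flux.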
Excerpted from "Changing Energy"
Copyright © 2017 John H. Perkins.
Excerpted by permission of UNIVERSITY OF CALIFORNIA PRESS.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.