The Knowledge Web: From Electronic Agents to Stonehenge and Back--And Other Journeys Through Knowledge

Overview

In The Knowledge Web, James Burke, the bestselling author and host of television's Connections series, takes us on a fascinating tour through the interlocking threads of knowledge running through Western history. Displaying mesmerizing flights of fancy, he shows how seemingly unrelated ideas and innovations bounce off one another, spinning a vast, interactive web on which everything is connected to everything else: Carmen leads to the theory of relativity, champagne bottling links to wallpaper design, Joan of Arc connects through vaudeville to Buffalo Bill.

Illustrating his open, connective theme in the form of a journey across a web, Burke breaks down complex concepts, offering information in a manner accessible to anybody — high school graduates and Ph.D. holders alike. The journey touches almost two hundred interlinked points in the history of knowledge, ultimately ending where it begins.

At once amusing and instructing, The Knowledge Web heightens our awareness of our interdependence — with one another and with the past. Only by understanding the interrelated nature of the modern world can we hope to identify complex patterns of change and direct the process of innovation to the common good.



Editorial Reviews

Publishers Weekly
Continuing in the vein of The Pinball Effect, his unconventional history of technological change, Burke offers 20 new historical "story lines" that attempt to demonstrate the interactive, often serendipitous connections among ideas, events, people and innovations. His style matches his subject as he skips from one topic to another, moving at the speed of hypertext. The chapter on feedback systems hops from neural networks--computers that simulate the human brain's workings--to studies of the physiology of animal emotion, Cyrus Field's pioneering transatlantic telegraph cable in 1857 and thence to Napoleon, James Watt, Arts and Crafts movement leader William Morris and Theosophist Annie Besant. Burke always risks being charged with carrying on an intellectual parlor game that trivializes the history of science and invention, of stretching the maxim "everything is interconnected" to the point of meaninglessness. But because his material is intrinsically interesting and because Burke is a superb raconteur, his maverick guide to the byways of Western civilization is entertaining when consumed in small segments. This manic, associative tour of the cultural underpinnings of technological advancement is fast, sexy and packed with information; but it's ultimately shapeless and provides little in the way of deeper understanding. (June) Copyright 1999 Cahners Business Information.
Library Journal
Flights of fancy from a sci-tech expert, e.g., what do Buffalo Bill Cody and the Spanish Inquisition have in common?
Kirkus Reviews
Back playing his theme music—"the process by which new ideas emerge is serendipitous and interactive"—is the hugely entertaining Burke (The Pinball Effect, 1996, etc.). He's off on another of his joyrides, following the often bizarre pathways that lead from one idea to another, following like a bloodhound the threads that link events and notions and personalities. And he doesn't just list the things passing strange before his purview, he stops to examine them and deliver a smart little explication. He's not just amused to learn that the Magnetico-Electrico Celestial Bed, wherein the administration of shocks to the participants was said to "ensure immediate conception," can be found on the road to the cornflake, he wants readers to know why. And it's pure pleasure to read as he unravels the skein knotting the pugnacious father of cybernetics, Norbert Wiener, to the equally pugnacious antivivisectionists, and "a thirty-three-year-old married Englishwoman with a hidden past and the habit of wearing no underwear" to, 211 pages later, the Elgin marbles. Burke again makes use of "gateways" in his narrative, a system of numeric codes that link distant strands within the text into a literary subspace, allowing readers to skip about throughout the book, as if Burke's caperings aren't entertainment enough, though it does drive home why Burke is so pleased that the word "web" has gained such currency. There are vague rumblings at the beginning of this book about a new system of knowledge gathering, sowing democracy and enfranchising the uneducated in its wake, that Burke will introduce, in which semi-intelligent computer software helps weed through the information glut unleashed by the Internet. That would suggest undermining the very serendipity and interactivity that enthralls him so, and he wisely doesn't mention it again after the introduction. Burke is in a league alone when it comes to freewheeling intellectual curiosity and mapping nature's strange designs.

Product Details

  • ISBN-13: 9780684859347
  • Publisher: Simon & Schuster
  • Publication date: 6/1/1999
  • Pages: 288
  • Product dimensions: 6.44 (w) x 9.58 (h) x 0.94 (d)

Meet the Author

James Burke has written seven books, including the bestselling Connections and The Day the Universe Changed. He contributes a monthly column to Scientific American and serves as director, writer and host of the television series Connections 3, which airs on the Learning Channel. He lives in England, France and airplanes.

Read an Excerpt

Introduction

Change comes so fast these days that the reaction of the average person recalls the depressive who takes some time off work and heads for the beach. A couple of days later his psychiatrist gets a postcard from him. The message on the card reads: "Having a wonderful time. Why?"

Innovation is so often surprising and unexpected because the process by which new ideas emerge is serendipitous and interactive. Even those directly involved may be unaware of the outcome of their work. How, for instance, could a nineteenth-century perfume-spray manufacturer and the chemist who discovered how to crack gasoline from oil have foreseen that their products would come together to create the carburetor? In the 1880s, without the accidental spillage of some of the recently invented artificial colorant onto a petri-dish culture that revealed to a German researcher named Ehrlich that the dye preferentially killed certain bacilli, would Ehrlich have become the first chemotherapist? If the Romantic movement's concept of "nature-philosophy" had not suggested that nature evolves through the reconciliation of opposing forces, would Oersted have sought to "reconcile" electricity and magnetism and discovered the electromagnetic force that made possible modern telecommunications?

Small wonder, then, that the man and woman in the street are left behind in all this, if the researchers themselves don't get the point. But given the conditions under which science and technology work, how else could it be? At last count there were more than twenty thousand different disciplines, each of them staffed by researchers straining to replace what they produced yesterday.

These noodling world-changers are spurred on by at least two powerful motivators. The first is that you are more likely to achieve recognition if you make your particular research niche so specialist that there's only room in it for you. So the aim of most scientists is to know more and more about less and less, and to describe what it is they know in terms of such precision as to be virtually incomprehensible to their colleagues, let alone the general public.

The second motivator is the CEO. Corporations survive in a changing world only by encouraging their specialists to generate change before somebody else does. Winning in the marketplace means catching the competition by surprise. Not surprisingly, this process also surprises the consumer, and nowhere so frequently today as in the world of electronics, where by the time the user gets around to reading the manual, the gizmo to which it refers is obsolete.

We live in this permanently off-balance manner because of the way knowledge has been generated and disseminated for the last 120,000 years. In early Neolithic times the requirement to teach the highly precise, sequential skills of stone-tool manufacture demanded a similarly precise, sequential use of sounds and is thought to have given rise to language. The sequential nature of language facilitated description of the world in similarly precise terms, and in due course a process originally developed for chipping pieces off stone became a tool for chipping pieces off the universe. This reduction of reality to its constituent parts is at the root of the view of knowledge known as "reductionism," from which science sprang in the seventeenth-century West. Simply put, scientific knowledge comes as the result of taking things apart to see how they work.

Over millennia, this way of doing things has tended to subdivide knowledge into smaller and more specialist segments. For example, in the past hundred years or so, the ancient discipline of botany has fragmented and diversified to become biology, organic chemistry, histology, embryology, evolutionary biology, physiology, cytology, pathology, bacteriology, urology, ecology, population genetics and zoology.

There is no reason to suppose that this process of proliferation and fragmentation will lessen or cease. It is at the heart of what, since Darwin's time, has been called "progress." If we live today in the best of all possible materialist worlds, it is because of the tremendous strides made by specialist research that have given us everything from more absorbent diapers to linear accelerators. We in the technologically advanced nations are healthier, wealthier, more mobile, better-informed individuals than ever before in history, thanks to myriad specialists and the products of their pencil-chewing efforts.

However, the corollary to a small minority knowing more and more about less and less is a large majority knowing less and less about more and more. In the past this has been a relatively unimportant matter principally because for most of history the illiterate majority (hard-pressed enough just to survive) has been unaware that the problem existed at all. Technology was in such limited supply that there was only enough to share among a few elite decision-makers.

It is true that over time, as the technology diversified, knowledge slowly diffused outward into the community via information media such as the alphabet, paper, the printing press and telecommunications. But at the same time these systems also served to increase the overall amount of specialist knowledge. What reached the general public was usually either out-of-date or no longer vital to the interests of the elite. And as specialist knowledge expanded, so did the gulf between those who had information and those who did not.

Each time there was a major advance in the ability to generate, store or disseminate knowledge, it was followed by an "information surge" and with it a sudden acceleration in the level of innovation that dramatically enhanced the power of the elites. But sooner or later the same technology reached enough people to undermine the status quo. The arrival of paper in thirteenth-century Europe strengthened the hand of church and throne, but at the same time created a merchant class that would ultimately question their authority. The printing press gave Rome the means to enforce obedience and conformity, then Luther used it to wage a propaganda war that ended with the emergence of Protestantism. In the late nineteenth century, when military technology made possible conflicts in which hundreds of thousands died, and manufacturing technology generated untenable working and living conditions for millions of factory workers, radicals and reformers were aided in their efforts by new printing techniques cheap enough to spread their message of protest in newspapers and pamphlets.

By the mid twentieth century scientific and technological knowledge far outstripped the ability of most people, even the averagely well-informed, to comprehend it. The stimulus of the Cold War brought advances in computer technology that seemed likely to place unprecedented power in the hands of economic and political power blocs. There was talk of "Big Brother" government, rule by multinational corporations, central databases that would hold personal files on every individual, and the creeping homogenization of the human race into one giant "global village." Unchecked state and corporate industrialization finally began to generate the first visible signs of global warming, runaway pollution decimated the animal population and the tropical forests went down before fire and axe at an alarming rate.

However, at the same time, the falling cost of computer and telecommunications technology also began to make it possible for these developments to be discussed in an unprecedentedly large public forum. And the more we learned about the world through television and radio, the more it became clear that urgent measures were needed to preserve its fragile ecosystems and its even more fragile cultural diversity. At the end of the twentieth century the emergence of the ubiquitous Internet and affordable wireless technology offered the opportunity for millions of individuals to think of becoming involved.

However, the culture of scarcity with which we have lived for millennia has not prepared us well for the responsibilities technology will force on us in the next few decades. Reductionism, representative democracy and the division of labor have tended to leave such matters in the hands of specialists who are, increasingly, no more aware of the ramifications of their work than anybody else.

The result is that national and international institutions are coming under unprecedented stress as they try to apply their obsolete mechanisms to twenty-first-century problems. In Britain recently a case was brought against an individual which rested on the fifteenth-century meaning of the word "obscene." Medical etiquette has changed little since 1800. In some places science and religion are in conflict over the definition of life.

Western institutions function as if the world had not changed since they were established to deal with the specific problems of the time. Fifteenth-century nation-states, emerging into a world without telecommunications, developed representative democracy; seventeenth-century explorers in need of financial backing invented the stock market; in the eleventh century the influx of Arab knowledge triggered the invention of universities to process the new data for student priests.

In the coming decades it is likely that many social institutions will attempt to adapt by becoming virtual, bringing their services directly to the individual much in the way that banks have already begun to. But their new accessibility will in turn likely subject them to proliferating and diversifying demands that will change how they work and make them redefine their purpose. In education, the old reductionist reliance on specialism and testing by repetition will have to give way to a much more flexible definition of ability. As machines increasingly take over the tasks that once occupied a human lifetime, specialist skills may take on a merely antiquarian value. New ways will have to be found to assess intelligence in a world in which memory and experience seem no longer of value (again, this is nothing new: the alphabet and later the printing press both presented the same perceived threat).

When a corporate workforce becomes scattered across the country, or the globe, in thousands of individual homes or groups, and deals direct with millions of customers, the value of communication skills is likely to outweigh that of most others. Such ability may be possessed by people who would previously have been thought unqualified to work for the corporation, because in the old world they would have been too young, or too old, or too distant, for example. A virtual education system will have to deal with problems such as a multicultural global student body bringing very diverse experience, attitudes and aims to the class. In terms of international law, recent cases involving copyright or pornography reveal how complex such legal problems are likely to become.

This book does not attempt directly to address any of these problems. Rather, it suggests an approach to knowledge perhaps more attuned to the needs of the twenty-first century as described above. Some readers will no doubt see this approach as more evidence of the "dumbing-down" of recent years. But the same was said about the first printing press, newspapers, calculators and the removal of mandatory Latin from the curriculum.

In its fully developed form, the "webbed" knowledge system introduced here would be inclusive, not exclusive. Modern interactive networked communications systems married to astronomically large data storage capability ought to ensure that at times of change nothing need be lost. No subject or skill will be too arcane for its practitioners to pursue when the marketplace for their skills is planetwide.

Also, no external memory device from alphabet to laptop seems to have degraded human mental abilities by its introduction. Rather these abilities have been augmented each time by the new tools. Some skills, such as rote memory, become less widely used, but there seems to be no evidence that the capability for them disappears. In many cases machines also take over routine work, freeing individuals to use their skills at higher levels.

The latest interactive, semi-intelligent technologies seem likely to make this possible on an unprecedented scale. They also bring to an end a period of history in which the human brain was constrained by limited technology to operate in a less-than-optimal way, since the brain appears not to be designed to work best in the linear, discrete way promoted by reductionism. The average healthy brain has more than a hundred billion neurons, each communicating with others via thousands of dendrites. The number of potential ways for signals to go in the system is said to be greater than the number of atoms in the universe. In matters as fundamental as recognition it seems that the brain uses some of its massive interconnectedness to call on many different processes at once to deal with events in the outside world, so as quickly to identify a potentially dangerous pattern of inputs.

It is this pattern-recognition capability that might prove to be the most useful attribute of a webbed knowledge system driven by the semi-intelligent interactive systems now being developed. As this book hopes to show, learning to identify the pattern of connections between ideas, people and events is the first step toward understanding the context and relevance of information. So the social implications of webbed knowledge systems are exciting, since they will make it easier for the average citizen to become informed of the relative value of innovation. After all, it is not necessary to understand the mathematics of radioactive decay to make a decision about where to site a nuclear power plant. As I hope you will see, this approach to knowledge may be one way to enfranchise those millions who lack what used to be called formal education and to move us toward more participatory forms of government.

I would not pretend that what follows is more than a first exercise, a number of linked storylines intended to introduce the reader to the kind of information infrastructures we may begin to use in the next few decades. But I hope they will introduce the reader to a new, more relevant way of looking at the world, because in one way or another, we're all connected.

James Burke, London, 1999

Copyright © 1999 by London Writers

Chapter 1

Feedback

This book takes a journey across the vast, interconnected web of knowledge to offer a glimpse of what a learning experience might be like in the twenty-first century once we have solved the problem of information overload.

In the past when technology generated information overload the contemporary reaction was much the same as it is today. On the first appearance of paper in the medieval West, the English bishop Samson of St. Alban's complained that because paper would be cheaper than animal-skin parchment people would use paper to write too many words of too little value, and since paper was not as durable as parchment, paper-based knowledge would in the long run decay and be lost. When the printing press was developed in the fifteenth century it was said that printed books would make reading and writing "the infatuation of people who have no business reading and writing." Samuel Morse's development of the telegraph promised to link places as far apart as Maine and Texas, triggering the reaction: "What have Maine and Texas to say to each other?" The twentieth-century proliferation of television channels has led to concerns about "dumbing-down."

The past perception that new information technologies would have a destabilizing social effect led to the imposition of controls on their use. Only a few ancient Egyptian administrators were permitted to learn the skills of penmanship. Medieval European paper manufacture was strictly licensed. The output of sixteenth-century printing presses was subject to official censorship by both church and state. The new seventeenth-century libraries were not open to the public. Nineteenth-century European telegraphs and telephones came under the control of government ministries.

The problem of past information overload has generally been of concern only to a small number of literate administrators and their semiliterate masters. In contrast, twenty-first-century petabyte laptops and virtually free access to the Internet may bring destabilizing effects of information overload that will operate on a scale and at a rate well beyond anything that has happened before. In the next few decades hundreds of millions of new users will have no experience in searching the immense amount of available data and very little training in what to do with it. Information abundance will stress society in ways for which it has not been prepared and damage centralized social systems designed to function in a nineteenth-century world.

Part of the answer to the problem may be an information-filtering system customized to suit the individual. The most promising of the systems now being developed will guide users through the complex and exciting world of information without their getting lost. This book provides an opportunity for the reader to take a practice run on such a journey. The journey (the book) begins and ends with the invention of the guidance system itself -- the semi-intelligent agent.

There are several types of agent in existence acting like personal secretaries in a variety of simple ways: filtering genuine e-mail from spam, running a diary, paying bills and selecting entertainment. In the near future agents will organize and conduct almost every aspect of the individual's life. Above all they will journey across the knowledge webs to retrieve information, then process and present it in ways customized to suit the user. In time they will act on behalf of their user because they will have learned his or her preferences by learning from the user's daily requirements.

In the search to develop semi-intelligent agents, one of the most promising systems (and the one which starts this journey) may be the neural network. Such a network consists of a number of cells each reacting to signals from a number of other cells that in turn fire their signals in reaction to input from yet other cells. If input signals cause one cell to fire more frequently than others, its input to the next cell in the series will be given greater weighting. Since cells are programmed to react preferentially to input from cells that fire frequently rather than from those that fire rarely, the system "learns" from experience. This is thought to be similar to the way learning operates in the human brain, where the repetition of a signal generated in response to a specific experience can cause enlargement in the brain cell's synapses.
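The weighting scheme described above can be sketched as a toy Hebbian-style rule in a few lines of Python. Everything here is illustrative, not from Burke's text: the update rule, the firing threshold, and the numbers are assumptions chosen only to show how inputs that fire frequently come to carry greater weight.

```python
# Toy sketch of "learning by weighting": a cell receives signals from
# several input cells, and inputs that fire together with the cell are
# given greater weight (a Hebbian-style rule).

def fires(weights, inputs, threshold=1.0):
    """The cell fires if the weighted sum of its inputs reaches a threshold."""
    return sum(w * x for w, x in zip(weights, inputs)) >= threshold

def hebbian_update(weights, inputs, output, rate=0.1):
    """Strengthen each connection in proportion to how often its input
    fires together with the output cell."""
    return [w + rate * x * output for w, x in zip(weights, inputs)]

weights = [0.5, 0.5, 0.5]   # equal starting weights for three input cells

# Repeatedly present a pattern in which the first two inputs fire:
for _ in range(10):
    inputs = [1, 1, 0]
    out = 1 if fires(weights, inputs) else 0
    weights = hebbian_update(weights, inputs, out)

# Inputs that fired frequently now carry greater weight than the silent one,
# so the system has "learned" to react preferentially to them.
print(weights[0] > weights[2])  # → True
```

The design choice mirrors the paragraph above: nothing is explicitly programmed about the pattern; the preference emerges purely from repeated co-firing, which is the sense in which such a network "learns from experience."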

The synapse is the part of the cell that releases transmitter chemicals that cross the gap to the next cell. If sufficient chemicals arrive on the other side, they generate an impulse. If enough of these signals are generated in the target cell, they cause its synapses to release chemicals in turn, and "pass the message on." A cell with larger synapses, releasing larger amounts of chemical, is therefore more likely to cause another cell to fire. Networks of such frequently firing cells may constitute the building blocks of memory.

This theory of neuronal interaction was first proposed in 1943 by two American researchers, Walter Pitts and Warren McCulloch, who also suggested that such a feedback process might result in purposive behavior when linking the senses with the brain and muscles if the result of the interaction were to cause the muscles to act to reduce the difference between a condition in the real world as perceived by the senses and the condition as desired by the brain.

Pitts and McCulloch belonged to a small group of researchers calling itself the "Teleological Society," another of whose members was the man who invented the name for this neural feedback process. He was Norbert Wiener, and he was the first to see the way in which feedback might work in a machine, during his research on antiaircraft artillery systems during World War II. Wiener was a rotund, irascible, cigar-chomping MIT professor of math who prowled what he described as the "frontier areas" between the scientific disciplines. Between biology and engineering, Wiener developed a new discipline to deal with feedback processes. He called the new discipline "cybernetics." Wiener recognized that feedback devices are information-processing systems receiving information and acting upon it. When applied to the brain this new information-oriented view was a fundamental shift away from the entirely biological paradigm that had ruled neurophysiology since Freud, and it was to affect all artificial-intelligence work from then on.

Wiener first applied his feedback theory early in World War II, when he and a young engineer named Julian Bigelow were asked to improve the artillery hit rate. At the beginning of the war the problem facing antiaircraft gunners was that as the speed of targets increased (thanks to advances in engine and airframe technology) it became necessary to be able to fire a shell some distance ahead of a fast-moving target in order to hit it. Automating this process involved a large number of variables: wind, temperature, humidity, gunpowder charge, length of gun barrel, speed and height of target, and many others. Wiener used continuous input from radar tracking systems to establish the recent path of the target and used that path to predict what the target's likely position would be in the immediate future. This information would then be fed to the gun-moving mechanisms so that aiming-off was continually updated.
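The predict-then-correct loop described above can be sketched in a few lines of Python. This is only an illustration of the idea, not Wiener's actual fire-control mathematics: the constant-velocity target, the linear extrapolation from the last two radar fixes, and the gain of 0.5 in the feedback step are all assumptions made for the example.

```python
# Illustrative sketch of prediction-plus-feedback: estimate the target's
# velocity from its recent radar track, project its position ahead by the
# shell's flight time, and steer the gun toward that predicted point.

def predict(track, lead_time):
    """Linear extrapolation: estimate velocity from the last two radar
    fixes (time, position) and project the position lead_time ahead."""
    (t0, x0), (t1, x1) = track[-2], track[-1]
    velocity = (x1 - x0) / (t1 - t0)
    return x1 + velocity * lead_time

# A target moving at a constant 3 units per time step (an assumption).
track = [(t, 3.0 * t) for t in range(5)]    # radar fixes so far

gun_aim = 0.0
for step in range(5, 10):
    track.append((step, 3.0 * step))        # a new radar fix arrives
    aim_point = predict(track, lead_time=2) # where the target should be
    error = aim_point - gun_aim
    gun_aim += 0.5 * error                  # feedback: close half the error

# The aim point leads the current fix by velocity * lead_time = 6 units.
print(predict(track, lead_time=2))  # → 33.0
```

The essential feature is the closed loop: each new radar fix updates the prediction, and each prediction corrects the gun's aim, so the "aiming-off" is continually revised as the target moves.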

The system had its most outstanding successes in 1944, when British and American gunners shot down German flying bombs with fewer than one hundred rounds per hit. This was an extraordinary advance over previous performance, estimated at one hit per twenty-five hundred rounds. In 1944, during the last four weeks of German V-1 missile attacks on England, the success rate improved dramatically. In the first week, 24 percent of targets were destroyed; in the second, 46 percent; in the third, 67 percent; and in the fourth, 79 percent. The last day on which a large number of V-1s were launched at Britain, 104 of the missiles were detected by early-warning radar, but only four reached London. Antiaircraft artillery destroyed sixty-eight of them.

Early in his work on the artillery project Wiener had frequent discussions with a young physiologist named Arturo Rosenblueth, who was interested in human feedback mechanisms that act to ensure precision in bodily movement. For the previous fifteen years Rosenblueth had worked closely with Walter Cannon, professor of physiology at Harvard. Earlier in the century Cannon had invented the barium meal, which was opaque to X-rays. When ingested by a goose, the barium revealed the peristaltic waves that occurred in the bird's stomach when it was hungry. Cannon observed that hunger seemed to precipitate the onset of these waves. He then observed that when a hungry animal was frightened the waves stopped.

This led to Cannon's ground-breaking studies of the physical effects of emotion. He discovered that when an animal was disturbed its sympathetic nervous system secreted into the bloodstream a chemical that Cannon named "sympathin." This chemical counteracted the effects of the disturbance and returned the animal's body systems to a state of balance. Cannon named the balancing process "homeostasis." In 1915 Cannon discovered that the principal body changes affected by the sympathetic system were those involved in fight, sexual activity or flight. In such situations sugar flowed from the liver to provide emergency energy and blood shifted from the abdomen to the heart, lungs and limbs. If the body were wounded, blood clotting occurred more rapidly than usual. In 1932 Cannon published a full-scale account of his research titled The Wisdom of the Body.

What had initially triggered Cannon's interest in homeostatic mechanisms was the work of the man to whom Cannon dedicated the French edition of his book. He was an unprepossessing but eminent French physiologist named Claude Bernard, who had started his working life as a pharmacist's assistant in Beaujolais, where his father owned a small vineyard. After being forced to give up his early schooling for lack of funds, Bernard took up writing plays. He produced first a comedy and then a five-act play, which he took to Paris in 1834 with the intention of making a career in the theater. Fortunately for the future health of humankind, Bernard was introduced to an eminent theatrical critic, Saint-Marc Girardin, who read the play and advised Bernard to take up medicine.

At first Bernard planned to be a surgeon, but becoming dissatisfied with the general lack of physiological data he began to gather his own data by experimenting on animals. By 1839 his dexterity in dissection had brought him to the attention of the great physiologist François Magendie, who appointed him as assistant. One winter morning in 1846, some rabbits were brought to Magendie's lab for dissection and Bernard noticed that their urine was clear and acidic. As every nineteenth-century French winemaker knew, the urine of rabbits is usually turbid and alkaline. Bernard realized that the rabbits had not been fed and theorized that since the urine of carnivores is clear, the hungry, herbivorous rabbits must have been living on their fat. When he fed grass to the rabbits their urine returned to its normal alkaline turbidity. He double-checked with an experiment on himself. After twenty-four hours subsisting only on potatoes, cauliflower, carrots, green peas, salad, and fruit, Bernard's own urine went turbid and alkaline. Bernard then starved the rabbits, fed them boiled beef and dissected them to find out what had happened. He saw a milklike substance (he took it to be emulsified fat) that had formed at the point where the rabbit's pancreatic juice was pouring into the stomach: There was clearly some link between the juice and the emulsification of the fats.

Two years later he discovered the glycogenic function of the liver, which injects glucose into the blood. It was this discovery that led to Bernard's greatest contribution to the sum of human knowledge, because he saw that the function of the liver and the pancreas (and perhaps other systems, too) was to maintain the body's equilibrium. He summed up his research: "All the vital mechanisms, however varied they may be, have only one object, that of preserving constant the conditions of life in the inner environment." Follow-up research on the pancreas led an English researcher, William Bayliss, to coin the phrase that Cannon would use as his book title: "the wisdom of the body."

Not everybody was happy with Bernard's work, especially when he designed an oven in which to cook animals alive. An American doctor, Francis Donaldson, who attended Bernard's lectures in 1851, wrote: "It was curious to see walking about the amphitheater of the College of France dogs and rabbits, unconscious contributors to science, with five or six orifices in their bodies from which at a moment's warning, there could be produced any secretion of the body, including that of the several salivary glands, the stomach, the liver, and the pancreas."

Bernard was well aware of public opposition to vivisection but defended it: "The science of life is like a superb salon resplendent with light which one can enter only through a long and ghastly kitchen." Alas, Bernard's wife was unable to take the heat. After leaving him in 1869 she went in search of the antivivisection activists to whom she had been sending regular contributions.

She did not have far to go. In Paris a fanatical young vegetarian Englishwoman named Anna Kingsford, the owner of the Lady's Own Paper, had come to France to study medicine. Kingsford became well known at the medical school for refusing to let her professors vivisect during the lessons she attended and for demonstrating against the practice. Kingsford's lecture halls were close to Bernard's labs, and she became so obsessed by his work that she set about directing all her energies toward killing him with thought waves. Bernard died only a few weeks after she had begun to concentrate her mental energies on him, convincing her that she had been the instrument of divine will. Kingsford also claimed to have been responsible for the death of another vivisector, Paul Bert. However, her efforts to do the same to Louis Pasteur failed.

Legislation to protect animals from ill treatment took a long time to reach the statute books, even in England, where the first such laws were passed. In 1800 the first bill to outlaw bull-baiting had ignominiously failed in its passage through the Houses of Parliament, opposed by George Canning (later prime minister), who claimed that bull-baiting "inspired courage and produced a nobleness of sentiment and elevation of mind....Putting a stop to bull-baiting was legislating against the spirit and genius of almost every country and age." However, in 1821 Dick Martin, MP for Galway, forced through a bill to protect horses and cattle against ill treatment. It was the first law of its kind in any country. In 1824 the Society for the Prevention of Cruelty to Animals was formed at the unfortunately named Old Slaughter's Coffee House in London. The publication in 1859 of Darwin's Origin of Species seemed to confirm the kinship between humans and animals and so to support the animal-defense argument. In 1876 the Victoria Street Society against Vivisection was formed with Lord Shaftesbury as chairman. The same year a bill was passed to prevent the vivisection of dogs, cats, mules, horses and asses. By the late nineteenth century the animal-defense movement had spread throughout the Western world and given birth to hundreds of local groups known as Humane Societies, in spite of the fact that the name more properly belonged to earlier humanitarian work of an entirely different nature.

The Royal Humane Society was founded in London in 1774 largely as the result of the efforts of Dr. William Hawes to promote knowledge of artificial-respiration techniques. Hawes based his ideas on the translation of a paper by the Amsterdam Society for the Recovery of the Apparently Drowned. The society had been founded in 1767 after several cases of successful resuscitation had been reported in Switzerland. In the nineteenth century interest in drowning became acute with the spectacular increase in cargo tonnage and passenger traffic on the high seas following the spread of industrialization. As the number of ships rose so did the number of shipwrecks and deaths.

From time to time the Royal Humane Society awarded a gold medallion for outstanding feats of bravery, and in 1838 the recipient hit the front pages because she was a slightly built, twenty-two-year-old woman. On the night of February 6, a paddle steamer, the Forfarshire, battling through a gale en route from Hull to Dundee with a full cargo and sixty-three passengers, sprang a leak in her boiler. The captain decided to take shelter among the Farne Islands off the coast of Northumberland. During this maneuver the ship hit the rocks and broke in two, and all but thirteen passengers and crew were drowned. The survivors, exposed to the full force of the storm, included a mother and two children. Overnight the two children and an adult died. At five o'clock the next morning Grace Darling, daughter of the local lighthouse keeper, caught sight of the wreck and the survivors clinging to the rocks. Grace and her father rowed to the rescue, struggling through mountainous seas in a small open boat. The drama was reported in the newspapers and Grace became an instant national hero. Alas, she was to die four years later from tuberculosis. Meantime she had inspired the public to offer massive financial and political support for the eventual establishment of the Royal National Lifeboat Institution in 1854.

That same year came another highly publicized loss at sea. The USS San Francisco, an American troopship carrying hundreds of soldiers, foundered in an Atlantic hurricane. The secretary of the navy sent for Matthew Maury, the only man in America who would be able to tell where to look for survivors. After studying his wind and current charts, Maury pinpointed the spot and the survivors were found in the water.

Maury was the fourth son of a Huguenot-English family long settled in Virginia (his grandfather had taught Thomas Jefferson), and he had joined the U.S. Navy in 1825. It was during a voyage to South America that Maury became interested in finding faster ways to cross the ocean. On his return in 1834 he took leave and wrote his first work on navigation. In 1839 Maury published a series of articles in the Southern Literary Messenger, one of which advocated the establishment of a naval school. It would become the U.S. Naval Academy at Annapolis.

In 1847 Maury issued the first of several charts and then, in 1851, Explanations and Sailing Directions to Accompany the Wind and Current Charts. At the instigation of the U.S. government, copies of the charts and Sailing Directions were distributed free to all masters of vessels on the understanding that they would keep a full log of journeys and forward these logs to Maury, in Washington. Logs were to include temperature of air and water, direction of wind and currents, and air pressure. Captains were also required to throw overboard (at given intervals) a bottle containing a piece of paper carrying the ship's position and the date. They were also to pick up any such bottles they came across and note all details in their logs. In return for these services masters would receive free copies of Maury's further work. Over eight years Maury collected and processed data on many millions of observations, as a result of which he was able to identify faster sailing routes. One ship's master following Maury's suggested route from New York to Rio de Janeiro halved the usual journey time. It was reckoned that Maury's "Path-of-Minimum-Time" routes saved American shipping forty million dollars a year.

In 1853 Maury crowned his career when he persuaded sixteen countries (among them the United States, Britain, Belgium, Holland, Russia, France, Norway, Denmark and Portugal) to meet in Brussels for the first International Meteorological Congress "to plan an uniform system of meteorological observation at sea, and to agree a plan for the observation of the winds and currents of the oceans with a view to improving navigation and to enrich our knowledge of the laws which govern those elements." Not long after he had returned from Brussels, Maury received a letter from a retired paper-manufacturing millionaire named Cyrus W. Field, who was seeking advice on the ideal route for a transatlantic submarine telegraph cable.

Submarine cables had already been laid successfully in the relatively shallow waters between England and Holland, Scotland and Ireland, but the Atlantic represented a formidable challenge. Field had managed to get a favorable charter from the British government for a fifty-year monopoly on any cable laid between Newfoundland and Ireland. The British also offered to provide a cable-laying ship as well as a generous advance on income from telegraph messages. Field then spent two years laying a cable between Newfoundland and the North American mainland (stockholders in the company included such luminaries as Lady Byron and Thackeray). When the link was completed Field wrote to Maury to solicit his views on the best route out of Newfoundland toward Europe.

Maury reported that soundings revealed a shallow "telegraph plateau" across much of the North Atlantic, and in 1857 work began on laying the cable. After a few hundred miles had been laid the cable snapped. Three more attempts were made, and on August 5, 1858, 1,850 miles of copper wire connected Valentia, Ireland, with Trinity Bay, Newfoundland, and traffic began with an inaugural message from Queen Victoria to President Buchanan. At the celebration dinner in New York Field said modestly: "Maury furnished the brains, England gave the money, and I did the work." Then the cable failed again. In 1866 they found the parted ends, spliced them and the work was done. The U.S. Congress voted Field a gold medal.

Field had also written to the man whose work had inspired the whole venture: Samuel Morse, inventor of the most successful form of telegraph. Morse's advantages over other telegraphers were his key and the Morse Code, which he demonstrated before Congress in 1844. The idea had come to him in the autumn of 1832 during a voyage back to the United States from France. Morse first learned what he needed to know about the principles of electricity and one of his friends, Alfred Vail, provided the finance and hardware (Vail's father had a machine shop in New Jersey). Vail also suggested what would later become known as the Morse Code.
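The dot-dash scheme Vail suggested is simple enough to illustrate directly. The encoder below is a modern sketch using the familiar International Morse letter table rather than Vail's original American Morse (the function name and the word separator are my own choices for the example): each letter becomes a short pattern of dots and dashes, with the shortest patterns assigned to the commonest letters.

```python
# International Morse letter table: frequent letters such as E and T
# get the shortest codes, which keeps transmitted messages short.
MORSE = {
    'A': '.-',   'B': '-...', 'C': '-.-.', 'D': '-..',  'E': '.',
    'F': '..-.', 'G': '--.',  'H': '....', 'I': '..',   'J': '.---',
    'K': '-.-',  'L': '.-..', 'M': '--',   'N': '-.',   'O': '---',
    'P': '.--.', 'Q': '--.-', 'R': '.-.',  'S': '...',  'T': '-',
    'U': '..-',  'V': '...-', 'W': '.--',  'X': '-..-', 'Y': '-.--',
    'Z': '--..',
}

def encode(message):
    """Encode a message, separating letters by spaces and words by ' / '.
    Characters without a Morse equivalent are silently skipped."""
    words = message.upper().split()
    return ' / '.join(' '.join(MORSE[ch] for ch in word if ch in MORSE)
                      for word in words)

print(encode("What hath God wrought"))
```

Decoding is the same table read in reverse, which is why a single key and a trained ear were the only hardware a receiving operator strictly needed.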

At this time Morse was a well-known artist, professor of art at New York University, and had just spent three years in Europe studying and painting. Morse was a strange man given to apocalyptic patriotic views. He had been brought up as a strict Calvinist by his father Jedidiah, America's foremost geography scholar, who had earlier led the Old Calvinist "Great Awakening" crusade against liberal theology. Like his father, Morse looked forward to the triumph of American culture and believed that only an elite could lead the country to salvation. Morse was also extremely xenophobic. At one point he painted a picture of the pope conspiring to arm American Catholics, provoke disorder, rig elections and elect foreigners to public office. Morse also helped publish a book about Maria Monk, a woman who claimed to have been a nun in Montreal, where she also claimed to have witnessed unnatural sexual acts performed by clergy and to have seen crypts filled with the corpses of illegitimate children. In the end it was revealed that Monk (rumored to have had a romantic affair with Morse) had escaped from a mental institution.

Morse believed that art was a tool placed in his hands by God to be used to save Protestant America. He believed that the millennium was imminent, and that when it came America would carry the empire of peace to the world. It was therefore essential to prepare American art for the great day. Morse founded the National Academy of Arts and Design in 1826 and was its president until 1845. The aim of the academy was to foster American artistic talent so that American genius could take its rightful place in the world and inculcate true Protestant virtues in other Americans.

In 1829 Morse decided to visit Europe to study artistic masterworks in preparation for what he hoped would be his greatest triumph, the commission to paint the four remaining murals for the Rotunda of the Capitol Building in Washington D.C. To this end, while in Paris in 1831, he painted the giant Gallery of the Louvre. The painting reproduced in miniature thirty-eight Louvre masterpieces. Morse's aim was to show that while the classical past was worthy of study it should not be the subject of slavish emulation by American artists, who, like the artist shown in the Louvre painting (Morse himself), could learn from the Old Masters and then develop their own distinctively American style. On his return the Louvre painting was put on exhibition in New York and was a disastrous flop. The commission for the Rotunda murals went to other artists. Morse turned to the telegraph as an alternative tool with which to make Protestant America great. Communications technology would be an instrument of Divine Will, redeeming America by transmitting messages of peace and love. At his demonstration in Congress in 1844 Morse's first transmitted message echoed these beliefs: "What hath God wrought!"

Morse had learned his art at the feet of Washington Allston, the most completely Romantic American painter, whom he had met in Boston in 1810 and with whom he became life-long friends. Only a year after their meeting Allston inspired Morse to attempt the first of his grand historical American scenes, Landing of the Pilgrims at Plymouth. That same year Morse joined Allston and his wife on his first trip to Europe. Allston was a good-looking Harvard-educated gentleman from South Carolina who on the death of his step-father in 1801 had sold the family property to finance a career in painting. On his earlier visit to London Allston had studied with Benjamin West, the president of the Royal Academy, and then in 1804 moved on via Paris to Rome. There he met Washington Irving, who later wrote: "I do not think I have ever been more completely captivated on a first acquaintance. He was of a light and graceful form, with large blue eyes, and black, silken hair waving and curling around a pale, expressive countenance. A young man's intimacy took place immediately between us, and we were much together during my brief sojourn at Rome....We visited together some of the finest collections of paintings, and he taught me how to visit them to the most advantage, guiding me always to the masterpieces, and passing by the others without notice." Allston's Italian Landscape shows the profound effect of Italy on his work. His fresh New England eye was overwhelmed by the light, the color, the ancient ruins, the landscape dotted with hilltop villages, the rich mingling of Renaissance, medieval and classical architecture and the pastoral nature of Italian peasant life.

In 1805 Allston met and painted the English Romantic poet Samuel Taylor Coleridge, whom he would later recognize as his greatest intellectual mentor. At the time of their meeting Coleridge was suffering the aftereffects of his failure to give up opium. Aged thirty-three, with the "Kubla Khan" and "Ancient Mariner" poems behind him, Coleridge was already famous. He was also an alcoholic, deeply in debt, and unhappily married with three children; he had failed in a venture to set up a utopian settlement on the banks of the Susquehanna in Pennsylvania, and he was an extreme hypochondriac (he coined the word "psychosomatic").

It was partly to try to wean himself from opium (and his penchant for taking it in brandy), and partly to get away from his wife, that in 1804 Coleridge had run away to Malta. There, thanks to an influential acquaintance, he landed the job of secretary to the British civil commissioner, Alexander Ball. The post also included free food and accommodation in the commissioner's palace in Valetta, the island's capital. Coleridge's workload was light and consisted principally of rewriting Ball's dispatches to London. Although Coleridge complained incessantly about his health, the nightmare of withdrawal symptoms, the dull company and his inability to write new poems, he enjoyed the climate and the countryside and managed to produce some of his best prose. He also began to feel the first stirrings of mortality: "I had felt the Truth; but never saw it before clearly; it came upon me at Malta, under the melancholy dreadful feeling of finding myself to be Man, by a distinct division from Boyhood, Youth, 'Young Man.' Dreadful was the feeling -- before that, life had flown on so that I had always been a Boy, as it were -- and this sensation had blended in all my conduct." When his friends William and Mary Wordsworth saw him on his return to England, they were to remark that he had changed for the worse.

Coleridge saw from the dispatches he was editing that he had arrived in Malta at a critical time. Ball was arguing the strategic importance of the island now that Napoleon had given up Louisiana, lost Santo Domingo and would inevitably turn his attention to the Mediterranean. Ball also suggested to the British government that Algiers, Tunis and Tripoli were ripe for colonization and "they are capable of growing all Our colonial produce." He also argued that though both Russia and France wanted Malta they should not be allowed to take it. At the time, the island was a hotbed of intrigue. The Maltese were agitating for independence, Russian and French spies were imagined to be everywhere and there was an American naval squadron on station commanded by Commodore Edward Preble. Among Preble's officers was the young Stephen Decatur, leader of the daring and successful 1804 raid on Tripoli harbor to destroy the American frigate Philadelphia, which had run aground and been captured during the American-Tripolitanian War. During a brief trip to Sicily Coleridge met and dined with both intrepid Americans, and for years afterward regaled friends with tales of their exploits.

Coleridge's employer, Rear Admiral Alexander Ball, had joined the British Navy at the age of twelve. This event, he told Coleridge, had been inspired by reading Robinson Crusoe. Ball had the air more of an academic than a sailor, bookish and thoughtful. After serving in the Caribbean, America and Newfoundland, in 1783 he took a year off and went to France to study the language. At one point there, during a visit to St. Omer, he met another young captain with whom his fate was to be bound up, in spite of the fact that on this occasion each expected the other to make the required formal call, so neither did so. Ball then served in the English Channel, went again to Newfoundland, was stationed off the French Coast and in 1798 was posted to the Mediterranean where he was to meet the young captain with whom he had failed to exchange courtesies in St. Omer. At this time Britain was expecting to be invaded by Napoleon, and much of the British fleet was patrolling outside French harbors in the English Channel and on the French Atlantic coast. Hearing a rumor that Napoleon was assembling a Mediterranean fleet in Toulon, the British also sent a fleet to blockade that port.

In April the Toulon blockade fleet flee


Table of Contents

Introduction
How to Use This Book
1 Feedback
2 What's in a Name?
3 Drop the Apple
4 An Invisible Object
5 Life Is No Picnic
6 Elementary Stuff
7 A Special Place
8 Fire from the Sky
9 Hit the Water
10 In Touch
Bibliography
Index

First Chapter

Chapter 1

Feedback

This book takes a journey across the vast, interconnected web of knowledge to offer a glimpse of what a learning experience might be like in the twenty-first century once we have solved the problem of information overload.

In the past when technology generated information overload the contemporary reaction was much the same as it is today. On the first appearance of paper in the medieval West, the English bishop Samson of St. Alban's complained that because paper would be cheaper than animal-skin parchment people would use paper to write too many words of too little value, and since paper was not as durable as parchment, paper-based knowledge would in the long run decay and be lost. When the printing press was developed in the fifteenth century it was said that printed books would make reading and writing "the infatuation of people who have no business reading and writing." Samuel Morse's development of the telegraph promised to link places as far apart as Maine and Texas, triggering the reaction: "What have Maine and Texas to say to each other?" The twentieth-century proliferation of television channels has led to concerns about "dumbing-down."

The past perception that new information technologies would have a destabilizing social effect led to the imposition of controls on their use. Only a few ancient Egyptian administrators were permitted to learn the skills of penmanship. Medieval European paper manufacture was strictly licensed. The output of sixteenth-century printing presses was subject to official censorship by both church and state. The new seventeenth-century libraries were not open to the public. Nineteenth-century Europe to suit the user. In time they will act on behalf of their user because they will have learned his or her preferences by learning from the user's daily requirements.

In the search to develop semi-intelligent agents, one of the most promising systems (and the one which starts this journey) may be the neural network. Such a network consists of a number of cells each reacting to signals from a number of other cells that in turn fire their signals in reaction to input from yet other cells. If input signals cause one cell to fire more frequently than others, its input to the next cell in the series will be given greater weighting. Since cells are programmed to react preferentially to input from cells that fire frequently rather than from those that fire rarely, the system "learns" from experience. This is thought to be similar to the way learning operates in the human brain, where the repetition of a signal generated in response to a specific experience can cause enlargement in the brain cell's synapses.
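The weighting scheme described above can be made concrete with a toy network cell. The sketch below is my own illustration, not from the text (the class, threshold, and learning rate are invented for the example, and it is a Hebbian-style caricature rather than a model of real neurons): inputs that are repeatedly active when the cell fires have their connection strengths increased, so repetition "enlarges the synapse" and the cell comes to respond preferentially to familiar patterns.

```python
# Toy cell with Hebbian-style weighting: connections from inputs that
# are active when the cell fires are strengthened, so the cell "learns"
# from repeated experience.

class Cell:
    def __init__(self, n_inputs, threshold=1.0, rate=0.1):
        self.weights = [0.5] * n_inputs   # initial connection strengths
        self.threshold = threshold
        self.rate = rate                  # learning rate per firing

    def fire(self, inputs):
        # The cell fires when the weighted sum of incoming signals
        # exceeds its threshold.
        total = sum(w * x for w, x in zip(self.weights, inputs))
        fired = total >= self.threshold
        if fired:
            # Hebbian update: strengthen the synapses whose input was
            # active at the moment of firing.
            self.weights = [w + self.rate * x
                            for w, x in zip(self.weights, inputs)]
        return fired

cell = Cell(n_inputs=3)
# Repeatedly present a pattern in which inputs 0 and 1 are active.
for _ in range(20):
    cell.fire([1, 1, 0])

# The repeated pattern has enlarged the first two "synapses";
# the unused third connection is unchanged.
print(cell.weights)
```

After twenty repetitions the first two weights have grown while the third has not, which is the sense in which the network's structure records its history of inputs.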

The synapse is the part of the cell that releases transmitter chemicals that cross the gap to the next cell. If sufficient chemicals arrive on the other side, they generate an impulse. If enough of these signals are generated in the target cell, they cause its synapses to release chemicals in turn, and "pass the message on." A cell with larger synapses, releasing larger amounts of chemical, is therefore more likely to cause another cell to fire. Networks of such frequently firing cells may constitute the building blocks of memory.

This theory of neuronal interaction was first proposed in 1943 by two American researchers, Walter Pitts and Warren McCulloch. They also suggested that such a feedback process, linking the senses with the brain and muscles, might produce purposive behavior if the interaction caused the muscles to act so as to reduce the difference between a condition in the real world as perceived by the senses and the condition desired by the brain.

Pitts and McCulloch belonged to a small group of researchers calling itself the "Teleological Society," another of whose members was the man who invented the name for this neural feedback process. He was Norbert Wiener, and he was the first to see the way in which feedback might work in a machine, during his research on antiaircraft artillery systems during World War II. Wiener was a rotund, irascible, cigar-chomping MIT professor of math who prowled what he described as the "frontier areas" between the scientific disciplines. Between biology and engineering Wiener developed a new discipline to deal with feedback processes, which he called "cybernetics." Wiener recognized that feedback devices are information-processing systems, receiving information and acting upon it. When applied to the brain this new information-oriented view was a fundamental shift away from the entirely biological paradigm that had ruled neurophysiology since Freud, and it was to affect all artificial-intelligence work from then on.

Wiener first applied his feedback theory early in World War II, when he and a young engineer named Julian Bigelow were asked to improve the artillery hit rate. At the beginning of the war the problem facing antiaircraft gunners was that as the speed of targets increased (thanks to advances in engine and airframe technology) it became necessary to fire a shell some distance ahead of a fast-moving target in order to hit it. Automating this process involved a large number of variables: wind, temperature, humidity, gunpowder charge, length of gun barrel, speed and height of target, and many others. Wiener used continuous input from radar tracking systems to establish the recent path of the target and from that path to predict the target's likely position in the immediate future. This information was then fed to the gun-moving mechanisms so that aiming-off was continually updated.
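The prediction step at the heart of this loop can be sketched in a few lines. This is a deliberately simplified illustration, not Wiener's actual apparatus (which had to account for wind, charge, barrel length and the rest): it assumes a roughly constant-velocity target and simply extrapolates the last two radar fixes ahead by the shell's flight time, the "aiming-off" that a new fix would then immediately revise.

```python
# Simplified fire-control prediction: estimate the target's velocity
# from its two most recent radar fixes and aim where it will be when
# the shell arrives. Re-running this on every new fix is the feedback
# loop that keeps the aim-off continually updated.

def predict_aim_point(track, shell_flight_time):
    """Extrapolate the target's future position from its last two
    radar fixes, assuming roughly constant velocity.

    track: list of (t, x, y) radar fixes in seconds/meters, newest last.
    """
    (t0, x0, y0), (t1, x1, y1) = track[-2], track[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # estimated velocity
    # Lead the target by its velocity times the shell's flight time.
    return (x1 + vx * shell_flight_time, y1 + vy * shell_flight_time)

# Target crossing at 150 m/s, observed at one-second intervals.
track = [(0.0, 0.0, 3000.0), (1.0, 150.0, 3000.0), (2.0, 300.0, 3000.0)]
print(predict_aim_point(track, shell_flight_time=4.0))  # → (900.0, 3000.0)
```

The error-correcting character of the system comes from the repetition: each new radar fix replaces the oldest, the prediction is recomputed, and the gun mechanism acts to reduce the difference between where it points and where the shell needs to go.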

The system had its most outstanding successes in 1944; when British and American gunners shot down German flying bombs with fewer than one hundred rounds per hit. This was an extraordinary advance over previous performance, estimated at one hit per twenty-five hundred rounds. In 1944, during the last four weeks of German V-1 missile attacks on England, the success rate improved dramatically. In-the first week, 24 percent of targets were destroyed; in the second, 46 percent; in the third, 67 percent; and in the fourth, 79 percent. The last day on which a large number of V-1s were launched at Britain, 104 of the missiles were detected by early-warning radar, but only four reached London. Antiaircraft artillery destroyed sixty-eight of them.

Early in his work on the artillery project Wiener had frequent discussions with a young physiologist named Arturo Rosenbleuth, who was interested in human feedback mechanisms that act to ensure precision in bodily movement. For the previous fifteen years Rosenbleuth had worked closely with Walter Cannon, professor of physiology at Harvard. Earlier in the century Cannon had invented the barium meal, which was opaque to X-rays. When ingested by a goose th e barium revealed the peristaltic waves that occurred in the bird's stomach when it was hungry. Cannon observed that hunger seemed to precipitate the onset of these waves. He then observed that when a hungry animal was frightened the waves stopped.

This led to Cannon's ground-breaking studies, of the physical effects of emotion. He discovered that when an animal was disturbed its sympathetic nervous system secreted into the bloodstream a chemical that Cannon named "sympathin." This chemical counteracted the effects of the disturbance and returned the animal's body systems to a state of balance. Cannon named the balancing process "homeostasis." In 1915 Cannon discovered that the principal body changes affected by the sympathetic system were those involved in fight, sexual activity or flight. In such situations sugar flowed from the liver to provide emergency energy and blood shifted from the abdomen to the heart, lungs and limbs. If the body were wounded, blood clotting occurred more rapidly than usual. In 1932 Cannon published a full-scale account of his research titled The Wisdom of the Body.

What had initially triggered cannon's interest in homeostatic mechanisms was the work of the man to whom Cannon dedicated the French edition of his book. He was an unprepossessing but eminent French physiologist named Claude Bernard, who had started his working life as a pharmacist's assistant in Beaujolais, where his father owned a small vineyard. After being forced to give up his early schooling for lack of funds, Bernard took up writing plays. He produced first a comedy and then a five-act play, which he took to Paris in 1834 with the intention of making a career in the theater. Fortunately for t he future health of humankind Bernard was introduced to an eminent theatrical critic, Saint-Marc Girardin, who read the play and advised Bernard to take up medicine.

At first Bernard planned to be a surgeon, but becoming dissatisfied with the general lack of physiological data he began to gather his own data by experimenting on animals. By 1839 his dexterity in dissection had brought him to the attention of the great physiologist François Magendie, who appointed him as assistant. One winter morning, in 1846 some rabbits were brought to Magendie's lab for dissection and Bernard noticed that their urine was clear and acidic. As every nineteenth-century French winemaker knew, the urine of rabbits is usually turbid and alkaline. Bernard realized that the rabbits had not been fed and theorized that since the urine of carnivores is clear, the hungry, herbivorous rabbits must have been living on their fat. When he fed grass to the rabbits their urine returned to its normal alkaline turbidity. He double-checked with an experiment on himself. After twenty-four hours subsisting only on potatoes, cauliflower, carrots, green peas, salad, and fruit, Bernard's own urine went turbid and alkaline. Bernard then starved the rabbits, fed them boiled beef and dissected them to find out what had happened. He saw a milklike substance (he took it to be emulsified fat) that had formed at the point where the rabbit's pancreatic juice was pouring into the stomach: There was clearly some link between the juice and the emulsification of the fats.

Two years later he discovered the glycogenic function of the liver, which injects glucose into the blood. It was this discovery that led to Bernard's greatest contribution to the sum of human knowledge, because he saw that the function of the liver and the pancreas (and perhaps other systems, too) was to maintain the body's equilibrium. He summed up his research: "All the vital mechanisms, however varied they may be, have only one object, that of preserving constant the conditions of life in the inner environment." Follow-up research on the pancreas led an English researcher, William Bayliss, to coin the phrase that Cannon would use as his book title: "the wisdom of the body."

Not everybody was happy with Bernard's work, especially when he designed an oven in which to cook animals alive. An American doctor, Francis Donaldson, who attended Bernard's lectures in 1851, wrote: "It was curious to see walking about the amphitheater of the College of France dogs and rabbits, unconscious contributors to science, with five or six orifices in their bodies from which at a moment's warning, there could be produced any secretion of the body, including that of the several salivary glands, the stomach, the liver, and the pancreas."

Bernard was well aware of public opposition to vivisection but defended it: "The science of life is like a superb salon resplendent with light which one can enter only through a long and ghastly kitchen." Alas, Bernard's wife was unable to take the heat. After leaving him in 1869 she went in search of the antivivisection activists to whom she had been sending regular contributions.

She did not have far to go. In Paris a fanatical young vegetarian Englishwoman named Anna Kingsford, the owner of the Lady's Own Paper, had come to France to study medicine. Kingsford became well-known at the medical school for refusing to let her professors vivisect during the lessons she attended and for demonstrating against the practice. Kingsford's lecture halls were close to Bernard's labs, and she became so obsessed by his work that she set about directing all her energies toward killing him with thought waves. Bernard died only a few weeks after she had begun to concentrate her mental energies on him, convincing her that she had been the instrument of divine will. Kingsford also claimed to have been responsible for the death of another vivisector, Paul Bert. However, her efforts to do the same to Louis Pasteur failed.

Legislation to protect animals from ill treatment took a long time to reach the statute books, even in England, where the first such laws were passed. In 1800 the first bill to outlaw bull-baiting had ignominiously failed in its passage through the Houses of Parliament, opposed by George Canning (later prime minister), who claimed that bull-baiting "inspired courage and produced a nobleness of sentiment and elevation of mind....Putting a stop to bull-baiting was legislating against the spirit and genius of almost every country and age." However, in 1821 Dick Martin, MP for Galway, forced through a bill to protect horses and cattle against ill treatment. It was the first law of its kind in any country. In 1824 the Society for the Prevention of Cruelty to Animals was formed at the unfortunately named Old Slaughter Coffee House in London. The publication in 1859 of Darwin's Origin of Species seemed to strengthen the relationship between humans and animals and support the animal-defense argument. In 1876 the Victoria Street Society against Vivisection was formed with Lord Shaftesbury as chairman. The same year a bill was passed to prevent the vivisection of dogs, cats, mules, horses and asses. By the late nineteenth century the animal-defense movement had spread throughout the Western world and given birth to hundreds of local groups known as Humane Societies, in spite of the fact that the name more properly belonged to earlier humanitarian work of an entirely different nature.

The Royal Humane Society was founded in London in 1774 largely as the result of the efforts of Dr. William Hawes to promote knowledge of artificial-respiration techniques. Hawes based his ideas on the translation of a paper by the Amsterdam Society for the Recovery of the Apparently Drowned. The society had been founded in 1767 after several cases of successful resuscitation had been reported in Switzerland. In the nineteenth century interest in drowning became acute with the spectacular increase in cargo tonnage and passenger traffic on the high seas following the spread of industrialization. As the number of ships rose so did the number of shipwrecks and deaths.

From time to time the Royal Humane Society awarded a gold medallion for outstanding feats of bravery, and in 1838 the recipient hit the front pages because she was a slightly built, twenty-two-year-old woman. On the night of February 6, a paddle steamer, the Forfarshire, battling through a gale en route from Hull to Dundee with a full cargo and sixty-three passengers, sprang a leak in her boiler. The captain decided to take shelter among the Farne Islands off the coast of Northumberland. During this maneuver the ship hit the rocks and broke in two, and all but thirteen passengers and crew were drowned. The survivors, exposed to the full force of the storm, included a mother and two children. Overnight the two children and an adult died. At five o'clock the next morning Grace Darling, daughter of the local lighthouse keeper, caught sight of the wreck and the survivors clinging to the rocks. Grace and her father rowed to the rescue, struggling through mountainous seas in a small open boat. The drama was reported in the newspapers and Grace became an instant national hero. Alas, she was to die four years later from tuberculosis. Meantime she had inspired the public to offer massive financial and political support for the eventual establishment of the Royal National Lifeboat Institution, in 1854.

That same year came another highly publicized loss at sea. The USS San Francisco, an American troopship carrying hundreds of soldiers, foundered in an Atlantic hurricane. The secretary of the navy sent for Matthew Maury, the only man in America who would be able to tell where to look for survivors. After studying his wind and current charts, Maury pinpointed the spot and the survivors were found in the water.

Maury was the fourth son of a Huguenot-English family long settled in Virginia (his grandfather had taught Thomas Jefferson), and he had joined the U.S. Navy in 1825. It was during a voyage to South America that Maury became interested in finding faster ways to cross the ocean. On his return in 1834 he took leave and wrote his first work on navigation. In 1839 Maury published a series of articles in the Southern Literary Messenger, one of which advocated the establishment of a naval school. It would become the U.S. Naval Academy at Annapolis.

In 1847 Maury issued the first of several charts and then, in 1851, Explanations and Sailing Directions to Accompany the Wind and Current Charts. At the instigation of the U.S. government, copies of the charts and Sailing Directions were distributed free to all masters of vessels on the understanding that they would keep a full log of journeys and forward these logs to Maury in Washington. Logs were to include temperature of air and water, direction of wind and currents, and air pressure. Captains were also required to throw overboard (at given intervals) a bottle containing a piece of paper carrying the ship's position and the date. They were also to pick up any such bottles they came across and note all details in their logs. In return for these services masters would receive free copies of Maury's further work. Over eight years Maury collected and processed data on many millions of observations, as a result of which he was able to identify faster sailing routes. One ship's master following Maury's suggested route from New York to Rio de Janeiro halved the usual journey time. It was reckoned that Maury's "Path-of-Minimum-Time" routes saved American shipping forty million dollars a year.
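Maury's method was, in modern terms, crowd-sourced data aggregation: pool every ship's logged passage times and recommend the route with the best average. A minimal sketch of that idea, with route names and day counts invented purely for illustration:

```python
from collections import defaultdict
from statistics import mean

# Each entry is (route name, passage time in days) as reported in a
# ship's log. Routes and numbers are invented for illustration only.
logs = [
    ("coastal", 55), ("coastal", 60), ("coastal", 58),
    ("offshore", 38), ("offshore", 42), ("offshore", 40),
]

# Pool the observations per route, as Maury pooled thousands of logs.
times = defaultdict(list)
for route, days in logs:
    times[route].append(days)

# Recommend the route with the lowest average passage time.
best = min(times, key=lambda r: mean(times[r]))
print(best)  # -> offshore
```

With millions of real observations the averages become reliable enough to publish as standing sailing directions, which is essentially what Maury's charts were.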

In 1853 Maury crowned his career when he persuaded sixteen countries (among them the United States, Britain, Belgium, Holland, Russia, France, Norway, Denmark and Portugal) to meet in Brussels for the first International Meteorological Congress "to plan an uniform system of meteorological observation at sea, and to agree a plan for the observation of the winds and currents of the oceans with a view to improving navigation and to enrich our knowledge of the laws which govern those elements." Not long after he had returned from Brussels, Maury received a letter from a retired paper-manufacturing millionaire named Cyrus W. Field, who was seeking advice on the ideal route for a transatlantic submarine telegraph cable.

Submarine cables had already been laid successfully in the relatively shallow waters between England and Holland, and between Scotland and Ireland, but the Atlantic represented a formidable challenge. Field had managed to get a favorable charter from the British government for a fifty-year monopoly on any cable laid between Newfoundland and Ireland. The British also offered to provide a cable-laying ship as well as a generous advance on income from telegraph messages. Field then spent two years laying a cable between Newfoundland and the North American mainland (stockholders in the company included such luminaries as Lady Byron and Thackeray). When the link was completed Field wrote to Maury to solicit his views on the best route out of Newfoundland toward Europe.

Maury reported that soundings revealed a shallow "telegraph plateau" across much of the North Atlantic, and in 1857 work began on laying the cable. After a few hundred miles had been laid the cable snapped. Three more attempts were made and on August 5, 1858, 1,850 miles of copper wire connected Valencia, Ireland, with Trinity Bay, Newfoundland; and traffic began with an inaugural message from Queen Victoria to President Buchanan. At the celebration dinner in New York Field said modestly: "Maury furnished the brains, England gave the money, and I did the work." Then the cable failed again. In 1865 they found the parted ends, spliced them and the work was done. The U.S. Congress voted Field a gold medal.

Field had also written to the man whose work had inspired the whole venture: Samuel Morse, inventor of the most successful form of telegraph. Morse's advantages over other telegraphers were his key and the Morse Code, which he demonstrated before Congress in 1844. The idea had come to him in the autumn of 1832 during a voyage back to the United States from France. Morse first learned what he needed to know about the principles of electricity and one of his friends, Alfred Vail, provided the finance and hardware (Vail's father had a machine shop in New Jersey). Vail also suggested what would later become known as the Morse Code.

At this time Morse was a well-known artist, professor of art at New York University, and had just spent three years in Europe studying and painting. Morse was a strange man given to apocalyptic patriotic views. He had been brought up as a strict Calvinist by his father Jedidiah, America's foremost geography scholar, who had earlier led the Old Calvinist "Great Awakening" crusade against liberal theology. Like his father, Morse looked forward to the triumph of American culture and believed that only an elite could lead the country to salvation. Morse was also extremely xenophobic. At one point he painted a picture of the pope conspiring to arm American Catholics, provoke disorder, rig elections and elect foreigners to public office. Morse also helped publish a book about Maria Monk, a woman who claimed to have been a nun in Montreal, where she also claimed to have witnessed unnatural sexual acts performed by clergy and to have seen crypts filled with the corpses of illegitimate children. In the end it was revealed that Monk (rumored to have had a romantic affair with Morse) had escaped from a mental institution.

Morse believed that art was a tool placed in his hands by God to be used to save Protestant America. He believed that the millennium was imminent, and that when it came America would carry the empire of peace to the world. It was therefore essential to prepare American art for the great day. Morse founded the National Academy of Arts and Design in 1826 and was its president until 1845. The aim of the academy was to foster American artistic talent so that American genius could take its rightful place in the world and inculcate true Protestant virtues in other Americans.

In 1829 Morse decided to visit Europe to study artistic masterworks in preparation for what he hoped would be his greatest triumph, the commission to paint the four remaining murals for the Rotunda of the Capitol Building in Washington D.C. To this end, while in Paris in 1831, he painted the giant Gallery of the Louvre. The painting reproduced in miniature thirty-eight Louvre masterpieces. Morse's aim was to show that while the classical past was worthy of study it should not be the subject of slavish emulation by American artists, who, like the artist shown in the Louvre painting (Morse himself), could learn from the Old Masters and then develop their own distinctively American style. On his return the Louvre painting was put on exhibition in New York and was a disastrous flop. The commission for the Rotunda murals went to other artists. Morse turned to the telegraph as an alternative tool with which to make Protestant America great. Communications technology would be an instrument of Divine Will, redeeming America by transmitting messages of peace and love. At his demonstration in Congress in 1844 Morse's first transmitted message echoed these beliefs: "What hath God wrought!"

Morse had learned his art at the feet of Washington Allston, the most completely Romantic American painter, whom he had met in Boston in 1810 and with whom he became life-long friends. Only a year after their meeting Allston inspired Morse to attempt the first of his grand historical American scenes, Landing of the Pilgrims at Plymouth. That same year Morse joined Allston and his wife on his first trip to Europe. Allston was a good-looking Harvard-educated gentleman from South Carolina who on the death of his step-father in 1801 had sold the family property to finance a career in painting. On his earlier visit to London Allston had studied with Benjamin West, the president of the Royal Academy, and then in 1804 moved on via Paris to Rome. There he met Washington Irving, who later wrote: "I do not think I have ever been more completely captivated on a first acquaintance. He was of a light and graceful form, with large blue eyes, and black, silken hair waving and curling around a pale, expressive countenance. A young man's intimacy took place immediately between us, and we were much together during my brief sojourn at Rome....We visited together some of the finest collections of paintings, and he taught me how to visit them to the most advantage, guiding me always to the masterpieces, and passing by the others without notice." Allston's Italian Landscape shows the profound effect of Italy on his work. His fresh New England eye was overwhelmed by the light, the color, the ancient ruins, the landscape dotted with hilltop villages, the rich mingling of Renaissance, medieval and classical architecture and the pastoral nature of Italian peasant life.

In 1805 Allston met and painted the English Romantic poet Samuel Taylor Coleridge, whom he would later recognize as his greatest intellectual mentor. At the time of their meeting Coleridge was suffering the aftereffects of his failure to give up opium. Aged thirty-three, with "Kubla Khan" and the "Ancient Mariner" poems behind him, Coleridge was already famous. He was also an alcoholic, deeply in debt, unhappily married with three children, had failed in a venture to set up a utopian settlement on the banks of the Susquehanna in Pennsylvania, and was an extreme hypochondriac (he coined the word "psychosomatic").

It was partly to try to wean himself from opium (and his penchant for taking it in brandy), and partly to get away from his wife, that in 1804 Coleridge had run away to Malta. There, thanks to an influential acquaintance, he landed the job of secretary to the British civil commissioner, Alexander Ball. The post also included free food and accommodation in the commissioner's palace in Valetta, the island's capital. Coleridge's workload was light and consisted principally of rewriting Ball's dispatches to London. Although Coleridge complained incessantly about his health, the nightmare of withdrawal symptoms, the dull company and his inability to write new poems, he enjoyed the climate and the countryside and managed to produce some of his best prose. He also began to feel the first stirrings of mortality: "I had felt the Truth; but never saw it before clearly; it came upon me at Malta, under the melancholy dreadful feeling of finding myself to be Man, by a distinct division from Boyhood, Youth, 'Young Man.' Dreadful was the feeling -- before that, life had flown on so that I had always been a Boy, as it were -- and this sensation had blended in all my conduct." When his friends William and Mary Wordsworth saw him on his return to England, they were to remark that he had changed for the worse.

Coleridge saw from the dispatches he was editing that he had arrived in Malta at a critical time. Ball was arguing the strategic importance of the island now that Napoleon had given up Louisiana, lost Santo Domingo and would inevitably turn his attention to the Mediterranean. Ball also suggested to the British government that Algiers, Tunis and Tripoli were ripe for colonization and "they are capable of growing all Our colonial produce." He also argued that though both Russia and France wanted Malta they should not be allowed to take it. At the time, the island was a hotbed of intrigue. The Maltese were agitating for independence, Russian and French spies were imagined to be everywhere and there was an American naval squadron on station commanded by Commodore Edward Preble. Among Preble's officers was the young Stephen Decatur, leader of the daring and successful 1804 raid on Tripoli harbor to destroy the American frigate Philadelphia, which had run aground and been captured during the American-Tripolitanian War. During a brief trip to Sicily Coleridge met and dined with both intrepid Americans, and for years afterward regaled friends with tales of their exploits.

Coleridge's employer, Rear Admiral Alexander Ball, had joined the British Navy at the age of twelve. This event, he told Coleridge, had been inspired by reading Robinson Crusoe. Ball had the air more of an academic than a sailor, bookish and thoughtful. After serving in the Caribbean, America and Newfoundland, in 1783 he took a year off and went to France to study the language. At one point there, during a visit to St. Omer, he met another young captain with whom his fate was to be bound up, although on this occasion each expected the other to make the required formal call, so neither did. Ball then served in the English Channel, went again to Newfoundland, was stationed off the French Coast and in 1798 was posted to the Mediterranean where he was to meet the young captain with whom he had failed to exchange courtesies in St. Omer. At this time Britain was expecting to be invaded by Napoleon, and much of the British fleet was patrolling outside French harbors in the English Channel and on the French Atlantic coast. Hearing a rumor that Napoleon was assembling a Mediterranean fleet in Toulon, the British also sent a fleet to blockade that port.

In April the Toulon blockade fleet was joined by a small squadron under the command of the man Ball had met in France, Captain (by now Admiral) Horatio Nelson, the fastest-rising star in the British Navy. No sooner had Nelson's squadron arrived off Toulon than it was blown south and scattered by a ferocious gale. Off the coast of Sardinia Nelson's flagship lost its mainmast as well as much of its rigging and was being driven by mountainous seas toward a rocky coast. At the last minute Ball brought his own ship alongside, took Nelson's flagship in tow in spite of Nelson's orders to the contrary and saved the day. The captain of the flagship later reported that Ball used his speaking trumpet to call "with great solemnity and without the least disturbance of temper...'I feel confident that I can bring you in safe. I therefore must not, and by the help of Almighty God will not leave you.'" After the rescue Ball was mentioned in dispatches and the two men remained close friends for the rest of Nelson's brief life.

Meanwhile Napoleon's fleet had taken advantage of the situation to slip out of Toulon and head for Egypt. On the way there Napoleon sent a detachment to capture Malta. Ball was now dispatched (with Nelson's influential approval) to retake the island. After a two-year siege he did so and was appointed commissioner. Nelson himself caught up with Napoleon at the Battle of the Nile, where he defeated the French, and then sailed back to Naples for repairs. In Naples Nelson would fall in love for the fifth time (after the daughter of Quebec's provost marshal, a clergyman's daughter, the wife of the commissioner of Antigua, and Fanny Nisbet, niece of the president of the St. Nevis Council). In 1787, Fanny had become Nelson's wife, to the chagrin of his colleagues, one of whom remarked that Fanny had two remarkable attributes: her good complexion and a "remarkable absence of intellectual endowment."

Ten years after the marriage the heroic Nelson arrived in Naples. He was Europe's unlikeliest heart-throb: he was thirty-eight, short, plump, white-haired, with a squeaky voice and Norfolk accent, blind in the right eye and with a stump for a left arm. He was in the habit of introducing himself: "I am Lord Nelson and this [gesturing to his good arm] is his fin."

The object of Nelson's Neapolitan infatuation (and later, while her husband was still alive, his mistress and later still mother of his two children) was a thirty-three-year-old married Englishwoman with a hidden past and the habit of wearing no underwear. Lady Emma Hamilton was the wife of Sir William Hamilton, the sixty-seven-year-old English minister to the Court of Naples. In 1785 Hamilton had, so to speak, taken Emma over from his impecunious nephew Greville to clear the decks for the latter's impending marriage into a rich family. Emma was not told of the arrangement, merely that she would be living in Naples for six months until Greville could return for her. Nine months later, realizing that Greville was not coming, Emma gave in to the widower Hamilton and they became lovers.

To please Emma, Hamilton arranged singing and music lessons, trips to the newly excavated ruins of Pompeii and Herculaneum, rides up Vesuvius and a series of conversazioni at which Emma was introduced to the local nobility and the Neapolitan royal family. Soon she had become famous for her "entertainments," in which she posed, diaphanously dressed, in various classical tableaux: Agrippina scattering the ashes of Germanicus, Orestes sacrificing his sister, Oedipus blinded, and (very popular) the Bacchante "surprised while bathing." In 1791 Hamilton returned briefly to England to marry Emma. The couple then returned to Naples where Hamilton continued "collecting" antiques from archeological sites to sell in London, where his Greek and Roman vases would inspire the potter Josiah Wedgwood and help kick off the Neoclassical movement.

In the light of Emma's past, her marriage to Hamilton made quite a furor. Emma had begun life as Emma Lyon, daughter of a humble smith. Brought up in Wales, at the age of twelve she was already employed as a nursemaid. A year later Emma was in London, maid to Mrs. Kelly, a well-known madame, and soon thereafter became one of Mrs. Kelly's "girls." By the age of sixteen she was living with a "protector," Harry Featherstonehaugh, who then passed her on to William Hamilton's nephew, Greville.

In London Emma was also rumored to have been employed as a "maiden" at Dr. James Graham's wildly fashionable Temple of Health, where patrons, including the duchess of Devonshire, were given electric shocks and served by transparently dressed "attendants." Graham's Temple was an elaborately decorated Adam house in The Adelphi, London. The fencing master to George IV later recalled "carriages drawing up next to the door of this modern Paphos, with crowds of gaping sparks on either side, to discover who were the visitors, but the ladies' faces were covered, all going incognito. At the door stood two gigantic porters, with each a long staff, with ornamental silver head, like those borne by parish beadles, and wearing superb liveries, with large, gold-laced cocked hats, each was near seven feet in height, and retained to keep the entrance clear." Entering under an enormous gold star, Graham's clients found themselves in lavishly decorated rooms with stained-glass windows. Music played and perfume drifted on the air. In these salubrious surroundings the elite were treated with medications including Nervous Aetherial Balsam, Electrical Aether and Imperial Pills. The star of the show had been made for Graham by an expert tinsmith. It was the Magnetico-Electrico Celestial Bed, on which childless couples would receive electrical shocks while coupling. The shocks were said to ensure immediate conception.

Graham may have derived his interest in electricity from conversations with Benjamin Franklin in Paris in 1779 when he met Franklin during the latter's time as U.S. minister to France. Electricity was the subject of much speculative experimentation at the time. In 1720 the Englishman Stephen Grey had electrified a small, suspended boy. In 1743 Johann Kruger, professor at Helmstadt University, had suggested that passing an "effluvium" of electricity through the body might be good for the health. Christian Ratzenstein claimed electric shocks raised the pulse rate and increased the circulation of the blood. Samuel Quellmaltz said he had used electricity to cure paralysis of the hand well enough for his patient to play the klavier. Even such respectable persons as John Wesley recommended electric treatment for nervous disorders. In 1777 an electrical machine was ordered for St Bartholomew's Hospital in London. Graham is often described as a quack, but in an age when much medicine was still guesswork and mumbo-jumbo, he may have been no worse than anybody else. Besides, he had received his training at Edinburgh University, site of the best medical school in Britain, where he attended lectures by the great Joseph Black.

Black was a high-flier who had made his international reputation by the age of twenty-seven, when in 1755 he published a paper on an experiment in which he had heated limestone and caused it to become caustic quicklime. The value of this particular bit of research was that all contemporary treatments for kidney stone, common at the time, involved caustics. Up to then it had been thought that quicklime acquired its causticity from the fire. Black proved otherwise and changed the course of chemistry. He found that quicklime was made when a gas in the limestone was driven out by the heat, and that this gas could be recombined with quicklime to form limestone once again. Moreover, this combining and recombining process could be continued without limit. Each time the volume and weight of the relevant components were exactly the same.
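In modern notation Black's result is the decomposition of limestone into quicklime and a gas (his "fixed air," our carbon dioxide), with mass conserved on every cycle of separation and recombination. A small sketch using rounded modern molar masses, figures Black himself did not have:

```python
# Black's limestone cycle in modern terms: CaCO3 -> CaO + CO2 and the
# reverse recombination. Molar masses are rounded modern values; Black
# knew neither the formulas nor these numbers, only the fixed ratios.
M_CACO3 = 100.09  # limestone, g/mol
M_CAO = 56.08     # quicklime, g/mol
M_CO2 = 44.01     # "fixed air" (carbon dioxide), g/mol

def decompose(mass_limestone):
    """Heating limestone: return (mass of quicklime, mass of gas)."""
    moles = mass_limestone / M_CACO3
    return moles * M_CAO, moles * M_CO2

lime, gas = decompose(100.0)
print(round(lime, 2), round(gas, 2))  # -> 56.03 43.97
# Recombining restores the original mass, as Black observed:
assert abs((lime + gas) - 100.0) < 1e-9
```

The fixed proportions and the exact recovery of weight are precisely the regularities Black's balance revealed, which is why the experiment is often cited as a beginning of quantitative chemistry.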

The other world-changing discovery by Black came as a result of his investigation into the process of distillation. The Scottish whisky manufacturers' market had expanded rapidly since the union of Scotland with England in 1707, and distillers were keen to find ways of getting more whisky for less cost, so Black was concentrating his researches on finding ways to save fuel. His experiments on the amount of heat required to boil off liquids revealed the existence of latent heat, which explained the extremely high temperature of steam and why the distillers needed such copious quantities of cold water to condense it.
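The scale of the effect Black uncovered is easy to check with modern figures: vaporizing water absorbs several times more heat than warming it from room temperature to the boil, which is why the stills consumed so much fuel and the condensers so much cold water. A rough sketch, using modern SI values rather than anything Black measured:

```python
# Compare the "sensible" heat needed to warm water to its boiling
# point with the latent heat needed to actually vaporize it.
C_WATER = 4186.0   # specific heat of water, J/(kg*K), modern value
L_VAP = 2.26e6     # latent heat of vaporization, J/kg, modern value

def sensible_heat(mass_kg, delta_t):
    """Heat to warm liquid water by delta_t kelvin."""
    return mass_kg * C_WATER * delta_t

def latent_heat(mass_kg):
    """Heat to vaporize water already at the boil."""
    return mass_kg * L_VAP

warm = sensible_heat(1.0, 80.0)  # warming 1 kg from 20 C to 100 C
boil = latent_heat(1.0)          # then boiling that 1 kg away
print(round(boil / warm, 1))     # -> 6.7
```

All that hidden heat reappears when the steam condenses, which is why the distillers' condensers needed such copious cold water.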

The latent heat discovery also showed James Watt (who worked at the University of Glasgow when Black was teaching there) why the Newcomen steam-driven pump was so inefficient. In the pump (one of which Watt was repairing at the time) steam entered a cylinder kept ice-cold with a water jacket. This caused the steam entering the cylinder immediately to condense, creating a partial vacuum and allowing air pressure to force the cylinder piston down. The piston rod was attached to one end of a pivoted beam set above the cylinder. As the piston moved down the other end of the beam moved up, lifting a rod attached to a suction pump. The problem was that the high temperature of the steam was heating the cylinder too much and weakening the subsequent condensing process on each stroke until the cylinder became so hot that no condensation would take place and the pumping action would stop. Black showed Watt that the cylinder would have to be linked to a separate, chilled condensing chamber (immersed in cold water) so that the scaldingly hot steam could condense there and not heat the cylinder while doing so. This separate condenser was the secret of Watt's success.
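The logic of Watt's fix can be caricatured in a toy per-stroke energy budget: in the Newcomen layout, part of each charge of steam goes to re-warming a cylinder that has just been chilled for condensation, while the separate condenser keeps the cylinder permanently hot and does all the condensing elsewhere. The figures below are illustrative assumptions only, not historical measurements:

```python
# Toy per-stroke energy budget contrasting the two engine layouts.
# Both numbers are invented for illustration, not measured values.
STEAM_HEAT = 100.0      # heat delivered by steam per stroke (arbitrary units)
CYLINDER_REHEAT = 60.0  # heat wasted re-warming a chilled cylinder

def useful_heat_per_stroke(separate_condenser):
    """Heat left to do work after any cylinder re-warming loss."""
    waste = 0.0 if separate_condenser else CYLINDER_REHEAT
    return STEAM_HEAT - waste

print(useful_heat_per_stroke(False))  # Newcomen layout -> 40.0
print(useful_heat_per_stroke(True))   # Watt layout -> 100.0
```

However crude, the model captures why the Newcomen engine's efficiency fell stroke by stroke as the cylinder heated, and why keeping condensation out of the working cylinder was the decisive improvement.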

Through his association with Black in 1769 Watt was able to carry out some of his key experiments at Kinneil House, outside Edinburgh. This was the ducal seat of the Hamilton family and leased by Dr. John Roebuck, a successful entrepreneur and ex-pupil of Black's. Roebuck owned a coal mine that supplied his Carron ironworks, and since the coal mine was subject to flooding, his hope was that a successful Watt pump might save it. In the event the mine flooded early and drove Roebuck to bankruptcy, but not before he had helped finance Watt's research by paying off his debts in exchange for a percentage of Watt's patent rights on the steam pump. When bankruptcy intervened in 1772 Roebuck sold his share of Watt's patent to Matthew Boulton, a shoe-buckle manufacturer in Birmingham, and Watt finally met the partner he needed. Together Boulton and Watt turned the steam pump into the engine of the Industrial Revolution.

Meanwhile, Roebuck had already made his own contribution to industry with a new process for the manufacture of sulphuric acid. Not long after graduation he had invented an improvement in refining precious metals. This process used sulphuric acid, and in 1749 Roebuck set up a new sulphuric acid manufacturing facility at Prestonpans, near Edinburgh. The earlier manufacturing technique had involved burning sulphur and niter over water and condensing the acid from the fumes in glass globes. Roebuck replaced the glass globes with small lead chambers and quartered the manufacturing costs.


The market for sulphuric acid was growing steadily as the textile industry became more mechanized. By 1760 John Kay's flying shuttle was in general use, doubling the output of weft threads. Nearly ten years later James Hargreaves's spinning jenny multiplied the number of weft thread spindles that could be worked by one spinner. Richard Arkwright's 1769 water frame drew out the weft thread on mechanically rotating cylinders, and in 1779 Samuel Crompton's mule combined the jenny and the frame to complete the mechanization of the entire process. The mule produced thread fine enough for the best muslin cottons. By this time the market for cotton was booming, quadrupling raw cotton imports between 1791 and 1800.

The growth of cotton manufacture triggered a consequent rise in the demand for bleach. Before Roebuck's new system, traditional bleaching of cloth (to rid it of its natural gray-yellow color) had been done in bleachfields. Between March and September the cloth to be bleached was stretched out, doused with fermented milk and left for six weeks to whiten. Roebuck's cheap, diluted sulphuric acid would do the same job in twenty-four hours. In 1785 the French chemist C. L. Berthollet had discovered that chlorine gas was a powerful bleaching agent, and James Watt introduced its use in Scotland. Cloth to be bleached was hung in gas-filled rooms, where bleachers sometimes died from the effects of breathing the gas. Then in 1799 Charles Tennant passed the gas over slaked lime and produced the first safe, cheap bleaching powder.

As an almost immediate result white paper became common. Before this, as can be seen from the gray tinge of English paper and the muddy color of early American documents, the color of paper depended on the rags from which it was made. Paper was made by pounding rags to a pulp, leaving them in water to ferment, pounding them again, then straining out the water on a vibrating wire mesh belt, and finally drying the pulp by rolling it between felt cloths and heated rollers. The first machine version of this process was invented by a Frenchman, Louis Robert, in 1799. The width of the paper made by his machine was that required for wallpaper, the fastest-growing furnishing accessory in Europe at the time. The Paris Journal des Inventions reported: "For the view, the cleanliness, the freshness and the elegance, these papers are preferable to the rich materials of the past; they do not allow any access to insects, and when they are varnished, they retain all the vivacity and charm of their colours for a long time. Finally, they can be changed very frequently...making us thus inclined to renovate our homes, cleaning them more often and making them gayer and more attractive."

When Robert's paper-making venture failed for lack of financial support in France he sold the patent to his erstwhile employer, Léger Didot. Didot's English brother-in-law John Gamble then took it to Britain, where the paper-making brothers Fourdrinier set up the first fully operational version of Robert's process, at their mill in Frogmore, near London, in 1808. In 1836, when the British government repealed the high wallpaper tax, mass production began. In 1839 Harold Potter of the Darwen wallpaper factory perfected a power-driven roller printer. By 1850 machines were able to print perfectly registered patterns in eight colors on fifty-four thousand feet of paper a day. Between 1834 and 1860 wallpaper output increased from one million to nine million yards. Prices dropped like a stone. What had been a luxury item was now available to all but the very poor.

An Englishman named William Morris used the new manufacturing and printing techniques to put wallpaper into the homes of the industrial middle class for the first time. Morris was a well-off Oxford graduate influenced like others of his age by the social conditions of industrial-age Britain. In the mid nineteenth century government surveys were beginning to reveal the scale of social problems that had been created by the rapid industrialization of the previous decades and exacerbated by overcrowded living conditions and the widening gulf between the rich factory owners and their disfranchised, poverty-stricken workers. Morris and his friends turned away from the horrors of the cities to medieval art and architecture. For them the Middle Ages represented an age of innocence when the craftsman had been an independent, creative spirit, free to take his skills wherever he chose, protected from exploitation by professional guilds.

Morris led the new Arts and Crafts movement dedicated to bringing this medieval sweetness and light to urban homes. From 1877 his company showrooms in Oxford Street displayed "traditional" furniture, tapestries and wallpaper with simple floral patterns based on medieval designs. The style revolutionized public taste. One social commentator wrote: "It may be questioned whether the decorative treatment of the walls should give place to pictures in rooms which are occupied from day to day. If we imagine the tired man of business returning to his suburban home...it can hardly be supposed that he will be in a position to make the special mental effort involved in inspecting his pictures; but supposing him to be the happy possessor of a harmoniously decorated room, he will at once be soothed and charmed by its very atmosphere."

Morris took his artistic views into his political life. The utopian spirit of his art was mirrored by socialist beliefs that drove him to what he called "holy warfare" against capitalism. From 1877 he gave a series of lectures to working men in which he attacked the values of Victorian society. In 1883 he joined the Democratic Federation, inspired by Marx's views on the alienation of industrial workers. Morris sold the federation's weekly paper, Justice, on the streets, and joined its executive committee together with Marx's daughter, Eleanor. In 1884 he broke with the federation (when it proposed to become an orthodox political party) and set up the Socialist League. When this in turn was infiltrated by anarchists he broke away again and founded his own Socialist Society in Hammersmith, London.

At the Society's musical evenings, where socialist songs were sung under the direction of composer Gustav Holst, two society members played piano duets. One of them was a twenty-seven-year-old would-be journalist named George Bernard Shaw, his ragged cuffs trimmed with scissors, wearing shabby, cracked boots and an ancient coat and sporting a red beard and an Irish brogue. The other, one of Morris's ex-colleagues on the Democratic Federation executive, was the equally charismatic Annie Besant.


Besant was by this time a well-known activist, having been involved fifteen years earlier in one of the most widely publicized trials of the nineteenth century. In 1877 she and the National Secular Society's Charles Bradlaugh had been sentenced to six months in prison and a fine for republishing a forty-year-old pamphlet by an American author, Charles Knowlton, titled "Fruits of Philosophy." The pamphlet contained detailed instructions on contraception for young married couples. During the trial (for obscene publication), Besant and Bradlaugh spoke out eloquently about the new threat of overpopulation that would result from slowly improving living conditions and sanitation; on the overcrowded conditions in slums, rife with immorality and incest; on the effects of pauperism that led to a death rate of one in three infants; and on the need for freedom from prosecution for those publicizing the facts about contraception. The sentence passed on Besant and Bradlaugh was quashed by the judge before the two defendants had left the dock. Besant had become the first woman ever to speak out publicly to advocate contraception and get away with it.

In 1889 Besant became a Theosophist, and by 1891 she was effectively running the Theosophist Society. Theosophists rejected material things, encouraged vegetarianism, sought to bring about the universal brotherhood of all races, investigated latent psychic powers and studied ancient and modern religions and philosophies. In pursuit of all these aims, in 1893 Besant visited India and set up the Central Hindu College for the Study of Comparative Religion in Benares. Earlier that same year she had visited the Chicago Exposition and together with Indian Theosophists had participated in meetings of the "Parliament of Religions." They took the meetings by storm, with four thousand attendees turning up at the last session to hear them. The Theosophists' vegetarian message was greeted sympathetically in the United States. The first vegetarian society had been formed in Philadelphia in 1850. In 1858 Dr. Caleb Jackson founded a health center at Danville, New York, based on vegetarian principles and including cold water treatments.


In 1865 Danville was visited by a Seventh Day Adventist named Ellen White. Two years earlier she had had a vision in which she was told to eat only two meals a day, avoid meats, cake, lard or spices, and consume only bread, fruits, vegetables and water. In Danville Mrs. White had a further vision. This time the message she received was to set up another Danville. A year later she and her Adventist colleagues bought a farm on seven acres of land just outside a small Michigan town and opened their health center.

The center's rules were strict: no levity, no playing checkers, lots of oatmeal pudding and religion and cold water cures. The diet excluded tea and tobacco. Soon after the center opened it was in financial difficulties. The Adventists looked around for a new superintendent. They chose a young man who lived in the same Michigan town and had at the age of fourteen begun typesetting for the Adventist printing house. The church elders sponsored him through a course at Bellevue Medical College in New York, and in 1875 he graduated and took over what was now known as the Western Health Reform Institute. The first thing he did was to rename it the Medical and Surgical Sanitarium. The new superintendent had a natural eye for publicity. He dressed entirely in white, seemed to need no sleep, and was seen frequently with a cockatoo perched on his shoulder. He fostered vegetarian diets and set up the Three-Quarter-of-a-Century Club (healthy eating promoted longevity) and, in 1914, the Race Betterment Foundation. At the sanitarium he developed courses in nursing, physical education and home economics. For the patients he introduced room service, a gymnasium, a string orchestra in the dining room and wheelchair social events on the front lawn.

There was one aspect of the diet that still troubled him: "the half-cooked, pasty, dyspepsia-producing breakfast mush." To remedy this he began experiments in the sanitarium kitchen. Some time in 1894 he boiled and steamed wheat into a paste, flattened it between rollers, scraped the pieces emerging from the rollers and baked them crisp. In March 1895 at the General Conference of Adventists he presented his invention. It changed life in the Adventist community and then the world. The superintendent of the Battle Creek Sanitarium was named John H. Kellogg, and his invention was named "cornflakes."

Copyright © 1999 by London Writers

Introduction

Change comes so fast these days that the reaction of the average person recalls the depressive who takes some time off work and heads for the beach. A couple of days later his psychiatrist gets a postcard from him. The message on the card reads: "Having a wonderful time. Why?"

Innovation is so often surprising and unexpected because the process by which new ideas emerge is serendipitous and interactive. Even those directly involved may be unaware of the outcome of their work. How, for instance, could a nineteenth-century perfume-spray manufacturer and the chemist who discovered how to crack gasoline from oil have foreseen that their products would come together to create the carburetor? In the 1880s, without the accidental spillage of some of the recently invented artificial colorant onto a petri-dish culture that revealed to a German researcher named Ehrlich that the dye preferentially killed certain bacilli, would Ehrlich have become the first chemotherapist? If the Romantic movement's concept of "nature-philosophy" had not suggested that nature evolves through the reconciliation of opposing forces, would Oersted have sought to "reconcile" electricity and magnetism and discovered the electromagnetic force that made possible modern telecommunications?

Small wonder, then, that the man and woman in the street are left behind in all this, if the researchers themselves don't get the point. But given the conditions under which science and technology work, how else could it be? At last count there were more than twenty thousand different disciplines, each of them staffed by researchers straining to replace what they produced yesterday.

All of them follow the reductionist approach of taking things apart to see how they work.

Over millennia, this way of doing things has tended to subdivide knowledge into smaller and more specialist segments. For example, in the past hundred years or so, the ancient discipline of botany has fragmented and diversified to become biology, organic chemistry, histology, embryology, evolutionary biology, physiology, cytology, pathology, bacteriology, urology, ecology, population genetics and zoology.

There is no reason to suppose that this process of proliferation and fragmentation will lessen or cease. It is at the heart of what, since Darwin's time, has been called "progress." If we live today in the best of all possible materialist worlds, it is because of the tremendous strides made by specialist research that have given us everything from more absorbent diapers to linear accelerators. We in the technologically advanced nations are healthier, wealthier, more mobile, better-informed individuals than ever before in history, thanks to myriad specialists and the products of their pencil-chewing efforts.

However, the corollary to a small minority knowing more and more about less and less is a large majority knowing less and less about more and more. In the past this has been a relatively unimportant matter, principally because for most of history the illiterate majority (hard-pressed enough just to survive) has been unaware that the problem existed at all. Technology was in such limited supply that there was only enough to share among a few elite decision-makers.

It is true that over time, as the technology diversified, knowledge slowly diffused outward into the community via information media such as the alphabet, paper, the printing press and telecommunications. But at the same time these systems also served to increase the overall amount of specialist knowledge. What reached the general public was usually either out-of-date or no longer vital to the interests of the elite. And as specialist knowledge expanded, so did the gulf between those who had information and those who did not.

Each time there was a major advance in the ability to generate, store or disseminate knowledge, it was followed by an "information surge" and with it a sudden acceleration in the level of innovation that dramatically enhanced the power of the elites. But sooner or later the same technology reached enough people to undermine the status quo. The arrival of paper in thirteenth-century Europe strengthened the hand of church and throne, but at the same time created a merchant class that would ultimately question their authority. The printing press gave Rome the means to enforce obedience and conformity, then Luther used it to wage a propaganda war that ended with the emergence of Protestantism. In the late nineteenth century, when military technology made possible conflicts in which hundreds of thousands died, and manufacturing technology generated untenable working and living conditions for millions of factory workers, radicals and reformers were aided in their efforts by new printing techniques cheap enough to spread their message of protest in newspapers and pamphlets.

By the mid twentieth century scientific and technological knowledge far outstripped the ability of most people, even the averagely well-informed, to comprehend it. The stimulus of the Cold War brought advances in computer technology that seemed likely to place unprecedented power in the hands of economic and political power blocs. There was talk of "Big Brother" government, of rule by multinational corporations, of central databases that would hold personal files on every individual, and of the creeping homogenization of the human race into one giant "global village." Unchecked state and corporate industrialization finally began to generate the first visible signs of global warming, runaway pollution decimated the animal population and the tropical forests went down before fire and axe at an alarming rate.

However, at the same time, the falling cost of computer and telecommunications technology also began to make it possible for these developments to be discussed in an unprecedentedly large public forum. And the more we learned about the world through television and radio, the more it became clear that urgent measures were needed to preserve its fragile ecosystems and its even more fragile cultural diversity. At the end of the twentieth century the emergence of the ubiquitous Internet and affordable wireless technology offered millions of individuals the opportunity to think of becoming involved.

However, the culture of scarcity with which we have lived for millennia has not prepared us well for the responsibilities technology will force on us in the next few decades. Reductionism, representative democracy and the division of labor have tended to leave such matters in the hands of specialists who are, increasingly, no more aware of the ramifications of their work than anybody else.

The result is that national and international institutions are coming under unprecedented stress as they try to apply their obsolete mechanisms to twenty-first-century problems. In Britain recently a case was brought against an individual which rested on the fifteenth-century meaning of the word "obscene." Medical etiquette has changed little since 1800. In some places science and religion are in conflict over the definition of life.

Western institutions function as if the world had not changed since they were established to deal with the specific problems of the time. Fifteenth-century nation-states, emerging into a world without telecommunications, developed representative democracy; seventeenth-century explorers in need of financial backing invented the stock market; in the eleventh century the influx of Arab knowledge triggered the invention of universities to process the new data for student priests.

In the coming decades it is likely that many social institutions will attempt to adapt by becoming virtual, bringing their services directly to the individual much in the way that banks have already begun to. But their new accessibility will in turn likely subject them to proliferating and diversifying demands that will change how they work and make them redefine their purpose. In education, the old reductionist reliance on specialism and testing by repetition will have to give way to a much more flexible definition of ability. As machines increasingly take over the tasks that once occupied a human lifetime, specialist skills may take on a merely antiquarian value. New ways will have to be found to assess intelligence in a world in which memory and experience seem no longer of value (again, this is nothing new: the alphabet and later the printing press both presented the same perceived threat).

When a corporate workforce becomes scattered across the country, or the globe, in thousands of individual homes or groups, and deals direct with millions of customers, the value of communication skills is likely to outweigh that of most others. Such ability may be possessed by people who would previously have been thought unqualified to work for the corporation, because in the old world they would have been too young, or too old, or too distant, for example. A virtual education system will have to deal with problems such as a multicultural global student body bringing very diverse experience, attitudes and aims to the class. In terms of international law, recent cases involving copyright or pornography reveal how complex such legal problems are likely to become.

This book does not attempt directly to address any of these problems. Rather, it suggests an approach to knowledge perhaps more attuned to the needs of the twenty-first century as described above. Some readers will no doubt see this approach as more evidence of the "dumbing-down" of recent years. But the same was said about the first printing press, newspapers, calculators and the removal of mandatory Latin from the curriculum.

In its fully developed form, the "webbed" knowledge system introduced here would be inclusive, not exclusive. Modern interactive networked communications systems married to astronomically large data storage capability ought to ensure that at times of change nothing need be lost. No subject or skill will be too arcane for its practitioners to pursue when the marketplace for their skills is planetwide.

Also, no external memory device from alphabet to laptop seems to have degraded human mental abilities by its introduction. Rather these abilities have been augmented each time by the new tools. Some skills, such as rote memory, become less widely used, but there seems to be no evidence that the capability for them disappears. In many cases machines also take over routine work, freeing individuals to use their skills at higher levels.

The latest interactive, semi-intelligent technologies seem likely to make this possible on an unprecedented scale. They also bring to an end a period of history in which the human brain was constrained by limited technology to operate in a less-than-optimal way, since the brain appears not to be designed to work best in the linear, discrete way promoted by reductionism. The average healthy brain has more than a hundred billion neurons, each communicating with others via thousands of dendrites. The number of potential ways for signals to go in the system is said to be greater than the number of atoms in the universe. In matters as fundamental as recognition it seems that the brain uses some of its massive interconnectedness to call on many different processes at once to deal with events in the outside world, so as quickly to identify a potentially dangerous pattern of inputs.

It is this pattern-recognition capability that might prove to be the most useful attribute of a webbed knowledge system driven by the semi-intelligent interactive systems now being developed. As this book hopes to show, learning to identify the pattern of connections between ideas, people and events is the first step toward understanding the context and relevance of information. So the social implications of webbed knowledge systems are exciting, since they will make it easier for the average citizen to become informed of the relative value of innovation. After all, it is not necessary to understand the mathematics of radioactive decay to make a decision about where to site a nuclear power plant. As I hope you will see, this approach to knowledge may be one way to enfranchise those millions who lack what used to be called formal education and to move us toward more participatory forms of government.

I would not pretend that what follows is more than a first exercise, a number of linked storylines intended to introduce the reader to the kind of information infrastructures we may begin to use in the next few decades. But I hope they will introduce the reader to a new, more relevant way of looking at the world, because in one way or another, we're all connected.

James Burke
London 1999
