Today's digital technologies are rooted in the ability of high-speed computers to correct errors when returning binary data to the human sensorium. High-tech diagrams echo the visual structures of the Encyclopedia, arraying packets of dissimilar data across digital spaces instead of white paper. The culture of diagram broke with the certainties of eighteenth-century science to expand the range of human experience. Speaking across disciplines and discourses, Bender and Marrinan situate our modernity in a new and revealing light.
"This wide-ranging study boldly connects elements of visualization, fine art and literary analysis, science history, and virtual reality . . . An ambitious work that will likely fascinate faculty and students in various disciplines."—David R. Conn, Library Journal
"Displaying extraordinary erudition in many different fields, the authors performatively duplicate the method of rapports (correlations) across disciplines that they see as the special genius of diagrams themselves . . . The Culture of Diagram is a wonderfully stimulating, provocative, original achievement, lavishly illustrated with many examples that reinforce its argument."—Martin Jay, Nineteenth-Century French Studies
"Ranging from the 19th century to the present, Bender and Marrinan's topics go beyond, and often integrate, the sciences and the arts, as the authors read diagrammatic knowledge into prints and oil paintings and thread statistics and probability theory through the social sciences, human vision theory, and into modern physics . . . Recommended."—D. Topper, CHOICE
"The Culture of Diagram is a well-crafted book. Its originality resides in the authors' capacity to recognize a running theme from the eighteenth century to the present in the history of representation in different domains such as painting, illustration, literature, aesthetics, theater and science, and in establishing key connections between ideas . . . this book creates a renewed ground for reflection."—Norberto Serpente, British Journal of the History of Science
A surgeon enters the bright, even light of an operating room where the patient, prepared for surgery, occupies a table surrounded by the expected array of monitors, respirators, and sterilized tools. But rather than taking his usual place near the patient, the surgeon seats himself at a nearby console where an assistant places over his head a helmet that completely covers his eyes and most of his face. The helmet is plugged into a computer and the doctor grasps two wands shaped to resemble microsurgical scalpels. He signals to a technician at a computer terminal that he is ready: a delicate surgical procedure on one of the patient's eyes is about to begin.
What is happening here? Without ever physically touching the patient, or even seeing him directly, the doctor is directing a delicate procedure inside the eye itself. The "helmet" he wears is a head-mounted display (HMD) that positions before his own eyes two small color television monitors connected to a stereo imaging device. What the doctor "sees" is a real-time, stereoscopic image of the movement and position of his microsurgical tools (FIGURE 1 / PLATE 1 and FIGURE 2 / PLATE 2). Moving his head changes the position of the miniature camera so that the doctor seems to be inside the eye itself, able to see at close range the movements and effects of the scalpel's actions.
Those movements are controlled by a sophisticated robot, which responds to the surgeon's manipulation of the scalpel-like wands held in each hand but executes them with greater precision and stability than the doctor could ever achieve on his own. The computer driving the robot automatically reduces his movements by a factor of 100, and it removes nearly all the physiological tremor of his hands. It also continuously monitors the surgeon's actions by comparing them to the structure of a mathematically defined virtual eye stored within its data banks, so that if he should try to proceed too quickly among delicate tissues of the actual eye, the computer will impede or correct the gesture. Sensors within the scalpel mechanism register the amount of resistance produced by its cutting action and pass the data to the controlling computer, where it is magnified by a factor of 100 and sent to the wands held by the doctor: in this way, the surgeon "feels" the effect of his actions within the patient's eyeball.
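The control loop described here can be sketched in outline. The following is our own illustrative approximation, not the actual surgical software: the scaling and magnification factors of 100 come from the passage above, while the function names, the sample values, and the simple moving-average tremor filter are hypothetical simplifications.

```python
# Illustrative sketch of the teleoperation loop described above.
# SCALE_DOWN and FORCE_GAIN follow the text; everything else is invented.

SCALE_DOWN = 100   # surgeon's hand motion reduced by a factor of 100
FORCE_GAIN = 100   # tissue resistance magnified by a factor of 100

def smooth(samples):
    """Suppress high-frequency hand tremor with a crude moving average."""
    return sum(samples) / len(samples)

def scalpel_motion(hand_positions_mm):
    """Map recent hand positions (mm) to a scalpel displacement (mm)."""
    return smooth(hand_positions_mm) / SCALE_DOWN

def felt_force(sensor_force_newtons):
    """Magnify measured cutting resistance for the surgeon's wands."""
    return sensor_force_newtons * FORCE_GAIN

# A 5 mm hand movement (with slight tremor) becomes ~0.05 mm at the blade,
# and a faint 0.002 N resistance at the blade is felt as 0.2 N in the wands.
motion = scalpel_motion([5.0, 5.1, 4.9])
force = felt_force(0.002)
```

The real system would also include the safety check the passage mentions, comparing each commanded motion against a stored model of the eye before executing it; that step is omitted here for brevity.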
In a comparison probably inspired by the layered meanings of the German word Operateur [surgeon/projectionist], Walter Benjamin drew an analogy between the work of a surgeon and that of a cameraman. "The surgeon," he wrote, "greatly diminishes the distance between himself and the patient by penetrating into the patient's body ... at the decisive moment [he] abstains from facing the patient man to man; rather, it is through the operation that he penetrates into him." Benjamin is describing a familiar scenario, an early version of which is pictured in Denis Diderot and Jean Le Rond d'Alembert's Encyclopedia (FIGURE 3). He goes on to say that a cameraman's relationship to the world, like a surgeon's instrumentalized relationship to his patient, is both depersonalized by the cinematic apparatus and "penetrates deeply into its web." What Benjamin could not imagine in 1936 was that advances in microrobotics, electronic imaging, and computing power would transform his metaphor into reality: our ophthalmologist, who neither touches his patient nor sees him directly, is literally a surgeon-cameraman, completely immersed in a world of electronically produced images yet guiding a scalpel through the living tissue of his patient's eye.
The machinery of a modern operating room seems to challenge many of the everyday, commonsense notions that Benjamin took as givens: the integrity of a physical body; its opacity to others; and a rather uncomplicated relationship between what John Locke called "the primary qualities of things, which are discovered by our senses" and our complex ideas of "corporeal substances" that derive from sensible secondary qualities. Indeed, some predictions of what an operating room of the future will be like include scenarios where entire procedures are performed by computer-controlled robots, attached to advanced imaging devices (such as MRI machines) and working with digitally stored models of the patient (obtained from CAT scanners), whose motions are guided by artificial intelligence programs. For the moment, the physical limitations of robot mechanisms, the lack of adequate mathematical definitions of complex tissues (needed to program a robot), the massive computational demands such systems would place on computer processors, and the high costs of research and development keep such radical scenarios on the distant horizon of medical technology. Commercially available surgical robots, such as the da Vinci, have been approved for laparoscopic, minimally invasive surgery under the guidance of an attending physician, but completely autonomous robots have yet to be developed.
Alongside these practical reasons related to surgical safety and affordability, surgeons worry that they would no longer directly control the situation. Sophisticated computer-guided robots require their own specialist operators, which means that surgeons must pass commands to technicians rather than working directly on the physical body of the patient. As one medical team has written, doctors generally prefer to have a robot pre-programmed for a procedure under active control of the surgeon, because in this case "from both the surgeon's and the patient's point of view, the robot is merely a 'tool'. It is evident that the surgeon carries out the operation and not the robot." This suggests a lingering suspicion about the reliability of transferring data directly from electronic sensors to robotic actors, and implies that both sides of the surgical experience prefer an expert human to inhabit-or at least physically monitor-the circuit of information and action. There seems to be a reluctance to accept penetration of our bodies by a fully autonomous "apparatus" (Benjamin's term) of medical technology, regardless of its sophistication.
We open this book with an almost science-fiction account of modern medicine because it stages the experience of virtual reality in a markedly graphic manner, by placing a physical body under an actual scalpel guided through a fictive space of computer simulation. Underlying this unusual meeting of surgeon and patient, mediated almost exclusively by mechanical sensors, digital sampling, and algorithmic instruction sets, is an implicit confidence in the information delivered to the surgeon, in his ability to form a clear and accurate idea of the physical corrections to be made to the affected eye. Anyone would hope the surgeon has an accurate picture of the patient's condition so that the operation might be successful. Yet this raises a simple but profound question: is the helmet's stream of real-time data a description of the eye?
Proponents of a copy theory of representation would probably say "no." The paradox is that advocates of a non-mimetic theory of description, such as Nelson Goodman, would be hard-pressed to answer "yes." Goodman distinguishes description from depiction on the grounds that the former is syntactically "articulated" rather than "dense." By this he means that the components of descriptions are disjointed and measurably discontinuous from one another, whereas those of depictions appear indivisible, even though they may be infinitely subdivided to achieve higher resolution. The digitized sampling of data and its numeric displays in the surgeon's helmet surely qualify as articulate systems, while the real-time video image he views simultaneously provides a visual spectrum every bit as "dense" as would a conventional depiction.
What is unusual about the surgery example is the convergence of dissimilar data, a kind of willful grafting of Goodman's two syntactic schemes, in which both the surgeon and the patient have placed their trust. This trust does not develop because they are convinced that one sees the visual organ more completely in the helmet than with the naked eye, but because, for the highly specialized encounter of surgery, this is the most functional way of seeing it. The surgeon cares little if the patient has green or blue eyes, for example, and the helmet display ignores those qualities, yet it reports with great accuracy every minute change in the scalpel's position. So we will answer with a term employed by Goodman, but not used in his sense, that the digital data-stream is not a description of the eye but a diagram. A diagram is a proliferation of manifestly selective packets of dissimilar data correlated in an explicitly process-oriented array that has some of the attributes of a representation but is situated in the world like an object. Diagrams are closer in kind to a Jackson Pollock than to a Rembrandt.
Diagrams have existed for centuries. Our ambition is neither to write that long history nor to devise an all-inclusive, trans-historical definition. Nevertheless, we may enumerate some of their formal characteristics: they tend to be reductive renderings, usually executed as drawings, using few if any colors; they are generally supplemented with notations keyed to explanatory captions, with parts correlated by means of a geometric notational system. The Oxford English Dictionary's (OED) etymology of the word is somewhat broader, indicating that musical notations and written registers were part of its early usage. By the mid-nineteenth century, the OED reports that "diagram" was being used to "represent symbolically the course or results of any action or process, or the variations that characterize it." This emerging ability to concretize process forms the center of our book. The modern history of the word masks something implicit about the nature of diagrams that can be recovered by recalling the Greek use of diagramma in mathematical proofs. "The perceived diagram does not exhaust the geometrical object," writes Reviel Netz. "This object is partly defined by the text.... But the properties of the perceived diagram form a true subset of the real properties of the mathematical object. This is why diagrams are good to think with." It is significant that d'Alembert's short entry for "Diagramme" in the Encyclopedia elides the ancient and modern meanings: "It is a figure or construction of lines intended to explain or to demonstrate an assertion."
Between the early seventeenth century and the middle of the nineteenth century, diagrams were increasingly adapted to represent complex processes uncovered by scientific investigations or instantiated by mechanical inventions. Was this an accident? We argue that the hybrid visual attributes of diagrams facilitate their migration to these complex tasks of representation. The proliferation of discrete packets of dissimilar data, which characterizes diagrams, allows them to be apprehended in series or, paradoxically, from several vantage points. Their disunified field of presentation, ruptured by shifts in scale, focus, or resolution, provokes seriated cognitive processes demanding an active correlation of information. Our general approach in this book is to emphasize this potential for process, both cognitive process and historical process, implicit in the types of visual configurations usually called diagrams. Our earlier and later citations from the OED frame the eighteenth-century point of departure for this book, and the publication of Diderot and d'Alembert's Encyclopedia. The diagrammatic premises of their approach to visualizing knowledge are explored in Chapter 2.
Our view of Diderot and d'Alembert's intellectual project in the Encyclopedia is intertwined with our process-oriented concept of diagram, and more akin to the analysis of Jean Starobinski than that of Michel Foucault. For Foucault, the Encyclopedia is a table, an array for nearly unfettered inspection. For Starobinski, it is an arena of knowledge rife with internal discontinuities barely concealed by its "imposing facade, baroque in style and markedly stoic," behind which "spreads the completely modern activity of discontinuous appropriation that is quick to forget the outmoded constraints of organic unity." Starobinski suggests that the arbitrary alphabetic order of the Encyclopedia actually sows disorder by breaking with the closed loop of knowledge implicit in the form's history. Especially germane to Starobinski's account is his attention to the complex system of cross-references that work against the alphabetic arrangement and produce a secondary order: a proliferation of readings instituted by the cross-references but animated by the reader/user's individual penchant to know. In the words of Annie Becq, at the heart of the Encyclopedia lies a paradox "that values continuity while recognizing in fact that discontinuity is necessary." This same paradox structures and animates what we call diagram.
We emphasize the Encyclopedia's proliferation of knowledge rather than focusing upon its disciplinary compartmentalization. We take our cue from the inclusionary mood of a familiar passage in Diderot's Prospectus:
That is what we had to explain to the public about the arts and sciences. The section on the industrial arts required no less detail and no less care. Never, perhaps, have so many difficulties been brought together with so few means of vanquishing them. Too much has been written about the sciences. Less has been written about most of the liberal arts. Almost nothing has been written about the industrial arts.
Diderot's ambition was to produce not only the great "interlinking of the sciences" that constitutes an encyclopedia, but also a catalogue of the bewildering diversity and density of activities that comprise human life. To make it possible for readers to find their way through this labyrinth, the authors adopted the alphabetic ordering of a dictionary, which, as many writers have noted, established a tension between encyclopedic closure and a serialized accumulation of knowledge. Diderot himself was aware of the problem:
If one raises the objection that the alphabetical order will ruin the coherence of our system of human knowledge, we will reply: since that coherence depends less on the arrangement of topics than on their interconnections, nothing can destroy it; we will be careful to clarify it by the ordering of subjects within each article and by the accuracy and frequency of cross-references.
What Diderot describes here, notably in his attention to "interconnections" and the "frequency of the cross-references," is a process of learning and discovery that cuts across the dictionary order in complex and unpredictable ways. The aim of this process, which we take to be the essence of diagram, was:
to point out the indirect and direct links amongst natural creatures that have interested mankind; to demonstrate that the intertwining of both roots and branches makes it impossible to know well a few parts of this whole without going up or down many others.
The implication here, as Starobinski and Herbert Dieckmann suggest, is not that knowledge for Diderot entails absolute inclusiveness, but rather emerges from a process of enchaînement [linking] guided by cross-references; that is, from the user's active exercise of relational judgment.
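This interplay between arrangement and interconnection can be sketched, anachronistically, in computational terms. The sketch below is our own illustration, not anything in the Encyclopedia itself: the entry names and cross-references are invented, and a simple breadth-first traversal stands in for the reader's enchaînement, showing how following the renvois yields a reading order that cuts across the alphabet.

```python
# Illustrative sketch (ours, not the authors'): the Encyclopedia's two
# orders modeled as a data structure. Entries sit in fixed alphabetical
# sequence, but the cross-references [renvois] form a graph whose
# traversal produces a "secondary order" of reading. All entries and
# links here are hypothetical.
from collections import deque

renvois = {
    "ART": ["ENCYCLOPEDIE", "METIER"],
    "ENCYCLOPEDIE": ["SAVOIR"],
    "METIER": ["OUTIL"],
    "OUTIL": ["ART"],    # references loop back: no closed hierarchy
    "SAVOIR": [],
}

def reading_path(start):
    """Follow cross-references breadth-first from one entry."""
    seen, queue, path = {start}, deque([start]), []
    while queue:
        entry = queue.popleft()
        path.append(entry)
        for ref in renvois.get(entry, []):
            if ref not in seen:
                seen.add(ref)
                queue.append(ref)
    return path

# Alphabetical order: ART, ENCYCLOPEDIE, METIER, OUTIL, SAVOIR.
# Reading order from "ART" cuts across it:
# ART, ENCYCLOPEDIE, METIER, SAVOIR, OUTIL
```

The point of the toy model is only that the same entries support two distinct orders at once, one fixed and arbitrary, the other generated by the reader's movement through the links.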
Most critical readings of the Encyclopedia align its mode of presentation with a rationalist enterprise of analytic subdivision in which large and complex subjects are broken down, or fragmented, into small units of study. Our view, informed by Diderot's understanding of the relationship of parts to whole, is not to treat the entries or illustrations as fragments of an idealized entity, but as a proliferation of independent elements that, when interconnected, produce knowledge of the whole. The distinction is worth making, for it asks whether the Encyclopedia is written from the vantage point of absolute knowledge able to disassemble complex objects at will (Foucault's table) or is a collection of working objects, devised ad hoc in the manner of bricolage, that evoke the world's density from a finite number of material soundings.
Working objects are both the tools and products of research processes that in practice correlate familiar oppositions: word and image; representation and the real world; physical mechanics of vision and its processing in the mind; or Goodman's articulate and dense syntactic systems. Here, too, we recover the place of diagrams in Greek mathematical reasoning in which, according to Netz, "the diagram is not a representation of something else; it is the thing itself." Diagrams are things to work with. By framing our concept of diagram as a flexible tool of research, we link it to Diderot's idea that the Encyclopedia makes knowledge visible by its system of correlations [rapports] rather than its arrangement of materials. The Encyclopedia fails as a compendium, but establishes the matrix of diagrammatic knowledge.
Excerpted from THE CULTURE OF DIAGRAM by JOHN BENDER and MICHAEL MARRINAN. Copyright © 2010 by Board of Trustees of the Leland Stanford Junior University. Excerpted by permission.