After Thought: The Computer Challenge to Human Intelligence

by James Bailey


Overview

Cutting-edge computer scientists are coming to see that electronic circuits really are alien, that the difference between the human mind and computer capability is not merely one of degree (how fast) but of kind (how). The author suggests that computers “think” best when their “thoughts” are allowed to emerge from the interplay of millions of tiny operations all interacting with each other in parallel. Why then, if computers bring such different strengths and weaknesses to the table, are we still trying to program them to think like humans? It is a work that ranges widely over the history of ideas, from Galileo to Newton to Darwin, yet is just as comfortable in the cutting-edge world of parallel processing that is at this very moment yielding a new form of intelligence.
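
The "interplay of millions of tiny operations" the overview describes is the territory of cellular automata, one of the techniques the book surveys. As a rough illustration (a minimal sketch in Python, my own and not an example from the book), here is a one-dimensional cellular automaton in which every cell is updated simultaneously from its immediate neighbors, and a complex global pattern emerges from trivially simple local rules:

    # One-dimensional cellular automaton (Wolfram's rule 110): each cell's next
    # value is a lookup on its three-cell neighborhood. The rule number and
    # grid size are arbitrary choices for the demonstration.
    RULE = 110

    def step(cells):
        """Update every cell at once from its left/self/right neighbors."""
        n = len(cells)
        return [
            (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    cells = [0] * 64
    cells[32] = 1  # start from a single live cell
    for _ in range(30):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells)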

Editorial Reviews

New York Times Book Review
The wonder of Mr. Bailey's book is that he makes us aware of things abstract that all our lives we have been trained to think of as concrete.
Publishers Weekly
The true electronic revolution has not yet happened, proclaims Bailey. A new breed of computers is emerging, using parallel processing and new mathematics ("intermaths") with exotic names like cellular automata, genetic algorithms and artificial life, which enable computers to continually change their own programs as they compute. Instead of the traditional mathematical vocabulary of numbers, symbols and equations, these computers emphasize emergent patterns, enabling scientists to investigate a world of perpetual novelty. The new computers are being used to analyze the behavior of bird flocks and consumers, to study the human immune system, to make financial decisions and to contour the molecular structure of effective drugs. Freelancer Bailey, a former executive at Thinking Machines Corp., predicts that the new computers will create their own versions of scientific theories and help us fathom biological and cultural evolution as well as the workings of the mind. This is a thoughtful, exciting preview of the dawning age of computing. (July)
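
The "genetic algorithms" the reviewer mentions are the clearest case of a program that changes itself as it computes: candidate solutions are bred, mutated, and selected rather than written by hand. Here is a minimal sketch in Python (the fitness function and parameters are my own illustrative choices, not drawn from the book):

    import random

    # Toy genetic algorithm: evolve a bitstring toward all 1s ("OneMax").
    GENES, POP, GENERATIONS, MUTATION = 32, 50, 100, 0.02

    def fitness(bits):
        return sum(bits)  # count of 1s; higher is fitter

    def crossover(a, b):
        cut = random.randrange(1, GENES)  # single-point recombination
        return a[:cut] + b[cut:]

    def mutate(bits):
        return [b ^ (random.random() < MUTATION) for b in bits]

    population = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for gen in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        if fitness(population[0]) == GENES:
            break
        parents = population[: POP // 2]  # keep the fitter half
        population = parents + [
            mutate(crossover(*random.sample(parents, 2)))
            for _ in range(POP - len(parents))
        ]
    print("generation", gen, "best fitness", fitness(population[0]))
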
Library Journal
Bailey, a former executive at Thinking Machines, the manufacturer of one of the earliest lines of parallel-processor computers, argues that computers using parallel processing, as opposed to traditional linear processing, will change the way we understand intelligence. Drawing on stories and examples from Galileo to contemporary thinkers, he seeks to explain why the parallel-processing approach will revolutionize information processing and analysis. Some of his examples and analogies are straightforward and understandable, but too often he makes unclear chronological and conceptual jumps. Part philosophy, part history of science, part computer science history, and part technological prediction, this book is difficult to follow and thus unconvincing. For academic libraries. -- Hilary Burton, Lawrence Livermore National Lab., Livermore, CA
Ray Duncan

After Thought

After Thought's essential premise is that the old ways of understanding the world are about to be swept away by new mathematical methods called "intermaths", much as the ancients' descriptive, geometry-based understanding was swept away by Newton's analytical approach. By helping us understand the first transition, author James Bailey hopes to provide a conceptual model for the second.

"At first the transition to a new vocabulary does not affect the way we understand the world itself. The epicycles of the moon's behavior, for example, were simply reexpressed with sines and cosines instead of with circles and lines. Soon, however, a change of vocabulary is accompanied by a much deeper change in understanding. Scientists choose underlying fictions that are well-adapted to their vocabulary. Where once they looked up into the sky and saw epicycles, for example, they -- and we -- came to see fields of gravitational force instead." -- After Thought, page 80.

After Thought is at its best in the first two sections, wherein Bailey takes us inside the classical mind and then through the paradigm shift that occurred with the invention of the calculus. The presentation is scholarly yet engrossing, synthesizing concepts and figures from hundreds of years of scientific and philosophical literature, Descartes side by side with Thoreau. The history of how the organization and methods of the human "computers" of yore have been reflected in -- and, indeed, have severely constrained -- the design of modern digital computers was new to me, and I found it quite interesting.

In the third section of the book, Bailey turns to the "intermaths" and discusses a grab bag of computational techniques and topics, including neural nets, genetic algorithms, cellular automata, classifier systems, emergent phenomena, fractals, chaos theory, agents, simulation, and even weather and economic modeling. One particularly interesting passage describes a mechanistic implementation of a neural net using a class of students (pages 127-128). The explosion of the Internet and the possibility of integrating the processing power now isolated on a hundred million desktops are also woven into the argument.
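
For readers curious what that classroom exercise amounts to in code: each "student" is a unit that weighs its inputs and fires past a threshold, and learning nudges the weights after every mistake. The following perceptron learning the logical OR function is a minimal sketch of that idea in Python, my own illustration rather than the implementation from pages 127-128:

    # Single perceptron trained by the classic error-correction rule.
    def fire(weights, bias, inputs):
        return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

    examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]  # OR truth table
    weights, bias, rate = [0.0, 0.0], 0.0, 0.1

    for _ in range(25):  # a few passes over the data suffice here
        for inputs, target in examples:
            error = target - fire(weights, bias, inputs)
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error

    for inputs, target in examples:
        print(inputs, "->", fire(weights, bias, inputs), "expected", target)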

In the end, After Thought becomes an appeal to faith. Bailey predicts that the "intermaths" will make current technology and "Cartesian thinking" obsolete, but offers only a muddle of anecdotes to support those prophecies. His discussions of operational hardware are limited to Danny Hillis' "Connection Machine," but that product and its parent company, Thinking Machines Corporation, are now defunct. His most convincing case histories are those that use neural nets, but those do not in fact require any radical changes in how computers are organized, or even rely on parallel computation to be useful.

Are the proponents of the "intermaths" merely fringe players? Or are they visionaries trying to implement brilliant ideas with inappropriate hardware -- like Charles Babbage and the Difference Engine? Only time will tell. -- Dr. Dobb's Electronic Review of Computer Books

Kirkus Reviews
Computer-aided math is now at a point where unaided human intelligence cannot follow the proofs, a fact that has profound implications for future science, according to Bailey (a former executive at Thinking Machines Corp.).

He illustrates this thesis by summarizing the role of different forms of math in the history of science and philosophy. Ptolemy constructed his astronomical theory on a geometrical basis of perfect circles. But when astronomers (notably Tycho Brahe) began to collect data that failed to fit the theory, new mathematical tools became necessary to construct a more accurate model of the cosmos: first algebra, then calculus. Descartes's step-by-step sequential method was matched to the strengths of the human mind and gained its most impressive results from a miserly amount of data. But physical scientists came to scorn "mere" data collection. A true scientist worked to discover abstract theoretical principles; collecting data and doing arithmetic were the jobs of assistants. The earliest computers mimicked the methods of human calculators; their main advantages were increased speed and almost perfect accuracy. Advanced computers change all that, handling incredible floods of data with ridiculous ease—and in many cases, in parallel streams. It is no longer unthinkable to simply pile up huge quantities of fact and analyze the resulting patterns. The implications of this are most profound in disciplines to which the sequential maths were least adaptable: meteorology, biology, and economics, all of which generate enormous masses of seemingly chaotic data. The computers can analyze these data and discover patterns, even though the programmers can no longer follow their "reasoning." What this finally means is that we humans will increasingly have to accept computers as equal partners in the enterprise of science—and to accept as valid computer-generated results we cannot begin to understand.

A fascinating tour of scientific history, concluding with a vision of a future that is at once exhilarating and profoundly unsettling.

Product Details

ISBN-13: 9780465007820
Publisher: Basic Books
Publication date: 05/15/1997
Edition description: Reprint
Pages: 288
Product dimensions: 5.34(w) x 8.02(h) x 0.72(d)
Lexile: 1360L

What People are saying about this

Mitchel Resnick
"Bailey understands what makes the digital revolution truly revolutionary."
J. Beidler
"Essential for all interested in computerized problem-solving."
David Waltz
"A book that is at the same time a fascinating history of computation and science, and a deep analysis of the major shifts of human thought as a result of this history."
John B. Cobb
"A frightening and exciting account of how technology has transformed and will transform us."
