After Thought: The Computer Challenge to Human Intelligence, by James Bailey
Cutting-edge computer scientists are coming to see that electronic circuits really are alien, that the difference between the human mind and computer capability is not merely one of degree (how fast), but of kind (how). The author suggests that computers "think" best when their "thoughts" are allowed to emerge from the interplay of millions of tiny operations all interacting with each other in parallel. Why then, if computers bring to the table such very different strengths and weaknesses, are we still trying to program them to think like humans? A work that ranges widely over the history of ideas from Galileo to Newton to Darwin, yet is just as comfortable in the cutting-edge world of parallel processing that is at this very moment yielding a new form of intelligence.
After Thought's essential premise is that the old ways of understanding the world are about to be swept away by new mathematical methods called "intermaths", much as the ancients' descriptive, geometry-based understanding was swept away by Newton's analytical approach. By helping us understand the first transition, author James Bailey hopes to provide a conceptual model for the second.
"At first the transition to a new vocabulary does not affect the way we understand the world itself. The epicycles of the moon's behavior, for example, were simply reexpressed with sines and cosines instead of with circles and lines. Soon, however, a change of vocabulary is accompanied by a much deeper change in understanding. Scientists choose underlying fictions that are well-adapted to their vocabulary. Where once they looked up into the sky and saw epicycles, for example, they -- and we -- came to see fields of gravitational force instead." -- After Thought, page 80.
After Thought is at its best in the first two sections, wherein Bailey takes us inside the classical mind and then through the paradigm shift that occurred with the invention of the calculus. The presentation is scholarly yet engrossing, synthesizing concepts and figures from hundreds of years of scientific and philosophical literature, Descartes side by side with Thoreau. The history of how the organization and methods of the human "computers" of yore have been reflected in -- and, indeed, have severely constrained -- the design of modern digital computers was new to me, and I found it quite interesting.
In the third section of the book, Bailey turns to the "intermaths" and discusses a grab bag of computational techniques and topics including neural nets, genetic algorithms, cellular automata, classifier systems, emergent phenomena, fractals, chaos theory, agents, simulation, and even weather and economic modeling. One particularly interesting passage describes a mechanistic implementation of a neural net using a class of students (pages 127-128). The explosion of the Internet and the possibility of integrating the processing power now isolated on a hundred million desktops are also woven into the argument.
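To give a flavor of the kind of "intermath" computation Bailey surveys, here is a minimal perceptron sketch: each unit, somewhat like one of the students in his classroom demonstration, merely weights its inputs and fires past a threshold, yet the ensemble learns a function no single unit was told. This is a hypothetical illustration of the general technique, not Bailey's actual exercise.

```python
def fire(weights, bias, inputs):
    """One unit: take a weighted sum of the inputs, then apply a hard threshold."""
    total = bias + sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > 0 else 0

def train(samples, epochs=20, rate=0.5):
    """Perceptron learning rule: nudge each weight in the direction of the error."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            error = target - fire(weights, bias, inputs)
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# Learn logical OR from its four input/output pairs.
OR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train(OR)
print([fire(w, b, x) for x, _ in OR])  # -> [0, 1, 1, 1]
```

The point the reviewer highlights holds even here: nothing in this procedure requires parallel hardware, though each unit could in principle compute independently.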
In the end, After Thought becomes an appeal to faith. Bailey predicts that the "intermaths" will make current technology and "Cartesian thinking" obsolete, but offers only a muddle of anecdotes to support those prophecies. His discussions of operational hardware are limited to Danny Hillis' "Connection Machine," but both that product and its maker, Thinking Machines Corporation, are now defunct. His most convincing case histories are those that use neural nets, but those do not in fact require any radical changes in how computers are organized, or even rely on parallel computation to be useful.
Are the proponents of the "intermaths" merely fringe players? Or are they visionaries trying to implement brilliant ideas with inappropriate hardware -- like Charles Babbage and the Difference Engine? Only time will tell. -- Dr. Dobb's Electronic Review of Computer Books
He illustrates this thesis by summarizing the role of different forms of math in the history of science and philosophy. Ptolemy constructed his astronomical theory on a geometrical basis of perfect circles. But when astronomers (notably Tycho Brahe) began to collect data that failed to fit the theory, new mathematical tools became necessary to construct a more accurate model of the cosmos: first algebra, then calculus. Descartes's step-by-step sequential method was matched to the strengths of the human mind and gained its most impressive results from a miserly amount of data. But physical scientists came to scorn "mere" data collection. A true scientist worked to discover abstract theoretical principles; collecting data and doing arithmetic were the jobs of assistants. The earliest computers mimicked the methods of human calculators; their main advantages were increased speed and almost perfect accuracy. Advanced computers change all that, handling incredible floods of data with ridiculous ease -- and in many cases, in parallel streams. It is no longer unthinkable to simply pile up huge quantities of fact and analyze the resulting patterns. The implications of this are most profound in disciplines to which the sequential maths were least adaptable: meteorology, biology, and economics, all of which generate enormous masses of seemingly chaotic data. The computers can analyze these data and discover patterns, even though the programmers can no longer follow their "reasoning." What this finally means is that we humans will increasingly have to accept computers as equal partners in the enterprise of science -- and to accept as valid computer-generated results we cannot begin to understand.
A fascinating tour of scientific history, concluding with a vision of a future that is at once exhilarating and profoundly unsettling.