The Barnes & Noble Review
Does the remarkable progression of computerized technology frighten or confuse you? Does it excite you? Do you wonder, amid all the new developments and prophecies made by the communications industry, what your life will be like in the 21st century? Michael Dertouzos, director of the renowned Laboratory for Computer Science at MIT, has written a new book entitled What Will Be, an inside view of how information technology will shape our world in the coming millennium.
Professor Dertouzos is not an outside commentator making unfounded forecasts about our future. His is a voice of experience. MIT has one of the premier computer research laboratories in the world, and Dertouzos has been at the helm for 20 years. His visions have proven legitimate time and again: In 1981, he described a "global marketplace" in place by the 21st century, "where people and computers buy, sell, and freely exchange information and information services," which sounds like a very early description of what we know today as the Internet. Dertouzos's prognosis is for the Internet's content to shift from the predominant film, music, and news sites of today to a new business marketplace, where human office work will take precedence.
As you form your own opinion about the changes in technology that lie ahead, What Will Be encourages you to ponder these new ideas and more: "Bodynets" will allow you to make phone calls and pay bills as you walk down the street; speech recognition programs will make keyboards, windows, and menus obsolete; health care costs could plummet when doctors begin to bid for patients' attention online; email will cease to be typed messages and become transmitted "experiences." The information marketplace of tomorrow stands to generate $9 trillion per year, one-half of the industrial world's GNP. Information technology could coax nations to expand beyond their traditional geographic boundaries and become "networks." It is all possible, though according to Professor Dertouzos, observers who publicly herald the arrival of the "technological revolution" are premature. The world has only seen a fraction of "what will be," says the author, and everyone needs to know how the innovations of the future will affect our relationships to one another.
Publishers Weekly - Cahners
Dertouzos is head of MIT's Laboratory for Computer Science and author of Made in America: Regaining the Productive Edge. Thus his prognostications concerning the impact of new and emerging information technologies on education, business, entertainment, manufacturing, government, politics and daily life are definitely worth considering. He envisions a world of automated kitchens, smart cards that eliminate currency, interactive art forms and automobiles equipped with navigational systems. Groupwork and telework modules will enable individuals in different locations to work on a task simultaneously with colleagues around the globe. Health care consumers will electronically access their medical records and consult specialists online. Despite the pedestrian, tutorial writing style, Dertouzos has a command of specifics that makes this more than just a pie-in-the-sky exercise. His book is a mind-expanding roadmap to the coming information revolution.
Read an Excerpt
Shaping the Future
A Home for the Web
The visitors in my office, acquaintances from my native Greece, were touring MIT with their son, who had applied for admission. It was February 1995 in Cambridge, Massachusetts, and the annual ritual of admission was once again under way. The trees outside my ground-floor windows at the MIT Laboratory for Computer Science were dormant, but hopes for college careers were budding.
We were discussing MIT's 150-year tradition of not giving honorary doctorates to anyone, however famous, and many other characteristics of this great institution that made it so attractive to students and faculty alike. Suddenly, my assistant, Anne, appeared in my doorway. "Michael, they need you on the third floor. It's urgent!" I excused myself and rushed out.
I could sense trouble as soon as I got off the elevator. Four members of the team responsible for the World Wide Web, the computer network scheme that had taken the world by storm, were huddled in animated debate over newspapers and e-mail printouts. Two others were on their phones, talking with so much artificial calm that it must have been to the press. They briefed me.
It had all started innocently enough the previous day, during a meeting on computer security organized by the Web Consortium, a group at that time of fifty organizations worldwide, led by MIT and its European partner INRIA, which strives to push Web standards forward. At the meeting, chaired by Tim Berners-Lee, inventor of the Web and director of the consortium, a member had asked for a casual show of hands as to which of two proposed security standards the members preferred, based on what they knew so far. Someone had leaked the straw-vote results, and this morning's headlines read: "World Wide Web Consortium decides on Web security standard." The folks at Netscape, the leading provider of software for navigating the Web, had sent us e-mail threatening to walk out of the consortium because the "chosen" standard was not their favorite. Other consortium members were complaining that they hadn't been consulted. The team was now smoothing ruffled feathers. Albert Vezza, associate director of our lab, was explaining to the reporter who wrote the story why it was wrong; a retraction would be issued the following day. Though I was director of the Laboratory for Computer Science and thus ultimately responsible for the Web Consortium and its activities, there was little for me to do. They were making all the right decisions. I told them so and urged them to stay cool.
Back in the elevator, I mused that this way of pushing the technological frontier was not exactly what I had envisioned when four decades earlier, as a teenager in the United States Information Service Library of my hometown, Athens, I had come upon the design of a motorized mouse that could find its way through an arbitrary maze. My heart and mind were totally captured by this little machine. I knew that designing mechanical mice at MIT was what I would do for a living. I wasn't aware that the designer of that machine, who would become a colleague, was the celebrated Claude Shannon, who pioneered Information Theory and made the word bit something of a celebrity. Nor could I have known that the tiny robot was one of the many crucial advances in a long technical chain that would lead to computers and eventually the World Wide Web.
On this Tuesday almost halfway through the 1990s, we at the MIT Laboratory for Computer Science were still inventing exciting hardware and software, like bodynets that can link small computerized devices on our eyeglasses and belts with others in our cars and homes, or programs that can hold a conversation with a human. But technology had grown to affect the world so profoundly, to become so intertwined with human activity, that it was no longer an isolated pursuit. The rumble blaming technology for the world's ills had long been rising. So it was not surprising to me to have a crisis at the nerve center of the Web that was sociotechnical in nature. Already, in two short years the Web had shed its techie aura and become a major cultural movement involving millions of people. The tens of millions of Web users, from homeowners to CEOs, were growing in number at an alarming rate, adding daily to the cumulative web of information by posting their own "home pages" that described their interests and needs and included writings and other offerings. The (computer) mouse clicks of all these people, like twists on millions of door handles, were opening countless doors to information, fun, adventure, commerce, knowledge, and all kinds of surprises at millions of sites, down the street or a continent away.
Clearly, the new world of information was already affecting everyone's lives. Yet I knew that its present impact paled in comparison to what would be coming in the next several decades. While the media continued to flash old news about information highways, electronic mail, multimedia CD-ROMs, virtual reality, even the Web, newer and more fascinating technologies were already being prototyped in our lab and others around the globe. Meanwhile, the world's economies were getting ready to surrender a huge chunk of themselves to the activities that would stem from these technologies. And the envisioned activities, in turn, were already raising complex new social issues.
It was natural for the media to seize on exciting gadgetry it could already see and understand. But the press was missing much more startling research at labs it never bothered to explore, or that it found "boring" because the technology didn't have adrenal shock value or immediate impact on our lives. On the social and political fronts, too, it was more current to debate pornography on the Internet than the future prospects for war and peace that the Information Age might bring. Mantras like "It's all about interactive TV" and "The medium is the message" were clouding the bigger picture. In a quiet but relentless way, information technology would soon change the world so profoundly that the movement would claim its place in history as a socioeconomic revolution equal in scale and impact to the two industrial revolutions.
Information technology would alter how we work and play, but more important, it would revise deeper aspects of our lives and of humanity: how we receive health care, how our children learn, how the elderly remain connected to society, how governments conduct their affairs, how ethnic groups preserve their heritage, whose voices are heard, even how nations are formed. It would also present serious challenges: poor people might get poorer and sicker; criminals and insurance companies and employers might invade our bank accounts, medical files, and personal correspondence. Ultimately, the Information Revolution would even bring closer together the polarized views of technologists who worship scientific reason and humanists who worship faith in humanity. Most people had no idea that there was a tidal wave rushing toward them.
I returned to my office and my old friend and his family. They thanked me for my time and left. I would find the son's name on the freshman class list that fall. Good for him; he had won a golden opportunity to see the tidal wave at close range, maybe even make some waves himself.
Our lab became the Web's home through a combination of chance and planning by many people. Three years after inventing the Web, Tim Berners-Lee, still at the CERN Physics Laboratory in Geneva, had begun looking for an institution that would help his brainchild grow. He had been offered opportunities to market the Web by starting or joining a company, thereby entering the club of Internet millionaires. But his idealism, his wish to make the Web a public resource, pushed him to search for a neutral institution. On this side of the Atlantic, as director of a lab that aspired to design the information infrastructures for tomorrow's society, I was looking for a way to bring the lab's celebrated researchers closer to the growing millions of Internet users. We heard of each other's interest and got together. After a dinner in Zurich and a couple of meetings in Boston, we realized we shared the same basic ideas. More important, the chemistry between us seemed right. We felt that we could trust each other.
On February 24, 1994, we clinched the deal. The Web Consortium was planned and formed by Albert Vezza. Tim, who joined MIT and our lab, became its director. Consortium members would pay an annual fee of either $5,000 or $50,000 based on their size. The fee would buy each company and university, large or small, an equal seat around the table where they would debate the future directions of the Web under Tim's leadership and would try to keep it from breaking up into different Web dialects. Within a year giants like AT&T, Microsoft, and Sony had joined, as had innovators like Netscape and Sun Microsystems. By mid-1996, the Web Consortium had 150 member organizations.