The Barnes & Noble Review
The First Man on the Web

The story of the creation of the World Wide Web is the newest update on the classic story of technological invention. Like the telegraph, the telephone, and countless other giant leaps forward that transformed our culture, the Web was the product of a lone genius (and his assistants) laboring away in relative obscurity on a project for which none but he saw the potential.
Like all the classic tales of brilliant invention, this simplified version of the story barely does justice to the years of work and breadth of vision that went into making the Web and, by extension, the whole Internet the fastest-growing communications system in all of human history. Thankfully, to fill in the holes left by the shorthand version, the inventor himself, Tim Berners-Lee, has put the story of his work into Weaving the Web.
To casual users and online neophytes, it might seem like the Web appeared overnight, that one day back in 1995, there was no Web and the next day, everyone was talking about it. And compared with earlier technological shifts such as radio, television, and personal computers, the speed of the Web's growth was lightning fast. But it wasn't overnight. Fifteen years before every ad began including a dot-com address and every hot stock was an Internet IPO, Tim Berners-Lee, a computer programmer working at CERN, the European Laboratory for Particle Physics, wrote the first computer program that vaguely resembled today's Web. The Web itself was still ten years away. The first web site and server went up in 1990. The first people who could see it got browsers in 1991, all of them working at the same physics lab and all of them using powerful but relatively uncommon NeXT computers. They spread the news to other physics labs around the world. By the middle of 1991, there were no more than 100 hits per day on any web page. A year later there were still only a handful of web servers and a few dozen users. Unix machines could now browse the Web, but there was still no software to let people running Macintosh, Windows, or DOS machines see the Web.
In telling this story, Weaving the Web serves several purposes. First, it sets the record straight. Marc Andreessen, who helped write the first popular Windows browser, Mosaic, and later founded Netscape, has become such a pop-culture figure that he now appears in beer commercials. But there could have been no Netscape without Berners-Lee. There would be no Internet billionaires without Berners-Lee and his team.
Second, and not inconsequential, Weaving the Web is a compendium of geek trivia. Did you know the first web site was http://info.cern.ch? That if not for a mix of absentminded and, occasionally, forward-thinking decisions on the part of CERN, there would be no Web? That the busiest site in the world was getting just 10,000 hits per day in 1993?
The third and perhaps most important purpose of Weaving the Web is to give Berners-Lee a forum to restate firmly his vision of the Web as an information-exchange medium rooted in open standards with the broadest participation possible. The idea of a few large media companies pushing information at millions of passive Web surfers is not his ideal. The notion that any company should control any aspect of the fundamental Web technologies is his worst nightmare.
It is refreshing to be reminded that the Web was not invented to be a virtual mall and that in Berners-Lee's original vision, publishing onto the Web was as simple as reading from it. Berners-Lee still works as a guardian of his invention as the director of the World Wide Web Consortium, a research and industry group that designs and issues recommended standards for new Web technologies. And it is clear from Weaving the Web that the future of this powerful medium has a smart and influential advocate in its inventor.
The inventor of the World Wide Web has finally penned his story. He had to. He's a bit tired of having to explain to people (especially reporters) where his millions are -- or aren't, as the case may be.
Tim Berners-Lee, 44, the British physicist who directs the World Wide Web Consortium at the Massachusetts Institute of Technology, has had loftier goals than money-making in the 10 years since he conceived of one of the technological world's greatest innovations. Not that there's anything wrong with making money, he insists. But Berners-Lee has dedicated his work to the health and continued growth of the Web, which now means heading the world's coordinating body for Web development instead of founding the latest Internet startup.
In his new book, Weaving the Web, written with Mark Fischetti, Berners-Lee details the confluence of ideas, technological developments and influences that enabled him to conceive of the Web. He also talks about the many stumbling blocks he encountered. And then he goes on to persuade his readers that they, too, should be more concerned and more involved with directing the course of this new medium.
Written in an easy-to-read, conversational style, the book may finally bring about what Berners-Lee's modesty has helped prevent in the past. He may finally achieve in the public eye, instead of only in the annals of technology history, his rightful place as creator of the Web. He's not only not as rich as Bill Gates or Marc Andreessen -- he hasn't been as well recognized, either.
While acclaim has been slow in coming for the publicity-shy Berners-Lee, it is finally arriving. Last year, he won a prestigious MacArthur Foundation "genius" grant to continue his work on standards development with the W3C. In March, Time listed him as one of the 100 greatest minds of the 20th century.
Amazing as it may seem now, Berners-Lee actually had a tough time convincing people that he was on to something. In 1989, he was working at the CERN particle-physics laboratory in Geneva when he proposed a "global hypertext project."
"Imagine making a large three-dimensional model, with people represented by little spheres, and strings between people who have something in common at work," he wrote in his proposal. "Perhaps a linked information system will allow us to see the real structure of the organization in which we work."
His story of the Web's development actually starts much earlier. Unlike Newton getting beaned by an apple, the concept of the Web never came down to one "Eureka!" moment, he insists. The son of mathematicians, Berners-Lee was intrigued early on by the idea that a computer could be programmed to complete connections similar to the way the brain does. When he first took a software-consulting job with CERN in 1980, he wrote a program called Enquire to help him remember connections between the various people, computers and projects at the esteemed laboratory. The program gave him larger ideas. "Suppose all the information stored on computers everywhere were linked," he recalls thinking.
After getting the go-ahead at CERN for his global project, Berners-Lee started writing the underpinnings of the Web. He eventually wrote the hypertext markup language, or HTML; the hypertext transfer protocol, or "http://" that today begins the many million Web addresses; and the uniform resource locators, or URLs, that have become akin to the telephone numbers of cyberspace.
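To make the division of labor among those three inventions concrete, here is a minimal sketch (not from the book) using Python's standard library to split apart the address of the first web site mentioned in the review above. A URL names a resource, HTTP is the protocol used to fetch it, and HTML is the markup that comes back:

```python
# A small illustration of the anatomy of a URL, using Python's
# standard-library parser. The address is info.cern.ch, the first
# web site, as noted in the review; the breakdown below is a
# general property of URLs, not something specific to the book.
from urllib.parse import urlparse

parts = urlparse("http://info.cern.ch/")

print(parts.scheme)  # the protocol to use: "http"
print(parts.netloc)  # the server to contact: "info.cern.ch"
print(parts.path)    # the resource on that server: "/"
```

The server named by `netloc` would be asked, via HTTP, for the resource named by `path`, and would typically reply with an HTML document full of further links.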
Curiously, technology companies were slow to grasp what was in the offing. They couldn't understand how a system that was decentralized and had no central repository of data would thrive. Berners-Lee recalls being shot down at his first Internet Engineering Task Force meeting for his suggestion about creating a "universal document identifier."
"How could I be so presumptuous as to define my creation as 'universal'?" he recalls sensing. But the Web's following did build in a grassroots sort of way. In July and August 1991, the Web server at CERN was getting from 10 to 100 hits per day, which would soon begin to double every few months until technicians were recording 10,000 hits per day in 1993.
Some of the most interesting insights into the coming commercialization of the Web are recounted by Berners-Lee, who in 1993 visited the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, where student Marc Andreessen and staff member Eric Bina were creating the Mosaic Web browser.
"It was becoming clear to me in the days before I went to Chicago that the people at NCSA were attempting to portray themselves as the center of Web development, and to basically rename the Web as Mosaic," he recounts. "At NCSA, something wasn't 'on the Web,' it was 'on Mosaic.'"
But Berners-Lee gives credit where credit is due. He points out that Andreessen, who would later hook up with entrepreneur Jim Clark to found Netscape Communications (and hire away the NCSA team), did work that eventually led to the more widespread popularity of the Web.
Berners-Lee admits he has toyed with the idea of starting a company. He opted instead to start the W3C. "My motivation was to make sure that the Web became what I originally intended it to be -- a universal medium for sharing information," he writes. "Starting a company would not have done much to further this goal, and it would have risked the prompting of competition, which would have turned the Web into a bunch of proprietary products."
By following the consortium route, which led him to relocate from CERN to MIT in Cambridge, Mass., Berners-Lee has been able to maintain a neutral viewpoint. Among the initiatives the W3C has developed with its members, which include the top Internet and computer companies, are the Platform for Internet Content Selection rating system, the Platform for Privacy Preferences and Extensible Markup Language.
Berners-Lee says he wrote the book not only to tell the story of the Web but also to warn that his goal for the Web -- to link credible information worldwide -- could be destroyed. He fears the growing consolidation of Internet service providers and the potential for one company to exert control over the medium and the content of the Web. In addition, he says, it's important for Internet companies to distinguish between independent advice and that which comes from paid partners. "The medium can be perverted, giving you what seems to be the world, but in fact is a tilted and twisted version," he writes.
This comes from a man who chose to maintain his integrity in the high-tech world rather than cash in. Had he taken the path to riches, the Web as we know it may never have been born.
...reading. It is a book written by the ultimate good guy...He goes out of his way to make it clear that he was not the only one to have done all the...
NY Times Book Review
Publishers Weekly
This lucid but impersonal memoir conveys some vital history and intriguing philosophy concerning the Internet, written by the man who invented such ubiquitous terms as URL, HTML and World Wide Web. British-born physicist Berners-Lee is now the director of the World Wide Web Consortium, which is based at MIT and sets software standards for the Web. In the late 1980s, he wrote the first programs that set up the Web, thus revolutionizing the Internet by allowing users to hyperlink among the world's computers. It was a quantum conceptual leap, and not everyone instantly understood it (some researchers had to be convinced that posting information was better than writing custom programs to transfer it). The release of graphical browsers such as Netscape Navigator made the Web much easier for home users to navigate and led to the commercialization of the Net. Although Berners-Lee calmly eschewed opportunities to get rich, he doesn't subscribe to the notion, common among pre-Web denizens of the Internet, that commercialization is a pox upon cyberspace. After short takes on current issues like privacy and pornography, Berners-Lee moves into prediction and prescription: the Web needs more intuitive interfaces and integration of tools, "annotation servers" that allow comments to be posted on documents and "social machines" that enable national plebiscites. And while he's no digital utopian, he thinks an Internet that balances decentralization and centralization can contribute to a more harmonious society. Berners-Lee's tone is more lofty than quotidian. He'd rather muse about the benefits of decentralization that his revolutionary technology makes possible than respond to Internet skeptics and critics. But he was very, very right a decade ago, and he's well worth reading now. Copyright 1999 Cahners Business Information.
The inventor of the World Wide Web tells how he did it, and what it means. Berners-Lee traces the Web to a "play" program he invented in 1980, while a consultant at the CERN laboratories in Switzerland. "Enquire," named after a Victorian advice book, stored information about the job in a nonhierarchical manner. The program impressed those who saw it, but nobody used it, and the disk containing it was eventually lost. But Berners-Lee was still interested in using computers to connect ideas. Returning to CERN a few years later, he began re-creating Enquire. One problem was allowing workers to use their preferred software without imposing a complex set of new rules governing access to the Web. Hypertext, which allowed any document to be linked to any other on the system, was the key to solving this problem. Also, by then, the Internet was beginning to come into existence in the US. Its standardized protocols seemed an ideal way to bridge between operating systems. By 1989, he was ready to create the Web. Progress was swift; within a year of its introduction, the number of users was doubling every three to four months. Berners-Lee acted as pitchman, convincing different groups to adopt standards that would increase the accessibility and utility of the Web. At last, CERN released the basic Web code and protocols into the public domain, without licensing fees -- a step that ensured that no hardware or software company would stand in the way of their propagation. Intent on keeping the Web universally accessible, Berners-Lee became head of the MIT-based World Wide Web Consortium, the arbiter of Web standards. In the final chapters of his book, he describes his visions for the Web: as a medium for collaboration, person-to-person and person-to-computer, ultimately restructuring society. Anyone who uses his invention can see how far that process has already come. A compelling combination of techno-history and visionary philosophy.
Read an Excerpt
Enquire Within upon Everything
When I first began tinkering with a software program that eventually gave rise to the idea of the World Wide Web, I named it Enquire, short for Enquire Within upon Everything, a musty old book of Victorian advice I noticed as a child in my parents' house outside London. With its title suggestive of magic, the book served as a portal to a world of information, everything from how to remove clothing stains to tips on investing money. Not a perfect analogy for the Web, but a primitive starting point.
What that first bit of Enquire code led me to was something much larger, a vision encompassing the decentralized, organic growth of ideas, technology, and society. The vision I have for the Web is about anything being potentially connected with anything. It is a vision that provides us with new freedom, and allows us to grow faster than we ever could when we were fettered by the hierarchical classification systems into which we bound ourselves. It leaves the entirety of our previous ways of working as just one tool among many. It leaves our previous fears for the future as one set among many. And it brings the workings of society closer to the workings of our minds.
Unlike Enquire Within upon Everything, the Web that I have tried to foster is not merely a vein of information to be mined, nor is it just a reference or research tool. Despite the fact that the ubiquitous www and .com now fuel electronic commerce and stock markets all over the world, this is a large, but just one, part of the Web. Buying books from Amazon.com and stocks from E-trade is not all there is to the Web. Neither is the Web some idealized space where we must remove our shoes, eat only fallen fruit, and eschew commercialization.
The irony is that in all its various guises -- commerce, research, and surfing -- the Web is already so much a part of our lives that familiarity has clouded our perception of the Web itself. To understand the Web in the broadest and deepest sense, to fully partake of the vision that I and my colleagues share, one must understand how the Web came to be.
The story of how the Web was created has been told in various books and magazines. Many accounts I've read have been distorted or just plain wrong. The Web resulted from many influences on my mind, half-formed thoughts, disparate conversations, and seemingly disconnected experiments. I pieced it together as I pursued my regular work and personal life. I articulated the vision, wrote the first Web programs, and came up with the now pervasive acronyms URL (then UDI), HTTP, HTML, and, of course, World Wide Web. But many other people, most of them unknown, contributed essential ingredients, in much the same almost random fashion. A group of individuals holding a common dream and working together at a distance brought about a great change.
My telling of the real story will show how the Web's evolution and its essence are inextricably linked. Only by understanding the Web at this deeper level will people ever truly grasp what its full potential can be.
Journalists have always asked me what the crucial idea was, or what the singular event was, that allowed the Web to exist one day when it hadn't the day before. They are frustrated when I tell them there was no "Eureka!" moment. It was not like the legendary apple falling on Newton's head to demonstrate the concept of gravity. Inventing the World Wide Web involved my growing realization that there was a power in arranging ideas in an unconstrained, weblike way. And that awareness came to me through precisely that kind of process. The Web arose as the answer to an open challenge, through the swirling together of influences, ideas, and realizations from many sides, until, by the wondrous offices of the human mind, a new concept jelled. It was a process of accretion, not the linear solving of one well-defined problem after another.
I am the son of mathematicians. My mother and father were part of the team that programmed the world's first commercial, stored-program computer, the Manchester University "Mark I," which was sold by Ferranti Ltd. in the early 1950s. They were full of excitement over the idea that, in principle, a person could program a computer to do most anything. They also knew, however, that computers were good at logical organizing and processing, but not random associations. A computer typically keeps information in rigid hierarchies and matrices, whereas the human mind has the special ability to link random bits of data. When I smell coffee, strong and stale, I may find myself again in a small room over a corner coffeehouse in Oxford. My brain makes a link, and instantly transports me there.
One day when I came home from high school, I found my father working on a speech for Basil de Ferranti. He was reading books on the brain, looking for clues about how to make a computer intuitive, able to complete connections as the brain did. We discussed the point; then my father went on to his speech and I went on to my homework. But the idea stayed with me that computers could become much more powerful if they could be programmed to link otherwise unconnected information.
This challenge stayed on my mind throughout my studies at Queen's College at Oxford University, where I graduated in 1976 with a degree in physics. It remained in the background when I built my own computer with an early microprocessor, an old television, and a soldering iron, as well as during the few years I spent as a software engineer with Plessey Telecommunications and with D.G. Nash Ltd.
Then, in 1980, I took a brief software consulting job with CERN, the famous European Particle Physics Laboratory in Geneva. That's where I wrote Enquire, my first weblike program. I wrote it in my spare time and for my personal use, and for no loftier reason than to help me remember the connections among the various people, computers, and projects at the lab. Still, the larger vision had taken firm root in my consciousness.