Managing Open Source Projects: A Wiley Tech Brief

Overview

The only guide to managing and integrating the open source model

With the phenomenal success of Linux, companies are taking open source business solutions much more seriously than ever before. This book helps to satisfy the growing demand for guidance on how to manage open source enterprise development projects. Expert Jan Sandred explores the open source philosophy, describes current software tools for managing open source projects, and provides expert guidance on how to organize and manage open source projects using the Internet as a collaboration tool. With the help of several fascinating and instructive case studies, Sandred explores practical concerns such as building, motivating, and managing virtual teams; structuring tasks and meeting deadlines; establishing trust; project management software tools; maintaining project security; and more.


Editorial Reviews

From the Publisher
"an essential practical guide" (Linux User, September 2001)
Booknews
Identifies the business opportunities open source software provides, and shows how to manage successful open source projects using the Internet as a collaboration tool. The author explains different business and project management aspects of open source, and presents some open source tools and suitable methods. Annotation c. Book News, Inc., Portland, OR (booknews.com)

Product Details

  • ISBN-13: 9780471403968
  • Publisher: Wiley
  • Publication date: 4/1/2001
  • Series: Technology Briefs Series, #8
  • Edition number: 1
  • Pages: 208
  • Product dimensions: 7.52 (w) x 9.22 (h) x 0.51 (d)

Table of Contents

Acknowledgments
Introduction
Ch. 1 An Open Source Primer 1
Ch. 2 Open Source in Business Terms 23
Ch. 3 The Open Source Philosophy 37
Ch. 4 Open Source and the Internet Economy 59
Ch. 5 Network Organizations 73
Ch. 6 Managing a Virtual Team 81
Ch. 7 Managing Distributed Open Source Projects 101
Ch. 8 Tools for Building Open Source Products 133
Ch. 9 Open Source Tool Tips 145
Ch. 10 Setting Up an Open Source Project 155
Ch. 11 Open Source Management Anecdotes 167
Ch. 12 Are You Ready for Open Source? 173
Index 181

First Chapter


An Open Source Primer


There is no such thing as a free lunch.

Robert A. Heinlein, The Moon Is a Harsh Mistress, Putnam Publishing Group, 1966

Why do all books start with a history lesson? Old habits? Cheap padding? In this case, to understand what this new phenomenon called open source really is, a thorough background is necessary. Frankly, many of the ideas behind open source are not new. They originate from a combination of U.S. cold war politics, Flower Power, and a touch of anarchist and libertarian opinion.

The Birth of the Internet

Without the Internet, open source would not exist. In 1968 the National Physical Laboratory in Great Britain installed the first experimental network to use a home-brewed research predecessor of the Internet Protocol (IP) as its communications protocol. Shortly thereafter, the U.S. Pentagon's Advanced Research Projects Agency (ARPA--later renamed DARPA when the word Defense was added) launched a much larger and more ambitious networking project.

ARPA

ARPA had its origin in the post-World War II government support for computing and, later, the cold war. One of its earliest roots was a visionary article in the Atlantic Monthly by Vannevar Bush, the wartime head of the U.S. Office of Scientific Research and Development and a prominent MIT researcher, published right after World War II. In the article, Bush describes an imagined collaborative science and computing network, much like the Internet of today.

President Dwight Eisenhower loved the scientific community. He found scientists inspiring. He liked their ideas, their culture, their values, and last but not least their value to the country. Eisenhower surrounded himself with the nation's best scientific minds and was the first president to host a White House dinner specifically to honor scientific and engineering communities. Hundreds of prominent American scientists directly served the Eisenhower administration on various panels.

Under the psychological impact of the Soviet nuclear weapons buildup in the 1950s, the vision of a nationwide network began to take shape as the U.S. government sought to regularize its technological research and spending. The industrial policy was driven by foreign politics--the cold war--and was to be a fast-response mechanism closely tied to the American President and Secretary of Defense, to ensure that Americans would never again be taken by surprise on the technological frontier. The Soviet launch of the first Sputnik satellite in October 1957 created a period of U.S. national crisis. Public pressure on the White House to act was high, and several public addresses were made to reassure the American people and calm the minor panic the Sputnik launch had caused.

During that time, the U.S. military saw its chance to win bigger appropriations. Department of Defense bureaucrats as well as Army, Navy, and Air Force commanders treated Sputnik as the starting gun of a new race, competing with one another for the biggest share of government research and development spending. The competition sometimes reached absurd heights, but Sputnik launched a golden era for military science and technology.

President Eisenhower's personal experience in the military made him distrustful of the bureaucratic interests in the Pentagon. Therefore new institutions, largely independent of specific military branches, were created. The best known were the National Science Foundation, the National Aeronautics and Space Administration (NASA), and ARPA. The president formed ARPA on January 7, 1958, right after the launch of Sputnik II on November 7, 1957.

A key appointment for the predecessor to the Internet came in 1962 when psychologist J. C. R. Licklider was hired to head a behavioral sciences office at ARPA, an office that would evolve under Licklider's two-year directorship into the Information Processing Techniques Office (IPTO). That office was later the originator of the Internet.

Licklider was far more than just a computer enthusiast. He touted the pioneering vision that computers were more than just adding machines. This vision was presented in an era of almost hysterical technology optimism and horror of the communist specter. This was also the golden era of Isaac Asimov's I, Robot, of pulp science fiction like Amazing Stories, and of TV series like The Twilight Zone. It was an era when the president made front-page headlines with speeches drawing links between science and defense.

Licklider believed that computers had the potential to act as extensions of the human being, as tools that could amplify the range of human intelligence and expand the reach of our analytical powers. In 1960 he wrote a manifesto for using computers to enhance research collaboration, entitled Man-Computer Symbiosis. Just as importantly, Licklider's university background encouraged him to extend ARPA's funding to a range of university projects. One key project was a $3 million-a-year grant to encourage the spread of the new invention of time-sharing computing (a product of the innovative Whirlwind computer at MIT).

Time-sharing computing is the basis for all modern operating systems. It lets more than one person share the same computer and run different programs at the same time, giving many users interactive access from individual terminals. The alternative is batch processing, where each program runs to completion, one after another, in a queue.
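The contrast is easy to see in miniature. The following sketch (hypothetical Python; the job names, durations, and quantum are invented for illustration) runs the same jobs both ways: batch runs each job to completion in queue order, while the time-sharing scheduler hands out short time slices in round-robin fashion so every job makes steady progress.

from collections import deque

def run_batch(jobs):
    # Batch processing: each program runs to completion, one after another.
    for name, work in jobs:
        print(f"{name}: ran {work} units, finished")

def run_time_shared(jobs, quantum=2):
    # Time-sharing: each job gets a short slice (quantum) in round-robin
    # order, so every user at a terminal sees interactive progress.
    queue = deque(jobs)
    while queue:
        name, remaining = queue.popleft()
        used = min(quantum, remaining)
        remaining -= used
        print(f"{name}: ran {used} units" + (" (done)" if remaining == 0 else ""))
        if remaining:
            queue.append((name, remaining))  # not done: back of the queue

jobs = [("alice", 5), ("bob", 2), ("carol", 4)]
run_batch(jobs)        # carol waits until alice and bob finish entirely
run_time_shared(jobs)  # the three jobs interleave; bob finishes first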

ARPA would fund six of the first 12 time-sharing computer systems in the United States, which in turn would help found the whole minicomputer industry in the 1960s, crucial to the development of the Internet over the next decades. It was MIT hackers who largely designed both hardware and software for Digital Equipment's (now Compaq) breakthrough PDP-6 and later PDP-10 time-sharing minicomputers. ARPAnet was primarily a network of cheap time-sharing systems from Digital Equipment.

The demand for a failsafe network grew concurrently with the cold war and the American military's increasing dependence on computers. Between 1960 and 1964 Paul Baran, a researcher at RAND Corporation, wrote 11 papers on the idea of a nation-wide distributed communications system. He proposed how the future American military network should be structured. The idea was to build something that could survive a nuclear attack and was not dependent on a central switch.

ARPA was never interested in building a network to protect national security in the face of a nuclear attack. The project instead embodied the most peaceful of intentions: to link computers at scientific laboratories across the United States, so researchers could share computer resources. The intention had nothing to do with supporting or surviving a war. Eventually ARPA decided to use the distributed and revolutionary ideas from Baran's papers to network its various research outlets around the country.

What Baran envisioned was a network of unmanned switches. His approach was to disassemble the central communication switches and compose the network of many small switches, each connected to several of its neighbors-- a fairly straightforward concept.

Baran's second idea was revolutionary: disassemble the messages as well. By dividing each message into message blocks and making every block carry information about its place in the data stream, you could flood the network with packets taking different paths, and the receiving computer could reassemble them into the original message upon arrival.
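A toy sketch makes the idea concrete (an illustration of the concept only, not any real protocol; the block size and field names are invented). The sender disassembles a message into numbered blocks, the network delivers them in arbitrary order, and the receiver sorts them back into the original form:

import random

def packetize(message, block_size=8):
    # Disassemble the message into blocks; every block carries its
    # place in the data stream, as Baran proposed.
    chunks = [message[i:i + block_size] for i in range(0, len(message), block_size)]
    return [{"seq": i, "total": len(chunks), "data": c} for i, c in enumerate(chunks)]

def reassemble(packets):
    # The receiving computer sorts the blocks by sequence number and
    # restores the original message, however the blocks arrived.
    assert len(packets) == packets[0]["total"], "a block was lost in transit"
    return "".join(p["data"] for p in sorted(packets, key=lambda p: p["seq"]))

message = "Flood the network with blocks taking different paths."
packets = packetize(message)
random.shuffle(packets)                   # blocks arrive out of order
assert reassemble(packets) == message     # the original form is restored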

Baran tried to persuade AT&T to build the system, but the company believed his idea would never work. One should remember that these ideas were presented at a time when all signals on the telephone network were analog, all switches were central, many manually operated switchboards were still in use, and long-distance connections had to be ordered in advance. The telephone network was circuit-switched (or point-to-point), meaning that a communication line was reserved for one call at a time and held open for the duration of that session. Baran envisioned a network of unmanned switches or nodes that routed messages automatically using a "self-learning policy at each node, without the need for a central, and possibly vulnerable, control point."

Interestingly, in a 1965 paper written independently of Baran's work, Donald Watts Davies at the British National Physical Laboratory arrived at what was essentially Baran's idea, though he called the message blocks packets and the technology packet switching. He too tried to persuade a government agency, in his case the British Post Office, and likewise met stiff resistance.

The technical similarity between Davies' and Baran's work was striking. Their ideas were conceptually identical. They had even chosen the same packet size and data-transmission rate. Both had adaptive routing schemes, though different in detail.

Soon ARPA got hold of both Davies' and Baran's work. At the end of 1967, at an Association of Computing Machinery (ACM) conference in Tennessee, the first paper that described the ARPAnet was presented. By July 1968, a request for proposal was sent out from ARPA to 140 companies interested in building the first nodes, called Interface Message Processors.

By the end of 1968, ARPA announced that the contract to build the router had been awarded to Bolt Beranek and Newman (BBN), a small Cambridge-based company made up largely of MIT graduate students and affiliated researchers, including J. C. R. Licklider at various times.

The first computer connected to the ARPAnet was a Honeywell DDP-516. The machine had no hard disk or floppy drive (the floppy disk hadn't been invented yet), had 12 KB of core memory, and was programmed in assembly language via punched paper tape.

The premiere came a year later, on October 29, 1969, when the first node of what today is called the Internet was installed at the University of California at Los Angeles (UCLA). The word LOGIN was sent from UCLA to node number two at the Stanford Research Institute. By December 1969 four nodes were connected: UCLA, the Stanford Research Institute, the University of Utah, and the University of California at Santa Barbara. What would evolve into the Internet had been born.

First Key Internet Technology

The first network protocol was created in 1971 to allow a user at one computer to connect to other computers on the network as if they were local users. This soon evolved into the standard Transmission Control Protocol (TCP). In 1972 the File Transfer Protocol (FTP) was created, which allowed individual files to be exchanged between computers.

By October 1972, when the ARPAnet was first demonstrated publicly at a conference, there were 29 nodes in the network. At the conference the first ARPAnet user manual was presented, but it would take 22 years before the first book on the Internet was published.

The first international nodes were installed in 1973, in Norway and London. The ARPAnet was used solely by scientific and military institutes, and only the United States and its allies had access. During this time Vinton Cerf, an original member of the UCLA graduate student group that helped launch the ARPAnet, and Robert Kahn published the first paper describing the combination of the Transmission Control Protocol and the Internet Protocol (TCP/IP). But it wasn't until 1983 that the entire Internet switched to TCP/IP as its main protocol.

In 1976, ARPA hired Cerf, by then a Stanford professor, and Kahn, by then a manager on the project at BBN, to create a system for integrating the ARPAnet with other computer networks. By 1977, they had demonstrated that the Internet Protocol (IP) could be used to tie satellite networks, packet radio, and the ARPAnet together. From this point on, new networks of computers could easily be added to the network.

Unix

In 1981, ARPA funded researchers at UC-Berkeley to incorporate the TCP/IP networking protocols into the BSD (Berkeley Software Distribution) version of Unix. Bill Joy, the lead programmer of the Berkeley Unix effort, created the new version of Unix with TCP/IP networking built in. For a minimal licensing fee, Berkeley seeded its Unix version, Internet protocols included, throughout the university world, thereby spreading the Internet standards to computers worldwide.

If anything illustrates both the gains from government support of open standards in computing and the dangers of public policy withdrawing from that support, it is the Unix operating system. The computer industry would not exist on its current scale without the support of ARPA and the U.S. Department of Defense. Government funding has directly and indirectly nurtured many high-tech companies, projects, and products, including Unix, the Internet, and free software. The Department of Defense became interested in the Unix operating system because of research ties between the defense establishment and many top-level universities, such as MIT and UC-Berkeley, where ARPA subsidized the development work. True, the defense bureaucracy had and still has its own issues, but it is not monolithic. Different groups have different needs and systems, and all of them have enormous amounts of taxpayers' money invested in incompatible computer systems. My own country, Sweden, and the European Union are no exceptions.

Unix was created at AT&T's research laboratory, Bell Labs, in the late 1960s. It was the first operating system independent of specific hardware: Unix could be ported to different machines, thereby allowing the same program to run on completely different hardware. AT&T licensed the source code widely, in the beginning mostly to universities.

Unix was especially popular with ARPAnet programmers working on a wide variety of computers, because they needed to create a portable integrated set of software tools for managing the network.

The openness of Unix--it was distributed with the source code--made it possible for anyone to change the system to fit his or her own preferences. Although it was a great benefit for technological development, it was also an administrative disadvantage. During the 1970s Unix developed into a number of variations (including Unix BSD).

Early in the Reagan administration the military had carte blanche to buy computers, but toward the end government agencies were required to cut costs. One way to do so was to bring standardization and compatibility to the prevailing computing chaos.

The key to making Unix nearly universal in corporate and high-end computing in the late 1980s was the decisive action of the U.S. federal government in support of strong Unix standards. The federal government itself was faced with a mess of different computer systems that needed to be networked together.

Unix was growing in popularity and was available on many high-performance systems. Because of ARPA's long-time study and support of Unix, the operating system was uniquely suited to run the ARPAnet. And because of the close ties between the Department of Defense and university researchers (largely fostered by ARPA), the federal government already had an affinity for Unix.

The U.S. government set out to find an operating system to which all bidders for government business would be required to conform. The European Union agreed with the concept. From 1986, any computer company that bid on government contracts, U.S. or European, had to offer Unix at least as an option for the bid to be considered.

Sun Microsystems: The Network Is the Computer

No single private company benefited more from, and contributed more to, the Unix and Internet standards than Sun Microsystems. The company started in 1982 by selling high-performance Unix-based workstations and servers. Every Sun computer shipped with Unix, with hardware and software designed to be hooked up to the Internet. Much of the Internet ran on Sun Unix machines in the 1980s. Sun even trademarked the phrase "The network is the computer."

Sun dominated the market for the workstations that replaced time-sharing minicomputers in the 1980s, and it became one of the fastest growing companies in history, making the Fortune 500 list within five years.

The consistent focus on Unix and networking gave Sun a huge advantage in securing one of the large slices of the $500 million, five-year National Security Agency contract then under bid. Sun's and AT&T's version of Unix became the benchmark for selling to the government and university markets, along with the many private-industry customers who would follow the government's lead in standards.

Open systems and open standards were terms popularized in the computer industry in the 1980s, mostly by Sun Microsystems (even if Hewlett-Packard coined them). In a computing context, open systems and closed systems mean, respectively, nonproprietary and proprietary.

Sun's commitment to open standards reflected the company founders' emergence from the milieu of the ARPAnet. When Stanford students Scott McNealy and Vinod Khosla teamed up with Andy Bechtolsheim, who had developed a new high-performance computer using off-the-shelf components, it was fitting for them to adopt Unix as the operating system for their new computer, the Sun-1 (as in Stanford University Network). But the software and networking standards were missing. It was therefore natural to bring in as a cofounder Bill Joy, the premier Unix and ARPAnet programmer at UC-Berkeley.

Commercial Unix was split among various incompatible proprietary versions. Far from inheriting a widely used business standard, the Sun team had to design one themselves and sell the message of open computing to private industry.

They took a number of steps to ensure that the BSD Unix on Sun's computers was seen as a viable standard. Sun gave away BSD Unix and the TCP/IP networking software with every computer it sold, under the Berkeley Software Distribution (BSD) license. The BSD license belongs to the open source family of licenses; it permits unlimited changes to the code and requires only that the copyright holder be credited (open source licenses are discussed in more detail in Chapter 2, "Open Source in Business Terms").

When Sun developed the Network File System (NFS) in 1984, which enhanced network computing by making it possible to share files between different computers, it didn't try to sell NFS as normal standalone software. Instead, Sun licensed it to the industry for a nominal fee, though this time the code was not under any open source agreement. However, Sun published the specifications on Usenet so that anyone who wanted to avoid the license fee could design an alternative to the NFS file system. The specification was open, but the software was not.

Another key step toward a universal operating system standard came in 1985, when Sun approached AT&T and worked out an agreement to merge Sun's Berkeley Unix with AT&T's System V, further enhancing the public view of Sun's Unix as the standard. The convergence effort was an attempt to blend the best of both variants into a unified system, Unix System V.

That was the trigger for other workstation and corporate computer makers to do a complete turnaround. In 1987 and 1988 every company in the IT industry began promoting their own open computing Unix systems--all with the built-in Internet protocols that would set the stage for the commercial explosion of the Internet in the 1990s.

The Shared Ethic

Concurrently with the de-escalation of the cold war, U.S. authorities began looking for someone who could assume responsibility for running the ARPAnet. During the late 1970s there were discussions about the possibility of selling the network to a commercial company.

At the same time, more and more universities were hooking up to the ARPAnet, by now called the Internet. They opposed the commercialization of the network. Moreover, the NSF, which formally had responsibility for operations, was not allowed to carry commercial traffic.

The sharing of software was a key part of the success of the Internet. At the universities, paid staff and volunteers alike received and provided a continuous stream of free software and constantly improved its functionality. New software spread across the network nearly instantaneously, without any of the usual distribution costs. This "gift" economy allowed new innovations to be quickly tested and to gain a critical mass of users for functions the creators of the system had not even envisioned. This is the key mechanism and the very foundation of open source. Without it, open source wouldn't exist.

More and more companies realized the value of the Internet, as it had become an international resource of universities and research centers, and gladly hooked up to the net. Commercial applications that were not necessarily for scientific use began appearing.

By this time military participation on the Net had become marginal. In 1990 the military researchers pulled out and formed their own research network, and the Internet became exclusively a network of universities and private companies.

The ethic of shared software was called the hacker ethic at MIT, especially in the Artificial Intelligence Lab. From a business perspective, free software and open source are essentially about sharing resources and thereby enhancing the development process. The idea of sharing is the very core of the culture of the Internet. Sharing of software is as natural as sharing of recipes.

Neither the term freeware nor the term open source existed in the early 1970s, but that was essentially the way software was treated. Anyone from another university or company was freely allowed to port and use any program. Source code was freely distributed, so that anyone could read it, change it, or take useful parts of it to make a new program.

Ultimately, science is an open source enterprise. The scientific method rests on a process of discovery and justification. For scientific results to be justified they must be replicable, and that is not possible unless the source information is shared: the hypothesis, the test conditions, and the results. A new discovery can sometimes happen in isolation, but for the results to be credible, other scientists must examine them and repeat the tests to validate them. Science goes forward only if scientists are able to cross-fertilize each other's ideas.

Much of the early freely distributed software consisted of games like Decwar and Adventure, but soon more serious software spread the "open" way. The first shared killer application was email.

Email

The earliest email between two computers was sent in 1971. Email as such had existed for some time, but only between users of the same computer. Network email was not planned as part of the ARPAnet's design; it was created as a private hack by BBN engineer Ray Tomlinson in 1971, as a piggyback on FTP.

The File Transfer Protocol (FTP) specifies the formatting of files transferred over a network; it was one of the first applications to permit two computers to exchange data with each other. Tomlinson wrote a program that could carry a message from one machine and drop it into a file on another, using FTP as a carrier. The remaining problem was to separate the machine's user from the machine itself. He needed a punctuation mark, and he chose the "at" sign (@) on his Model 33 Teletype terminal.
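In the same spirit, a few lines of sketch code show why the separator mattered (an illustration only: the spool layout, address, and function names are invented, and the real hack delivered over the network rather than to a local directory). Everything before the @ names the user, everything after it names the machine, and the message is simply appended to a file belonging to that user on that machine.

from pathlib import Path

def deliver(address, message, spool=Path("spool")):
    # The @ sign splits the address: user on the left, machine on the right.
    user, _, host = address.partition("@")
    mailbox = spool / host / user            # stand-in for a file on the remote host
    mailbox.parent.mkdir(parents=True, exist_ok=True)
    with mailbox.open("a") as f:             # drop the message into the user's file
        f.write(message + "\n")

deliver("tomlinson@bbn-tenexb", "testing 1 2 3")  # hypothetical address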

Under the tolerant supervision of ARPA, use of the network for email communication soon surpassed computing resource sharing. Stephen Lukasik, ARPA director from 1971 to 1975, saw the importance of email for long-distance collaboration. He soon began virtually directing ARPA via electronic mail.

Eric Allman, a student at UC-Berkeley, created the program Sendmail to assist network managers in directing and processing the increasing email traffic. Sendmail is still used to direct over three-quarters of Internet email traffic.

Xanadu

The next killer application came from a European research institute. In 1990 Tim Berners-Lee, a programmer at the European nuclear research institute CERN, developed a system based on the concept of hyperlinks, pioneered by Ted Nelson in his Xanadu project.

Today, Nelson is Visiting Professor of Environmental Information at Keio University in Fujisawa, Japan; his discipline is sociology. A self-proclaimed Designer, Generalist, and Contrarian, Nelson published his first book in 1965; it introduced the concepts of hypertext and hypermedia. He is also the man behind the compound document (an electronic document that embeds various media, such as pixel graphics, vector graphics, video, and sound), virtuality, and micropayments. But his best known work is the almost mythological Xanadu project, introduced in 1967. He touted the pioneering vision that computers were media machines, not just calculators. Now, 27 years later, the source code is available at www.udanax.com under an open source license.

Ted Nelson is an anonymous public figure: few of his ideas and visions are widely known under his name, yet his importance to the development of the modern IT industry cannot be overestimated. Nelson's thinking directly influenced much of today's essential software. The best known examples are Ray Ozzie's IBM Lotus Notes, Bill Atkinson's Apple HyperCard, and Marc Andreessen's Mosaic, whose designers have expressly acknowledged the deep impression Xanadu made on their products. Berners-Lee wasn't aware of Xanadu when he had his original idea to create the World Wide Web, but he references Nelson in his original proposal.

Xanadu is similar to the World Wide Web, but with built-in mechanisms to manage notes, annotations, revisions, copyrights, and micropayments. Nelson wanted to replace paper with a literary machine that would permit documents to change and would track how, and by whom, changes were made. The important difference between information on paper and electronic information on the Internet is that the latter is dynamic: the electronic document is a set of links that is not resolved until the user accesses the information.

In Xanadu (or Udanax, as the software is called for copyright reasons) all text is mapped into a linear address space. Parallel to the text runs a data structure that specifies formatting and links within the text. This has the advantage of keeping the content uncluttered. In HTML, information about dependencies between different documents is embedded in the content; Xanadu separates the content from the layout.

In Xanadu all links are two-way; that is, links contain both go-to and come-from information. You know not only where a link leads, but also its origin. The file system in Xanadu is called Ent (after the treelike creatures in The Lord of the Rings) and is written in Smalltalk and C++.
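A minimal sketch shows what two-way linking means in practice (an illustration of the idea only, not Xanadu's actual Ent structure; the class and document names are invented). Every link is recorded in both directions, so any document can report not just where its links lead but also who links to it:

from collections import defaultdict

class TwoWayLinks:
    def __init__(self):
        self.outgoing = defaultdict(set)  # go-to: documents a document links to
        self.incoming = defaultdict(set)  # come-from: documents that link to it

    def link(self, source, target):
        # Record the link in both directions at once.
        self.outgoing[source].add(target)
        self.incoming[target].add(source)

links = TwoWayLinks()
links.link("essay", "sources")
links.link("review", "sources")
print(links.outgoing["essay"])     # {'sources'}: where the link leads
print(links.incoming["sources"])   # {'essay', 'review'}: its origins

On the one-way Web, only the outgoing direction exists; recovering the incoming one is why search engines must crawl and index the whole network.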

Berners-Lee needed an electronic system that allowed scientists to easily reference each other's papers in footnotes, without the hassle of searching through many documents. He created the originally text-based HyperText Markup Language (HTML) in 1990, on a NeXT workstation. Just for fun, and to annoy his French colleagues (really!), he baptized the system the World Wide Web, which is hard to pronounce with a French accent.

It was with the World Wide Web that the Internet became an international top-level affair. In 1994 the first formal email between two heads of government was exchanged, between then Swedish Prime Minister Carl Bildt and President Bill Clinton.

Mosaic from NCSA

The World Wide Web eventually found its way to another research institute: the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, where the first widely used Web browser, Mosaic, was created. NCSA's 40-member software development group made high-performance information-sharing and collaboration software, and it also had in-depth experience in networking. In 1985 NCSA had created NCSA Telnet, a software client that allowed people to access and use computers connected to the Internet as if they were local users. NCSA had also worked for some time on a graphics-based collaborative tool for sharing documents called Collage, so it was natural for the center to form a team to develop a graphical client for the HTML-based Web created at CERN.

The result was Mosaic, first introduced on the Unix platform in January 1993, with Macintosh and PC versions introduced in August 1993. Copyrighted by the University of Illinois, Mosaic could be downloaded for free by individuals and by companies wishing to use the Internet for internal communications.

However, NCSA did not want to become a help desk for commercial applications. In August 1994, the University of Illinois assigned all future commercial rights of NCSA Mosaic to Spyglass, a local company created by NCSA alumni to commercialize NCSA technology. The goal was for university researchers to continue developing long-term technology and standards to be incorporated into browsers. Spyglass was responsible for licensing the technology to companies as well as for software support.

Jim Clark, CEO of workstation and server manufacturer Silicon Graphics and a veteran of the Unix standards wars, understood the importance of Mosaic and the need to take control of the standardization of this new Internet tool. Clark left his company and met with Marc Andreessen, a member of the Mosaic team.

Netscape Communications was born out of that meeting in April 1994. Clark put up the capital, and Andreessen recruited five other Mosaic team members from NCSA to design what they called Mozilla, the Mosaic-killer. The team worked at a frantic pace to create a beast vastly more powerful than Mosaic. The name became the official code name for Navigator. Later the big green dinosaur became an inside joke, then a company mascot, and finally a public symbol.

Clark did what Sun had done in the 1980s: He created a new standard that the company controlled. But unlike Sun, which rode public Unix standards to rapid growth, Netscape began its life with a direct assault on the original government-based standards created by the NCSA.

Netscape included the ability to display text formatting that did not even exist in the HTML standards embedded in the NCSA Mosaic browser. This meant that Web pages designed to work with Netscape would not be readable by all the other Mosaic-based browsers.

Netscape gave away the client (though not under an open source license) and charged for the server software. This would encourage people to use Netscape browsers, and it would encourage Web designers to pay Netscape for the server software that generated Web pages using its modified standards.

Many companies ignored or weren't aware of what was happening. The most prominent example was Microsoft, which in 1995, with great hullabaloo, launched the Microsoft Network, a proprietary network (more exactly, a CompuServe-style bulletin board competitor) that was not connected to the Internet.

That same year Bill Gates surfed the Web for the first time and realized that he had been dead wrong. Microsoft's subsequent change of course is one of the most dramatic shifts in business direction in history. Microsoft Network was completely redesigned as the Web site MSN.com, and the company developed the browser Internet Explorer--free of charge, though certainly not under any open source license--in just 18 months.

So in the midst of the Unix wars, the browser war, and commercial competition, the emergence of open source software came as a surprise. The catalyst was Microsoft, or rather the reaction to Microsoft's monopolistic practices. The hacker community, Microsoft's competitors, the Federal Trade Commission, and Attorney General Janet Reno all held that Microsoft had used a combination of its early alliance with IBM and hardball tactics to build its proprietary operating system monopoly on the desktop. From that base, it is argued, Microsoft extended its proprietary standards into the market for large-scale business computing, formerly the province of mainframes and Unix-based network servers.

While the Internet at first appeared as a danger to Microsoft, the company also saw that success in molding those standards in a proprietary direction could extend the company's control throughout the whole world of corporate computing--again the same tactics that Sun and Netscape used earlier.

Microsoft responded with a combination of in-house software applications and development tools optimized for its proprietary standards, creating an all-pervasive computing environment that promised a one-stop-shop for any corporation. Windows and Microsoft Office might be less innovative than any particular competitor, but the company's very completeness across all sectors of computing would make up for its rigidity. That was seen as an advantage compared to the jungle of Unix standards. The IT industry had divided into a bunch of Unix camps and left customers uncertain that their needs would be met in the fragmented Unix environment. The Microsoft Windows NT solutions on standard PC hardware were also much cheaper than most Unix solutions, which in the end didn't hurt the buying process.

By the beginning of the 1990s the Unix business was competing dangerously against itself. The Unix promise of cross-platform portability got lost in the competition among a dozen Unix variants. As a result, no one saw Microsoft eating away at the Unix market from below with Windows NT. Microsoft grabbed a large part of the server market before the Unix vendors even realized what was happening.

By 1997 Microsoft Windows NT Servers were outselling Unix servers. It was clear that in the absence of strong standards and government support for such standards, proprietary models had a clear advantage in yielding market stability, which Microsoft was fast to exploit. Commercial powers controlled software research and development.

Free Software Foundation

In January 1984 one of the original MIT AI Lab hackers, Richard M. Stallman, quit his job at MIT to write free software full time; in 1985 he founded the Free Software Foundation. He objected to the increasing commercialization of university software research. Stallman feared that popular Unix standards like Sun's, although broadly distributed, still remained under private ownership and would be used for proprietary advantage--which is exactly what happened by the early 1990s.

Stallman's goal was to develop software that anyone could use at no cost, thereby implicitly helping software research. He started by designing a Unix-compatible operating system, GNU. The name was chosen in hacker tradition, as a recursive acronym for GNU's Not Unix. He made the system compatible with Unix so that it would be portable and so that Unix users could easily switch to it. The community of GNU programmers and users sought a nonproprietary Unix alternative to escape the Unix standards wars between competing commercial providers; commercial Unix systems were still expensive, and no one got the source code anyway. For various reasons the GNU operating system itself was delayed, so the work started with a C compiler and the editor GNU Emacs.

Underground, the hacker community worked with the GNU development tools much as it had in the 1960s and 1970s: on cheap hardware.

Linux

The history of Linux started in the summer of 1991, when Linus Torvalds, a technology student at the University of Helsinki in Finland, frustrated by the expensive and inferior alternatives, began hacking on the embryo of an Intel 386 Unix system as a hobby. As Torvalds humbly noted, it was only a personal hobby project, "nothing big and professional."

After a few months he had successfully written a working kernel. Although there was still much to be done, the project drew the attention of curious programmers when Torvalds announced it to a newsgroup on the Internet. In October 1991 he released the source code. Of the first ten people to download Linux, five sent back bug fixes, code improvements, and new features. It was the beginning of what became a global hack, involving millions of lines of code contributed by thousands of programmers. It was the beginning of one of the most spectacular software developments ever seen.

Torvalds' early inspiration was a Unix clone called Minix. Professor Andrew Tanenbaum at the Vrije Universiteit in Amsterdam, the Netherlands, wrote it for teaching purposes as a companion to his textbook Operating Systems: Design and Implementation (Prentice-Hall, New Jersey, 1987). It was never intended as a commercial system, and Tanenbaum had sacrificed performance, and some essential features of standard Unix systems, for clarity of code. Furthermore, it was copyrighted to the author, which forbade use or modification of the source code without his permission. Nonetheless, its size--small enough to run on a PC--and the availability of source code at no cost struck a chord among students as well as general users. Within two months of its release in 1987 there was a newsgroup, comp.os.minix, with over 40,000 users worldwide.

The first release of Linux, version 0.01, contained nothing more than a rudimentary Unix kernel, and it was still hosted by Minix (the kernel could not run without Minix). Moreover, even as Torvalds announced his project to the comp.os.minix newsgroup, he did not intend to make his system portable or, for that matter, as complete as a commercial Unix kernel:

Hello everybody out there using Minix--I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones. This has been brewing since april, and is starting to get ready. I'd like any feedback on things people like/dislike in minix, as my OS resembles it somewhat (same physical layout of the file-system (due to practical reasons) among other things) [...] I'd like to know what features most people would want. Any suggestions are welcome, but I won't promise I'll implement them :-) Linus (torvalds@kruuna.helsinki.fi).

PS. Yes - it's free of any minix code, and it has a multi-threaded fs. It is NOT portable (uses 386 task switching etc), and it probably never will support anything other than AT-hard disks, as that's all I have :-(

Posted at comp.os.minix August 25, 1991



By October 5, 1991, version 0.02 was able to run the bash shell that provided an interface for sending commands to the kernel, as well as gcc, the GNU C compiler. The source code was also released at this point:

Do you pine for the nice days of Minix-1.1, when men were men and wrote their own device drivers? Are you without a nice project and just dying to cut your teeth on an OS you can try to modify for your needs? Are you finding it frustrating when everything works on Minix? No more all-nighters to get a nifty program working? Then this post might be just for you.

As I mentioned a month ago, I'm working on a free version of a Minix-look-alike for AT-386 computers. It has finally reached the stage where it's even usable (though may not be, depending on what you want), and I am willing to put out the sources for wider distribution [...] This is a program for hackers by a hacker. I've enjoyed doing it, and somebody might enjoy looking at it and even modifying it for their own needs. It is still small enough to understand, use and modify, and I'm looking forward to any comments you might have. I'm also interested in hearing from anybody who has written any of the utilities/library functions for minix. If your efforts are freely distributable (under copyright or even public domain), I'd like to hear from you, so I can add them to the system.

Posted at comp.os.minix October 5, 1991



By the end of the year, when Linux finally became a stand-alone system in version 0.11, more than a hundred people worldwide had joined the Linux newsgroup and mailing list.

The immediate interest was due to the fact that the entire source code was available for free download to anyone who was interested in using and modifying the system. By releasing the source code for free at a very early stage and also updating the releases often, Torvalds quickly found help, support, and feedback from other programmers. Even as the first official version was released in 1994, changes were being made on a daily and weekly basis while Linux continued to mature into a powerful and versatile operating system.

After slightly more than three years, version 1.0 was released in the spring of 1994. Its stability was on a par with most commercial Unix systems.

The development of Linux continued at an accelerated pace even after the release of version 1.0, the first official Linux, in 1994. The latest kernel (August 2000) contains more than 3 million lines of code.

Almost a decade after its beginning, the success of Linux is indeed a surprise. Not only has Linux far surpassed the popularity of commercial Unix systems; many also regard Linux as the best brand of Unix. Linux represents the philosophy of Unix--simplicity, portability, and openness. It is the most widely ported system available today, and the only system besides Microsoft Windows NT that is gaining market share.

Linux was not developed by a small team in the traditional way of both commercial software development and freeware like GNU. It was developed by a huge number of volunteers, coordinated through the Internet by a project leader. Quality was maintained by the extremely simple technique of releasing frequently and getting feedback from many users.

Today Linux runs on everything from embedded systems to supercomputers, but originally it was targeted at only one architecture: the Intel 386. The basics were the same as in the 1970s: low-cost development and cheap hardware. But now the exploding Internet--cheap networking--is a new and very important factor.

Linux is neither the first nor the only open source software. Nonetheless, it deserves a special place within the history of open source software. Most important is the size of the Linux project. It is simply unique in the history of software development. The project has involved over 40,000 developers worldwide. Today Linux has an installation base of more than 3 million users worldwide, a figure that took Microsoft twice as long to reach. All the leading hardware and software companies are supporting the system.

Among the very first to note the evolutionary dynamics of the Linux project was Eric Raymond, a long-time hacker of free software. His interest in the Linux project stems from his own amazement at its unconventional development model when it caught his attention in 1993. As he recalls in his essay "The Cathedral and the Bazaar":

"[the success of] Linux overturned much of what I thought I knew [about software engineering]."

In May 1997, Eric Raymond first delivered "The Cathedral and the Bazaar" as a speech at the Linux Congress in Bavaria. It was a milestone in open source. He published the speech, with minor edits, in February 1998 (changing "free software" to "open source"), and it was finally published as a book (O'Reilly & Associates, 1999). Since then, Raymond has studied the dynamics of the Linux open source project. His analysis, summarized in a series of much-celebrated essays, is the result of several years of extensive participant observation and his own experimentation with open source projects.

Like Richard Stallman, he was concerned about the vulgar commercialization and the monopolistic path the software industry had taken. At the conference and soon thereafter, he gathered a group of prominent people in the free software community. He wanted to find a way to promote the free software ideas (essentially because he wasn't happy with Microsoft), but he was also concerned that the Free Software Foundation's strongly political, antibusiness message was keeping the world at large from really appreciating the power of free software.

The group formed itself into the Open Source Initiative and devised what was largely a marketing campaign to explain the concept. They coined the term open source and crafted a series of guidelines describing software that qualifies as open source.

Cathedrals and Bazaars

In January 1998, largely inspired by "The Cathedral and the Bazaar," Netscape announced that it would release its browser source code, and it announced its support for Linux. With Microsoft's proprietary approaches to the Internet fast gaining ground, Netscape and soon other actors reluctantly watched their alternative commercial, proprietary standards quickly losing ground. They saw the global open source software model as an opportunity for survival: they would forgo some profits in order to maintain innovation as the priority that could give them an advantage in technology development.

Microsoft had targeted Netscape for destruction. As we will see, the methods were considered dubious and later led to an antitrust lawsuit. For Netscape the issue was less about browser-related income than about changing the playing field for the much more valuable server business.

Netscape's decision to give away the browser under the Mozilla Organization and the Mozilla Public License shouldn't have come as a surprise, but the release of the source code really stunned the industry. It hit the headlines of leading business newspapers around the world; even the open source community was surprised by the move. Never before had a major software company opened its proprietary code. The body of Communicator source code at Netscape was called Mozilla; now the name came into use as the generic term for the open source Web browsers derived from the source code of Netscape Navigator.

It is interesting to note that Netscape was beaten by its own original strategy of bending the protocols into proprietary channels. Mozilla had once targeted Mosaic; now Microsoft Internet Explorer was beating Netscape Communicator with the same tactics.

After months of internal discussion at Netscape about whether to release the binary for free, the decision to free the source itself reached critical mass in an unbelievably fast 24 hours, a move without parallel in the industry. As a journalist with 15 years' experience in the IT industry, I have never seen such a radical and revolutionary move by any commercial company. It took Microsoft 18 months to rebuild its business strategy around the Internet--and that is considered very fast.

Netscape had a lot to do to make the source code ready for public release. One of the largest issues was how to handle all the third-party modules included in the browser. Netscape Communicator contained over 75 third-party modules in its source, and every code owner had to be approached. Each third-party company had the choice of removing its code, replacing it, or shipping it as open source along with Mozilla. To complicate matters, many of the third-party contracts were unique and ran for different lengths of time. And not only did the inclusion or exclusion of each piece of third-party code have to be resolved; all comments had to be taken out of the code as well.

Parallel to the code cleanup ran the license effort. A group of open source celebrities, including Linus Torvalds and Eric Raymond, was invited. The team scrutinized the GNU General Public License, the GNU Library General Public License (LGPL), and the Berkeley Software Distribution license. After a month of research and discussion at meetings, the team decided that a completely new license had to be crafted for this unique situation.

The team came up with the Netscape Public License (NPL), a compromise between promoting free source development by commercial enterprises and protecting free source developers. The license itself was developed according to the principle of open source peer review.

When the first draft of the NPL was complete, it was beta-tested publicly according to the same principle. On March 5, 1998, a draft was posted in a new newsgroup called netscape.public.mozilla.license, with a request for comments. Parts of the draft drew fierce criticism.

On March 21, 1998, a revision was posted. The reaction was astonishment: "I told them it was awful and they listened! I can't believe it!"

No one had expected a big commercial company to make this move: first going open source, then really listening to users. The open source community realized that this was a true open source project, as the discussions guided the process rather than merely providing commentary on its results. The result was the release of the Mozilla Public License (MozPL).

It was decided that all of the original source code would be released under the NPL and that all modifications to that code must likewise be released under the NPL. New code, on the other hand, could be released under the revised MozPL or any other compatible license; new files that contain none of the original or subsequently modified code are not considered modifications.

All Netscape open source projects were placed under Mozilla.org. Its goal was to act as the coordinator for the software--a role like Linus Torvalds' veto role in Linux--that is, to decide what code is accepted and what is not.

A separate organization, Netscape Product Development, had the purpose of shipping Netscape products based on the Mozilla code.

On March 31, 1998, the Navigator source code was released as Mozilla. Within hours, fixes and enhancements began pouring in off the Net. That was the starting point for an avalanche of other announcements from other commercial companies adopting the open source movement.

Others Follow Netscape's Example

On May 11, 1998, Corel Corporation announced a plan to port WordPerfect and its other office software to Linux. On May 28, 1998, Sun Microsystems and Adaptec joined Linux International--the first two large established OS and hardware vendors to do so.

The real breakthrough for open source software came on June 22, 1998, when IBM announced that it would sell and support the open source Web server Apache as part of its WebSphere suite. On August 10, 1998, Sun Microsystems, clearly feeling the pressure from open source, made its Unix operating system Solaris available under a free license to individual users and to educational, nonprofit, and research institutions.

The most popular Web server on the Internet is neither Netscape nor Microsoft's Internet Information Server, but rather a free, open source server called Apache.

After NCSA developed its Mosaic browser and its original server software, the institute, as part of government privatization, ceased to update the software. Instead, a disillusioned group of hackers, some at universities and some in private business, began collaborating in 1995 to update the NCSA server privately in the Apache project (as in a software patch). Most of the programmers participated for the fun of it, others for political reasons, to protest the commercialization of the Net, and still others because they depended on the software but couldn't afford a commercial server.

The result was overwhelming. (Then again, it shouldn't come as a surprise: the software is free of charge as long as you manage the installation, service, and support yourself.) Apache is used by 44 percent of Internet sites, compared to 16 percent for Microsoft IIS and 12 percent for Netscape's server. The list of sites using Apache includes McDonald's, Yahoo, CBS, the FBI, and IBM. The latter passed over its own Lotus Domino server in favor of Apache when it put its Big Blue vs. Garry Kasparov chess match on the Internet.

In the last week of October 1998, a confidential Microsoft memorandum (www.opensource.org/halloween) on Redmond's strategy against Linux and open source software was leaked to the Open Source Initiative. The memo was written in a rather hostile tone.

Eric Raymond of Open Source Initiative annotated the memorandum with explanation and commentary over Halloween weekend and released it to the national press as the Halloween Document. The memorandum applied to open source software in general and assessed the potential destructive power of open source software, suggesting means by which Microsoft could combat this threat.

The memo made headlines. The first document contained references to a second memorandum specifically on Linux and Apache. Within days, copies of the second memo were made public. It made even more headlines and started a week-long furor in the media.

The memos were originally distributed within Microsoft on August 11, 1998. It is important to note that these memos were never an official statement by Microsoft; they were intended to stimulate an internal discussion on the open source model and the operating system industry. It is also important to note that they represent one engineer's assessment of the market at one point in time.

However, the press took the fierce tone of the document as confirmation of Microsoft's plans for dirty tricks against Linux and other open source projects. The hacker community took it as proof of the evilness of the capitalist software industry. The timing was bad, too: Microsoft was just then being prosecuted in the antitrust trial.

Microsoft first ignored the documents, then denied their existence. The huge press coverage forced Microsoft to acknowledge their authenticity, though officials said the documents were confidential company information that had been released without authorization. Finally, the company ate humble pie, and Ed Muth, Enterprise Marketing Group Manager at Microsoft, commented on the content:

Linux is a competitor on the client and the server. My analysis is that Linux is a material competitor in the lower-performance end of the general purpose server industry and the small to medium-sized ISP industry. It is important to recognize that Linux, beyond competing with Microsoft, is also, and perhaps even more frequently, an alternative or competitor to other versions of UNIX.

The operating system industry is characterized today by vigorous competition. This competition, of which Linux is only a part, exists at the technology level as well as in terms of business models, applications, channels and alliances.

To better serve customers, Microsoft needs to innovate above standard protocols. By innovating above the base protocol, we are able to deliver advanced functionality to users. An example of this is adding transactional support for DTC over HTTP. This would be a value-add and would in no way break the standard or undermine the concept of standards, of which Microsoft is a significant supporter. Yet it would allow us to solve a class of problems in value chain integration for our Web-based customers that are not solved by any public standard today. Microsoft recognizes that customers are not served by implementations that are different without adding value; we therefore support standards as the foundation on which further innovation can be based.

www.microsoft.com/NTServer/nts/news/mwarv/linuxresp.asp



Today, open source is widely accepted as a professional business model. It is considered a valid alternative to the Windows NT Server. Oracle, IBM, Cisco, Hewlett-Packard, Intel, Sun, and all the other major providers of enterprise solutions have embraced Linux or ported their software to Linux.

Summary

The history of the Internet is grounded in a culture of freely shared information. That culture shapes the open source business model we now see emerging and transforming the whole industry. Academic research is built on cooperation, not competition; open source, too, is built on cooperation.

The sharing of software was a key part of the success of the Internet. Both paid staff and volunteers at the universities received and provided free software, and they constantly improved its functionality, without any distribution costs. This gift economy allowed new innovations to be quickly tested and improved.

This is the key mechanism and the very foundation of open source, but it conflicts with the economic interests of the software and media industries. A large part of the modern industry rests on protecting intellectual property through patents, copyright, and trademarks. Open source fundamentally changes this paradigm.

