The Long Boom: A Vision for the Coming Age of Prosperity

Overview

"The Long Boom argues that we are on the verge of an unprecedented global economic expansion, and that the choices we make today - as informed individuals, institutions, communities, and nations - will determine whether its full promise is realized. Analyzing economic, political, technological, and socio-cultural trends that began to converge in the early 1980s, the authors offer a vision of how the next twenty years will unfold."--BOOK JACKET.

Editorial Reviews

Publishers Weekly
Based on an article originally published in Wired, this book suffers from the expansion, as the authors have to keep finding ways of telling readers that things will be great in the near future. Schwartz is chairman of Global Business Network, a consulting firm; Leyden was managing editor of Wired; Hyatt teaches at the Stanford Graduate School of Business. Written with the same optimism about the economy as Dow 36,000 (Forecasts, Aug. 30), this sunny look at the future goes beyond the stock market to take an upbeat gander at the way we will live in the next century. To say that the authors are bullish is an understatement. Alternative energy sources, biotechnology and increased productivity figure prominently in their rosy scenario--the key to which is continued and extended economic growth in the developed world, which will trickle down to the developing world and create a global middle class. But in their zeal to describe how all parts of the world will participate in and benefit from the long boom, the authors make sweeping and potentially offensive generalizations: Asians, for example, while not good at "improvisation," are "extremely adept at mastering set courses and memorizing--far better than" Westerners. The authors are on safer ground discussing technology, but their attitude toward this future is entirely passive. They give the impression that we will all sit back and marvel at the forthcoming human accomplishments, and that this will provide the chief pleasure in the future. Yet their vision is exciting, and the authors articulate it with the panache of Alvin Toffler--and the kind of wide-eyed confidence in the future that characterized the 1939 World's Fair. (Oct.) Copyright 1999 Cahners Business Information.
From The Critics
In October 1989, the Atlantic Monthly published a cover story titled "The Coming Global Boom," in which author Charles R. Morris perversely argued that the 1990s would be a decade of great prosperity. Interest rates would fall, technological advances would boost productivity, and Japan, which then seemed invincible, would falter. The budget deficit? An accounting error - the states were in surplus. The trade deficit? A vote of confidence from foreign investors. Industrial policy? A waste of time.

Now remember, this was 1989. Morris hadn't just gone out on a limb. In those dark days he seemed perched on a mere twig. When he expanded his article into a book, a Wall Street Journal reviewer mocked him.

"If he really believes interest rates and inflation are coming down," wrote economist Susan Lee, "then locking in a lot of long-term Treasuries would be the way to go. But, no. Mr. Morris classifies those as speculative, and instead recommends that investors diversify their portfolio of stocks and invest for the long term. Money magazine couldn't have done better."

Morris wasn't just stupefyingly prescient (the Dow Jones Industrials are up some 350 percent since his article). He also helped to revive optimism as a legitimate strain of thought amid the Chicken Little chatter that inevitably dominates the media. (Remember Ravi Batra's Great Depression of the 1990s?) Currently, optimism's most persuasive upholder is Gregg Easterbrook, who has argued in the New Republic and elsewhere that things really are getting better, not worse.

Comes now a trio of technoseers who make Morris and Easterbrook seem downright gloomy. In The Long Boom, a book that grew out of a 1997 article in Wired, Peter Schwartz, Peter Leyden and Joel Hyatt lay out a vision of the future so rosy that readers will feel immersed in Pepto-Bismol. The authors believe that - thanks largely to the technological revolution now under way - we're about midway through a 40-year boom that could transform life as we know it.

I emphasize "could" because, despite all signs to the contrary, they insist their book is not a prediction, but rather a vision. Their goal is not only to let us know where we stand, but to get us doing what needs to be done to realize that vision. Thus, the authors outline 10 guiding principles to use in the new era. Since we are all busy, I will quote the highlights: "Go global in all things. Open up in every capacity. Let go of all tendencies to control. Grow more in both economic and personal terms. Always adapt to all the changes swirling around."

What does it mean to transform life as we know it? Well, by the middle of the next century, the authors assert, people will normally live to be 120. Air pollution will largely be cleaned up, thanks to hydrogen-powered fuel cells and the end of the internal combustion engine. Affluence will spread across the globe, and in places where they are still little more than chattel, women will finally approach equality. Expect a return to space, as well.

It's all about networking. Technology is facilitating the rapid exchange of people and ideas, bringing the world closer, undermining distant centralized authorities and fostering remarkable economic growth. The authors figure that the main threat to all this - aside from human perversity - is global warming, but they're convinced we can lick this one the same way the nations of the world got together to ban chemicals that were destroying the ozone layer.

The authors are learned and enthusiastic, but there's a lot to dislike in this book. It's hokey in places and simply written throughout - so much so that one feels almost patronized. Worse, it ignores many potential problems that could prove fatal to its hopeful scenario. Mutant viruses, nuclear proliferation and terrorism - to name just three - may flourish with disastrous consequences in the decades ahead, in part because of the rapid globalization and technological advances that the book extols.

There also are a few mistakes. There were fax machines before 1985 (remember the spinning drum? the acoustic coupler?), and Ronald Reagan neither "began to dismantle the old bureaucratic industrial economy" nor "tighten[ed] monetary policy in an attempt to rein in inflation." Guys, does the name Paul Volcker ring any bells? Tall fella, cigar, used to run the Federal Reserve? You couldn't miss him in those days. Reagan's policy, if you can call it that, was fiscal, not monetary, in the form of tax cuts without spending cuts.

Given all this, it's almost embarrassing to admit how much I agree with these guys. Optimism, after all, may not have much to do with evidence. It may be genetically predetermined, like height, or it may be easily learned, like accented speech. I happen to believe that these are the best times in human history, and I think the authors are onto something: If we play our cards right - a big "if," I'll admit - things will get even better.

I like to think this judgment is based on the evidence, and The Long Boom provides plenty. India by now has a middle class of 300 million, more than the entire population of the United States. Life expectancy for Americans during the 20th century rose from 47 years to 75. This is to say nothing of the collapse of communism in Europe, the relatively bloodless end of apartheid and the magical spread of democracy across South America. Painless dentistry, cheap airfares, falling crime, cleaner air, fewer traffic fatalities - one could go on and on.

It's easy to make fun of Pollyannas. But these days, it's a lot harder to refute them. The Long Boom is kind of a long haul, but what counts is being right in the end.


Daniel Akst writes frequently about business and technology. He is the author of St. Burl's Obituary, a novel.

Business Week
The Long Boom is a welcome relief from the pessimism of conventional economics.
Daniel Akst
Peter Schwartz, Peter Leyden and Joel Hyatt ... believe that - thanks largely to the technological revolution now under way - we're about midway through a 40-year boom that could transform life as we know it ... Their goal is not only to let us know where we stand, but to get us doing what needs to be done to realize that vision. —The Industry Standard

Product Details

  • ISBN-13: 9780738200743
  • Publisher: Basic Books
  • Publication date: 9/9/1999
  • Pages: 352
  • Product dimensions: 6.43 (w) x 9.57 (h) x 1.21 (d)

Meet the Author


Peter Schwartz is co-founder and Chairman of the Global Business Network. Author of the international bestseller, The Art of the Long View, he lives in Berkeley, California.

Peter Leyden, former managing editor of Wired magazine and special correspondent in Asia for Newsweek, has written and spoken about technology, economics and politics since the mid-1980s. He lives in Berkeley, California.

Joel Hyatt teaches entrepreneurship at Stanford Business School and is Executive Managing Director of Idealab!, a leading incubator of Internet companies. He lives in Atherton, California.

Visit the authors' website at www.gbn.com.

Read an Excerpt




Chapter One


The Great Enabler


Computer and telecommunication technologies are creating a fundamentally new infrastructure upon which our twenty-first-century world will be based. This technological base makes possible a much more efficient and productive knowledge economy and leads to rapid advances in other technologies and fields.


* * *


The single most important event to fall on the 1980 dateline was the introduction of the personal computer. Apple Computer had technically introduced its first personal computer in 1977, and the first IBM PC came out in 1981. At the time, the personal computer didn't seem like all that big a deal—despite the revolutionary rhetoric of its proponents. Apple's cofounder, Steve Jobs, had said the personal computer would change the world, and frankly, he was right. The introduction of the personal computer will probably go down in history with other key dates like the introduction of the printing press in the 1450s. The personal computer, like the printing press, had the potential to change almost everything. The personal computer was the key tool that allowed the "centralisms" of bureaucratic corporations and welfare states and communism to be undermined. We had had computers long before—ever since World War II—but they had been huge mainframe computers that centralized all tasks and were run by elite engineers and technicians. The significance of the personal computer was that it decentralized computer-processing power—and thus facilitated the decentralization of decision making and raw power itself, which had previously tended to move toward the center. The personal computer was the key tool that began to reverse that trend and to allow the spread of power to the periphery. It would become The Great Enabler in more ways than one.

    To be sure, those first personal computers were no match for the mainframes in their processing power. But personal computers were riding an amazing developmental dynamic unheard of in other industries. This dynamic was known in the computer industry as Moore's law, after Gordon Moore, one of the founders of Intel, the maker of microprocessors, which are the brains of personal computers. Moore was the first to recognize that every eighteen months, on average, engineers were able to double the number of circuits and components that fit within the same computer chip, essentially doubling the speed, or power, of that chip. The effect of Moore's law was that every eighteen months, a new computer chip came out that essentially doubled in power for the same price. This dynamic has been playing out ever since the very first microprocessors were invented in the 1970s. The results are truly astounding: Microprocessors for today's $250 video games are as powerful as mainframe computers that cost $14 million in about 1985.
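
    To make that compounding concrete, here is a minimal sketch, in Python, of the arithmetic implied by an eighteen-month doubling cadence; the spans chosen below are the editor's illustrative examples, not figures from the book.

```python
# A minimal sketch of the doubling dynamic described above: chip power
# doubles roughly every eighteen months. The specific spans below are
# illustrative assumptions by the editor, not figures from the book.

def power_multiplier(years: float, doubling_period_months: float = 18.0) -> float:
    """Return how many times more powerful a chip becomes after `years`."""
    doublings = years * 12.0 / doubling_period_months
    return 2.0 ** doublings

# 1985 to 1999 is 14 years: roughly 9.3 doublings, a gain of several
# hundred times, the kind of gap that lets a cheap game console rival
# an old mainframe.
print(f"14 years at an 18-month doubling: ~{power_multiplier(14):,.0f}x")
print(f"40 years at an 18-month doubling: ~{power_multiplier(40):,.0f}x")
```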

    Right before the advent of the personal computer, the movie Star Wars came out and created a modern mythology set in a distant time of interstellar space travel. The film has classic archetypal characters and deals with transcendent themes like the never-ending struggle between good and evil. The evil is personified in Darth Vader, a huge, hooded villain dressed in sweeping black robes. The good is personified in Luke Skywalker, a young, smart, struggling hero who pulls together a scrappy band of rebels with good hearts. In a strange way, that movie mirrored the drama that would play out in the technological world, a drama that, squeezed down to its essence, was the battle over technological openness. The bad guys were into control and proprietary standards. The good guys fought to open up technology for everyone. They fought for open standards, compatible technology, interoperable computer languages, and, ultimately, open source code. So far, the good guys have won each round, though we're at another crucial juncture right now. We argue that openness always wins in the end. The most open strategy in technology as well as in other areas like economics is always the one to bet on—especially in the long run. But the forces for evil, those trying to hang onto control, often seem invincible—until they fall.


Act I: IBM, Bad; Microsoft, Good


The technology drama has had several main acts, and as in all good drama, the good guys have sometimes turned into bad guys. But in the end, we could have a happy ending. In the first act, the curtain rises on 1980, with IBM as the bad guys. They controlled the computer world and came dangerously close to being deemed a monopoly. Because they were so big and powerful, many thought that the only countervailing force of sufficient power was the U.S. government. In fact, like two sumo wrestlers in an endless match, the government and IBM had been locked in an ongoing anti-trust lawsuit through the late 1970s. The real protagonist, though, turned out to be a character right out of Star Wars. In 1975, Bill Gates, with his college buddy Paul Allen, had founded a software company called Microsoft, selling an operating system for personal computers. These two approached IBM offering to provide their operating system for the personal computer that IBM was reluctantly going to manufacture using Intel's chips. IBM had little more than disdain for these personal computers, which seemed so pathetic compared to the powerful mainframes, but realized that the PC might provide yet another market to move into. So IBM enlisted Microsoft and Intel to help put together the IBM PC.

    Darth Vader had opened himself up to a mortal saber thrust. The personal computer exploded in popularity through the 1980s as businesses bought the increasingly powerful machines. Both Microsoft and Intel were perfectly positioned to ride that boom. They became known as Wintel because every personal computer sold with an Intel chip also came with Microsoft's evolved operating system, called Windows. And by the 1990s, more than 90 percent of all personal computers were Wintel. What happened to Apple Computer? It was supposed to be the revolutionary, and it was for much of the 1980s. But Apple and Microsoft played out a little minidrama through that decade around the same theme of technological openness. Those two companies had the two most prominent operating systems, and Apple's was technologically better, but this time, Apple stayed on the dark side of control and would put its software only on its own hardware: If you wanted the Apple software, you had to buy the Apple machine. Microsoft stayed out of the hardware business altogether and designed its software applications to fit any computers—including those of Apple. So Apple took the closed strategy and Microsoft took the open one. Apple ended up almost bankrupt and became a niche player. By the end of the 1990s, Microsoft had become the most valuable company in the United States, and Gates the richest man in the world. Open clearly won that time.

    Another key development took place around 1980 in another technology field: telecommunications. In 1982, the U.S. government, while grappling with IBM, entered into a consent decree that broke up the monolithic AT&T, the "centralism" that had dominated telecommunications for decades. That breakup of the Bell phone system, which formally happened in 1984, was the watershed event that unleashed a great wave of innovation in this crucial field. In the wake of that breakup, we saw the rise of competing long-distance phone firms like MCI and Sprint, which scrambled to build their own networks of high-capacity fiber-optic cables around the country. We saw the building of vast wireless phone networks that led to the mobile phone craze. We watched the construction of a parallel cable television network in every city in the country. In short, the 1980s produced an explosion of entrepreneurial zeal that had been long lost in telecommunications. And the best was yet to come as the telecommunications world began to align with the world of computers.

    Throughout the 1980s, the increasingly powerful personal computers had spread throughout the business world. Businesses, which were able to afford the relatively expensive new tools and could capitalize on the investment, became the early adopters of this new technology. By the early 1990s, though, a key development had taken place. The isolated personal computers began to be tied together in networks. At first, they were in small networks within offices, but soon they were networked between offices and within larger organizations. Ultimately they began getting tied together in even larger public networks. Until then, the personal computer had essentially been a glorified typewriter or a number cruncher. Through networking, the personal computer became a communicator, and that shift from calculator to communicator made all the difference in the world. Calculators are useful, but limited. Most people don't really want to sit around and crunch numbers. They want to communicate, and communication is at the heart of everything we humans do. So with a change in the way we communicate comes a change in the way we do almost everything else.

    The advent of the Internet, the networking together of all computers and the merging of the computer and the telecommunications worlds, led to an explosion of growth in both fields throughout the 1990s. Almost anything that had to do with digital technologies and telecommunications showed the same unmistakable growth through the 1990s. You can track the number of microprocessors sold, the feet of fiber-optic cable laid, the rise in cell phone subscriptions, the explosion in Web sites—anything to do with these fields has taken off since 1990. The number of Internet hosts, or servers, alone has been doubling almost every year since the introduction of the Mosaic Web browser in 1993. By 1998 that number had passed 30 million and was continuing to climb. The number of Internet users that year topped 160 million. Personal computers have saturated the business world, have moved from business into the home, and have begun spreading from the developed to the developing world.

    Impressive as the growth has been so far, the really big expansion is yet to come. The personal computer still appears in only about half of all American homes. Compared to the adoption of other key technologies like television, personal computers are only halfway up the adoption curve. And the adoption percentages for households are much less in almost all other developed countries—let alone the developing countries, which are only now adopting personal computers for their businesses. From a global perspective, the bulk of the growth in personal computers is clearly ahead. Also, microprocessors are increasingly being embedded in all the tools and appliances that we use in our lives. With time, the personal computer per se will fade in its significance, and these computerized tools will take over the functions of the old personal computer. But that shifting is just beginning, and the growth in the use of these embedded chips lies almost all ahead of us. Ultimately, in the first two decades of the twenty-first century, we will create what is called ubiquitous computing, an environment where we are surrounded by these cheap, powerful chips and almost effortlessly interact with them—taking them completely for granted.

    The forty-year period between the introduction of personal computers and the full arrival of ubiquitous computing is not arbitrary. Previous technology adoption patterns, such as those for television and radio, had similar four-decade spans. Businesspeople in Silicon Valley frequently use a graphic called the technology adoption curve to show the slow ramping up of any new product's penetration in a market from zero to 100 percent. It starts out slowly with early adopters, then rapidly expands when the majority catch on, and then tapers off as even laggards finally buy the product. Historically, we've seen this pattern happen again and again. It takes roughly forty years for a fundamentally new technology to be fully adopted by a society. One of the primary reasons has less to do with technology and more to do with people. Over the course of forty years, societies see a complete changeover between generations. The generation that enters the era as young people in their twenties and thirties, just out of formal training, ends the era moving into their sixties and seventies, letting go of their positions of power and entering retirement. Meanwhile, their children, who were born in the first couple of decades of the era, emerge at the end as young adults ready to make their full impact on the world. Members of the new generation spend their entire young lives surrounded by the era's new technologies and steeped in the new ways of doing business and running the world. They implicitly understand the new system by the time they reach their twenties and thirties and begin to make meaningful contributions to the economy and society. When they take over, the whole system really works. In fact, one of the key reasons we define the Long Boom as a forty-year era is that a complete generational changeover will occur in that time span. That generational transfer will come on top of a full adoption of this revolutionary computer infrastructure.
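
    The S-shaped ramp the authors describe can be sketched with a simple logistic curve; in the snippet below, the forty-year span, midpoint, and steepness are illustrative assumptions rather than parameters given in the book.

```python
# A minimal sketch of the technology adoption curve described above,
# modeled as a logistic (S-shaped) function. The forty-year span, the
# midpoint, and the steepness are illustrative assumptions, not figures
# from the book.
import math

def adoption(year: float, start: float = 1980.0, span: float = 40.0,
             steepness: float = 0.3) -> float:
    """Fraction of society (0 to 1) that has adopted the technology by `year`."""
    midpoint = start + span / 2.0  # adoption grows fastest at the halfway mark
    return 1.0 / (1.0 + math.exp(-steepness * (year - midpoint)))

for year in range(1980, 2021, 10):
    share = adoption(year)
    print(f"{year}: {share:6.1%} {'#' * int(share * 40)}")
```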

    The growth in telecommunications is mostly ahead as well. We still haven't wired up even half the personal computers. Only about a third of adults in the United States use the Internet now. And we are still struggling to beef up the bandwidth, or capacity of the wires, to handle even adequate levels of digital traffic for basic graphics. We are at the beginning of a huge effort to greatly expand that capacity so that within the coming decade we will be able to move the huge computer files that carry video images and full-length movies over our landlines and into our homes. We're just now embarking on efforts to establish satellite networks that will allow phone and Internet access from every square foot of the planet. The Iridium global phone was the first of this generation to become operational in the fall of 1998. There are half a dozen more projects already in motion to compete with Iridium or to expand the capacity from phones to full, high-bandwidth Internet access. The most promising of these is the Teledesic project, due for completion in 2004 and partly bankrolled by none other than Bill Gates.

    Meanwhile, the telecommunications infrastructure is moving toward a common open standard, if you will. Until now, the main traffic on the global telecommunications grid has been the human voice, and telephone traffic has far overshadowed computer data. But that balance has started shifting during the 1990s—prompting a huge technological debate. Voice has always been transmitted via fixed circuits in the world of telephony. That means that a person placing a phone call to someone across the country first establishes an open connection that reserves a circuit of the telecommunications grid that literally spans that distance. Computer data, on the other hand, are transmitted via packets over the Internet and other protocols. All computer messages or transmitted data on the Net are broken into smaller bits, called packets, that then individually make their way across various routes in the telecommunications grid, to be reassembled at the final destination. These are fundamentally different approaches to using the telecommunications infrastructure, yet they are not equally useful. For one thing, telephony can be transmitted via packets and, many argue, much more efficiently. For another, all other media are going digital. Music is already digital on compact discs. Digital television is being developed in parallel to old analog TV. Mobile phones are shifting to digital options. The common language is the bits and bytes of computer code. So we're starting to see some of the key players in the world of telephony positioning themselves for the leap. The world is now heading toward a future in which one common telecommunications infrastructure of a network of networks will be able to handle all communication. This development, too, will stimulate growth.
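
    As a rough illustration of the packet approach described above (a toy sketch only, not any real protocol), a message can be cut into numbered packets, sent independently, and reassembled in order at the destination:

```python
# A toy sketch of the packet idea described above: a message is cut into
# numbered packets that may travel by different routes (simulated here by
# shuffling) and are put back in order at the destination. This illustrates
# the concept only; it is not a real network protocol.
import random

def to_packets(message: str, size: int = 8) -> list[tuple[int, str]]:
    """Break a message into (sequence_number, payload) packets."""
    return [(i, message[i:i + size]) for i in range(0, len(message), size)]

def reassemble(packets: list[tuple[int, str]]) -> str:
    """Sort packets by sequence number and rejoin their payloads."""
    return "".join(payload for _, payload in sorted(packets))

message = "Voice, video, and data all become the same bits on the wire."
packets = to_packets(message)
random.shuffle(packets)  # packets arrive out of order, as if routed separately
assert reassemble(packets) == message
print(f"{len(packets)} packets delivered and reassembled correctly")
```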

    Bill Gates understands better than most that we're on the front end of growth in digital technologies—not the back end. In a conversation in 1998, he remarked that everyone seems fixated on the growth Microsoft had enjoyed in the previous twenty-five years. True, by 1998 Microsoft was the world's most valuable company based on its astronomical stock prices, but Gates said the real growth all lay ahead. He compared the computer industry in the late 1990s to the auto industry in the late 1920s. The auto industry had enjoyed substantial growth in the previous twenty-five years, but nothing like the growth of the next twenty-five and beyond. The same holds true for the computer industry today. The real growth is ahead, not behind. What's more, that growth is inevitable. Inevitable. Gates attributed the success of his company to that insight. He said Microsoft simply "tracks the inevitable." It looks where the technology is obviously heading, then positions its business plan to correspond to that technological trajectory. Once the breakthrough was made, it was inevitable that computer power was going to decentralize into personal computers. Microsoft saw that decentralization early and capitalized on it. Once the power of networked computers was demonstrated, it was inevitable that all computers would be wired up through the Internet. When Microsoft recognized that, it rapidly repositioned its business strategy. Looking forward, it's inevitable that computer chips will spread throughout the business world, into our homes, and across the planet. They aren't going to stop spreading. We aren't going to quit computerizing. You can count on it. It's inevitable.


Act II: Microsoft, Bad; Silicon Valley, Good


As in any good story, though, Act II, in the 1990s, brought a surprising role reversal—in this case involving Gates and his company. Microsoft had grown into an upgraded version of IBM: the control freak, the bad guy. Microsoft sat on top of a virtual cash machine because the costs of replicating software were minimal, yet Microsoft could charge a premium for its software, which every machine needed. So it regularly would rack up after-tax profits of between 30 and 40 percent of revenues, and it had cash reserves of billions of dollars to hire any top programmer, or buy any promising company, or move into any new market. Gates had long ago moved his company to his hometown of Seattle, which became a burgeoning base for the Microsoft crowd. The perceived good guys at this time were the network of smaller companies rooted in California's Silicon Valley, though they included some companies technically outside the area. In effect, this camp was the anybody-but-Microsoft crew, and in an odd twist of fate, a wised-up IBM threw its still substantial weight behind these innovators. What unified this coalition, beyond fear and hatred of Microsoft, was the belief that the Internet had shifted the paradigm of computing. They believed the Net, or the network, would undermine the importance of Microsoft's operating system by moving beyond the world of isolated personal computers. When every computer was an island, Microsoft ruled the land. But once all computers became connected, many other possibilities opened up.

    The first major shot fired in this battle came from Netscape, a company that produced the first user-friendly interface with the World Wide Web. The Web allowed a graphic dimension to the Internet, which until then had been all text. Once the Internet became visual and easy to navigate, its popularity took off. Netscape rode that initial takeoff and became a very valuable company almost overnight. But Gates shifted strategy quickly, coming up with a competing Web browser, and the huge base of Windows users started migrating to it. Netscape's fortunes soon began to fade. The next major shot from the Silicon Valley coalition came from Sun Microsystems, which came up with a computer language called Java that was specially suited to the Internet and that could work on any machine. Sun preached that "the Network is the Computer," meaning that personal computers would increasingly do less work on their own and would connect with larger computers, called servers, that would do the bulk of the work. As a result, the operating system and the applications on the personal computer would diminish in importance, as would Microsoft, and applications that worked over the Internet would increase in importance, as would the servers, which Sun happened to make. Alas, Microsoft made server software, too, and Microsoft's marketing muscle brought it an expanding piece of that pie. And then Microsoft adopted the Java language but developed a modified version that worked best with the rest of the Microsoft applications suite.

    There were other major thrusts and parries throughout the 1990s. No matter what the Silicon Valley coalition did, Microsoft somehow countered. So the Silicon Valley crew eventually turned to its last weapon. This group, which had often prided itself on its libertarian impulses and its disregard for government, looked to Uncle Sam for help at the end of the 1990s. As in the battle with IBM twenty years before, the U.S. government seemed to be the only player big enough to take on Microsoft. But as in that other saga, the government may not provide the real answer. In the end, a new cast of characters probably will.


Act III: Open Source, Good


Act III in the battle over technological openness is taking place right now and is starring some new actors. The good guys are a ragtag band of programmers, hackers, and idealistic computer nerds spread all over the world. Some of them work in big companies, some in smaller firms, and many of them on their own. The label ragtag is no reflection on their skills. They are as smart as anybody when it comes to understanding computers—as smart as the best that Microsoft or Silicon Valley has. It's just that these people are completely into the technology, not just making money from their skills. They just love the challenge of creating great code, sharing their discoveries with a community of peers, and getting the feedback that will improve their innovations. They pay their bills with what could be called day jobs, but their real passion is the code, the code. The software innovations that they design in the day to solve their particular problems, they then pass on to the community over the Net at night. Much of their spare time is spent working on collective software projects. The Net has become the means of unifying this network of techies. It is the tool through which these people self-organize.

    There has long been a code of honor on the Internet to share ideas for the greater good. When the Internet was little more than an academic medium, this code of honor simply continued the tradition of the scientific method. In science, those who make a discovery are obligated to share the results and to explain clearly how they arrived at their conclusions, so that the entire scientific community can subject the discovery to rigorous analysis. If the discovery passes that scrutiny, it becomes common intellectual property, and everyone gets to reap the benefits. This mentality is not due to some socialistic impulse but is the best way that science can move forward. If the results of one person's discovery are quickly distributed, then no time is wasted in rediscovering that piece of knowledge. It can just be used, refined, built on, or taken in another direction that pushes the boundaries of collective knowledge further still. Many computer programmers work the same way. When they write a great piece of code that solves a common problem, they want to share that knowledge, partly for the kudos but mostly because they want to help everyone out. If they figure out a software patch that fixes a bug in an existing program, why hide the results? The benefit of having the exclusive solution is minimal in the long run. Besides, if everyone hides results, everyone fails to benefit from the thousands of great ideas being discovered all the time. If we all share our code and our ongoing innovations, we all get to benefit from the great collective mind, and together we'll be solving thousands of problems simultaneously.

    This open, collective approach is very powerful. As the Net has spread and the number of contributing programmers has expanded, the brainpower of this group has begun to rival that in the biggest corporations, including the top firms in Silicon Valley and even mighty Microsoft. The quality of this group's software has begun reaching the level of that put out by highly organized corporations. The corporate software goes through rigorous quality control. The collective software, though, attains a similar rigor through the experiences of many thousands of independent eyeballs poring over it. And you certainly can't beat the price. The collective software is absolutely free. Nobody owns it, or, rather, everybody does. In fact, at first this kind of software was called free software, and the people who created it called themselves the free software movement. But the average layperson's notion of anything "free" is that it is less valuable, somehow inferior, and this software isn't. So over time the names have evolved into open source software and the open source movement. The terms refer to opening up to everyone the source code that is the basis of any software program. Once you can see the source code, which explains exactly how a software program works, you can understand why the program performs as it does—and how to modify or improve it. You can't see the source code of software designed by corporations like Microsoft. Their code is completely proprietary, so you always have to go to them and pay them for modifications or for solutions to your problems. To growing numbers of techies, this way of operating is unacceptable.

    An open source operating system called Linux emerged in the late 1990s to compete with Microsoft's Windows. The core of the program was designed in the early 1990s by a Finnish graduate student, Linus Torvalds, to operate on all personal computers. The system has now grown to include hundreds of other open source packages that have improved and expanded the system's capabilities. By 1998, Linux was claiming more than 7 million users and was the world's fastest-growing operating system. Meanwhile, an open source Web server called Apache was running on more than 50 percent of the Web servers exposed to the Internet. This software had been developed by and was being maintained by a core group of about a dozen Web site administrators and an active group of contributing users. For a while, the corporate world ignored these and a growing number of other open source options. But with the continued success of the open source movement and its increased exposure in the mainstream media, that corporate resistance has begun to shift. The drama has begun to unfold.


* * *


Twenty-First Century Choices:
The Making of a Long Boom


Part One: "Openness Wins"


Welcome to our documentary series "Twenty-first Century Choices: The Making of a Long Boom." I'm your host, Salma Aboulahoud, and like many of you viewers today in 2050, I lived through the turbulent two decades after the turn of the century, the 2000s and 2010s. I was a girl coming of age in Egypt just as the new millennium dawned. I remember well the anxiety of those times.

    Those first years of the new millennium were ones of explosive growth and rapid innovation. They were filled with widespread changes that rolled through almost every field. And they brought their share of trauma, including moments of great peril when it wasn't clear the global community would rise to the challenges.

    Our eight-part series will look back at the major stories that defined those times and led to the creation of ours. We will look at some of the critical choices that helped produce a half century of peace and prosperity.

    We'll look at how the people in the first part of this century made many farsighted choices that proved to be the right ones. Far more often than not, they chose the more open and inclusive alternatives that led to greater integration and higher growth. We'll look at how they struggled to restructure what was a new global economy, and to build a bold new politics, and to create the flexible learning society.

    However, we'll also look at how they might have gotten it wrong. At several key junctures, they could have chosen more closed or hostile alternatives that might have produced decades of horror like the first half of the twentieth century.

    We'll tell the stories of how they solved the Caspian Sea Crisis and fixed global warming in the wake of the Shanghai Winter of 2012. We'll remember their efforts to launch the new Space Age with the first mission to Mars. And we'll revisit their attempts to redress the gross inequities between the world's affluent and poor, a division that is much less severe today.

    But we'll start our series with a look back on the construction of the technological infrastructure that now lies at the foundation of our twenty-first-century society. It was quite a feat to computerize and connect the entire world in such a short time span—just a couple of decades. But it was all the more important because it enabled many of the unexpected developments that were to come. We owe them much.


The Chips Spread


It's funny, but today we don't even think about computer technology. Back then, in the late twentieth century, people were obsessed with it. There were hundreds of magazines about all the latest gadgetry and whole television programs about "the computer and you."

    It was like the early twentieth-century obsession with electric motors. By the end of the century, no one even thought about the electricity coursing through all walls or about the tiny electric motors silently powering a home's fifty or so appliances—from food blenders to dishwashers to every clock on the wall. We feel the same way now about computers: They are just tiny chips embedded in everything and connected together wirelessly. We don't notice them or even think about them.

    At the end of the twentieth century, the ultimate goal of the technology world was to spread computers and computer chips around the entire globe and to connect them all together. This was the dream of technologists and an increasing number of businesspeople, but it was by no means a done deal. Some crucial choices had to be made. That megaproject can be understood as three major stories: in computer hardware, telecommunications, and software. The overall project was largely completed by 2015, though some of the final solutions were unpredictable until the very end.

    First comes the hardware story, which concerns the basic machines. Just after the turn of the century, basic personal computers approached the cost of a good television set. The razor-thin profit margins put computer manufacturers under extreme competitive pressure but allowed businesses and families in developing countries to buy computers—which boosted the number of global sales and kept revenues flowing.

    Meanwhile, the first half of the first decade saw advances in chip technology that increased the downward pressure on prices and kept boosting power. By 2006 virtually every new top-of-the-line personal computer was driven by a chip 100 times more powerful than the chips of the 1990s. The chips also kept shrinking in size, and by 2002 all a computer's functions could be housed on a chip the size of the head of a screw. This shrinking helped propel a proliferation of small computerized devices that unbundled the functions that had previously been the sole province of the PC.

    The net effect was incredible mobility. Essentially by 2005 computers had become extremely small, extremely powerful, and extremely cheap. Combined with a vast expansion of wireless telecommunications, all computer users became free to roam.

    The exploding power also allowed computers to become much easier to use. The most stunning break came with reliable voice recognition in the early part of the decade. The keyboard had long kept the growth of computing in check. Many highly literate people had never learned to type; illiterate or barely literate people had no way to interact. And, of course, most keyboards were designed for European languages, especially English.

    By 2005, the majority of new computerized devices were voice-enabled and could recognize a limited vocabulary. By the end of the decade, powerful dictation systems caught nearly every word, even in languages not well suited to keyboards. This advance was a special boon to the Chinese, whose language is based on characters representing words rather than on letters. The 4,000-character keyboard was a thing of the past.

    The final hardware threshold was crossed with the development of high-resolution flat screens, which approached the crispness of paper. Experimental flat screens had been developed in the 1990s, but their price didn't drop to reasonable levels until 2003.

    Flat screens of all kinds soon appeared everywhere: in displays at work, in walls at home, in automobiles. Some were flexible like electronic paper; some were large and rigid—the size of entire walls of buildings. These became the many windows into the myriad corners of the world that came with the astounding developments in telecommunications.


Free Communications


The telecommunications story in the late twentieth century was all about bandwidth—as it was called in those days. Computer data at that time traveled in a relative trickle to many small businesses and almost all homes. This bandwidth bottleneck caused much consternation at the time because so much of the promise of networking depended on moving large files.

    Many competing projects were initiated in parallel to deal with this problem. This proliferation of ideas became both a problem and, in the end, a boon. It was a problem for regulators because the different systems needed to be standardized and interoperable. It was a problem for individual companies because the fierce competition caused many to lose money. But it was a boon to everyone else because bandwidth, indeed, became plentiful—and then some.

    By the middle of the decade in North America and Europe, half the population, depending on the density of the area, had high-speed access, some via telephone lines, some via cable modem, some by wireless connections, some by satellite connections.

    The trends toward increasing mobility and increasing bandwidth came together in the arrival of overlapping satellite systems just after 2005. By that time, there were eight different competing systems providing major conduits for both voice and video data traffic. When the projects had been initiated in the 1990s, they were all highly proprietary and cutthroat—expecting to corner a piece of the market.

    This competition spurred the early arrival of the means to cheaply connect to the new digital infrastructure from anyplace in the world. But the individual companies found their markets all merging, which led to a temporary glut of capacity, a big shakeout, and eventual consolidation. So around 2009, there was no meaningful distinction among any of them. They all had become just one more strand in the Net.

    This building out of the capacity of the digital pipes, if you will, would have been welcome enough. But at the same time, many firms were competing to compress digital files to stream through the bottlenecks. These two developments combined to bring an unexpected big bang in telecommunications in 2007.

    Telecommunications in general and bandwidth in particular became so cheap that no one thought about the cost. There was so much capacity that virtually anything could be sent over the Net. Demand soared in what became known as the Telecom Takeoff.

    Anybody with any idea about how to take advantage of the bandwidth had the opportunity to do so. The most visible development was the proliferation of full-motion video connections between millions of nodes in the Net.

    This essentially opened up huge video windows between ordinary people on distant parts of the planet. With screens the size of walls, people in the frigid Arctic Circle could look out the equivalent of their back doors and experience a beach bar in the balmy Bahamas.


Open versus Closed Software


The final piece in the equation of creating a global network of computers had to do with software. The hardware was spreading as fast as possible. The capacity of the infrastructure was also expanding exponentially. But software was what tied everything together—or did not.

    At the turn of the century, the big software battle was between free and proprietary software. If the open source movement, as it was called, could consistently produce high-quality software for free, then why would anybody pay good money to the proprietary crowd? Microsoft had the most to lose by moving in the direction of free software. The Silicon Valley group, though, potentially had the most to gain. If it moved early to give away software and expose its source code, it might be able to dramatically shift the status quo and unseat Microsoft. But that move would mean coming up with new strategies for making money around freely distributing the basic software.

    At the time, there were several models for creating value through services and ongoing relationships, but they were largely untested and quite risky. The breakthrough came not long after the turn of the century, after the spread of the open source software had taken off. The Anybody-But-Microsoft crowd took a deep breath and adopted the new model.

    The timing could not have been more fortuitous. The entire developing world was ready to take a leap into the digital age, yet the costs had been prohibitive. How was some poor person in Bangladesh going to come up with $100 to pay Bill Gates for the use of his operating system—or $200 more for every application? Suddenly people found out that software was now free. At first, they couldn't believe it. Why would anyone do that? Then, when the suspicion dissipated, they became quite voracious in their use of it.

    In addition, a de facto amnesty for software piracy was declared. Piracy had become a huge problem for software companies in the last part of the twentieth century, and they had spent enormous amounts of money trying to stop it. With the advent of open source software, the crackdown didn't make sense. In fact, use of a company's software was an asset: A business wanted people to spread its code—and its name—around. The wide acceptability of a company's software enabled the introduction of new products and services that were the real source of economic gain.

    So by 2006, a common base of open, interoperable computer code spread across the planet to run virtually all digital machines. This had been the goal from the very beginning. The entire world was computerized, and all the machines could talk to each other and interact. With the shift to open source software and the accompanying explosion of growth, the goal was attained.

    What had happened to Bill Gates? By the turn of the century, he had moved into the league of the great barons of the earlier Industrial Age, the Rockefellers and Carnegies, and he started becoming a serious philanthropist, donating billions to eradicating childhood diseases in developing countries. Gates wanted to be perceived as a good guy who had helped bring the world into the computer age. That's what he had wanted from the beginning. He hadn't gotten into software simply to make money; he had been like all those early young dreamers wanting to bring about a computer revolution. Yet Gates could not shake his sinister image.

    There was one daring way forward: Gates could let go of his control and embrace the open route, the open source software model. He could let go of his ambitions for more proprietary control of the basic software running on desktops, and on the Internet, and on handheld gadgets. He could even release the source code for Windows, and with it his so-called monopoly of the personal computer world.

    This option could actually help solve some nagging problems. Computer code was becoming so complex that the best way to solve the problems arising from complexity was to open the code up to all across the Net. Also customers were demanding interoperability and the ability to tailor code to their needs. By putting its code on the Internet, Microsoft would fully move into the Internet era and leave behind the PC era that it had dominated. The company would then compete on the quality and price of its applications that ran on the common base code. It would rise and fall on its service and the genuine value it brought to the marketplace. It would rely on its ability to constantly innovate and reinvent itself. Microsoft was a true company of the New Economy. It had more brainpower and resources than virtually any other organization out there. If it couldn't move confidently into an uncertain future, who could?

    The bulk of Microsoft employees wanted to move in that more risky direction. Many of them were top programmers who had grown up in the open tradition of the software world. They wanted to work on the cutting edge of the computer revolution, not to be the caretakers of the old code. The problem for Microsoft was its shareholders. The company's astronomical stock values were based partly on the strategic advantages of its Windows software, and any move to give it away would prompt a shareholder mutiny and outright lawsuits.

    Luckily for Gates and company, the U.S. government stepped in. The Justice Department's antitrust suit, begun in the late 1990s, ended with a deal that broke Microsoft up into two companies and forced it to open its operating systems' source code. One was a company that kept improving the operating systems for personal computers and other computerized devices. The other company was the one that would compete in applications, services, and other new markets in the New Economy.

    Though they complained bitterly at the time, the Microsoft shareholders actually profited handsomely in the long run. Both successor companies went on to thrive—particularly the New Economy one. The result was very reminiscent of what had happened to the shareholders of the old AT&T with the breakup of Ma Bell at the beginning of the economic boom in the 1980s. They received stock in the regional Baby Bells, which took off in growth. And the remaining parent long-distance telephone company also woke from its slumber and kicked into growth, eventually splitting into another three companies that further enriched the shareholders.

    Wise public policy worked both times. Everyone, inside and outside the companies, benefited. None of them benefited as much as a rehabilitated Gates. His shift in strategy brought a dramatic improvement in his reputation. He could take his rightful place in helping the world make its difficult transition. He was the man in black no more.


Table of Contents

Introduction: The Historic Moment 1
1 The Great Enabler 19
2 The Millennial Transition 37
3 The New American Ideology 65
4 The New European Renaissance 91
5 Asia Rises Again 109
6 The Global Challenges 133
7 Saving the Planet 149
8 Dawn of the Hydrogen Age 171
9 Technology Emulates Nature 187
10 Prepare for Wild Science 211
11 The New Global Middle Class 229
12 The Emergence of Women 241
13 The Guiding Principles 255
14 The Choice 281
A Memo to the President-Elect 291
The Story of the Idea 295
Notes 303
Selected Bibliography 315
About the Authors 321
Index 323

Customer Reviews

  • Anonymous

    Posted October 11, 1999

    Should Have Been Written... And All Should Read!

    'The Long Boom' ... simply an awesome visionary GLOBAL BLUEPRINT ... professional career and life shaping content ... By 2050, the world may come back and cite the book's authors as the forefathers of the course vision which attempts germination of harmonious global community convergence ... establishes a practical mindset of vast scope political, social, economic, global inter-dynamics ... should be classified must read material at every worldwide higher learning institution, in the same manner that Ethics and Business Policy is required at every MBA program as capstone courses. Very well respected Washington Kiplinger Editors offer a similar book reporting similar trends in Global Business, Geo-politics and Technology ... 'World Boom Ahead' ... never expect a guarantee how mankind will advance globally ... but plenty of room for thoughtful, practical blueprints and scenario planning ... offers very practical advice for world's business and political leaders ... not intended to be a detailed roadmap for global destiny ... If for the first 50 years of the world's 21st century mankind aligns toward this course blueprint, then mankind has the opportunity to make up for the tragic first 50 years of the world's 20th century. All need to read this book so as to know what policies to be demanding from their local, state, regional, federal politicians and business enterprise leaders.
