Engines of Tomorrow: How the World's Best Companies Are Using Their Research Labs to Win the Future

by Robert Buderi

NOOK Book (eBook)

$12.99

Overview

The U.S. economy is the envy of the world, and the key to its success is technological innovation. In this fascinating and in-depth account reported from three continents, Robert Buderi turns the spotlight on corporate research and the management of innovation that is helping drive the economy's robust growth. Here are firsthand communiqués from inside the labs of a reborn IBM, resurgent GE and Lucent, research upstarts Intel and Microsoft, and other leading American firms -- as well as top European and Japanese competitors.
It was only a few years ago that competitiveness experts -- U.S. well-wishers and naysayers alike -- concluded that America had lost its business and technological edge. The nation's companies, they asserted, couldn't match the development and manufacturing efficiency of overseas rivals. Yet now the nation is humming along, riding an unparalleled wave of innovation.
Buderi tells us this turnaround has come on many fronts -- in marketing, sales, manufacturing, and the creation of start-up companies. But Engines of Tomorrow deals with a central element that has gone largely unexamined: corporate research. It's the research process that provides the technologies that spur growth. Research is behind the renaissance of IBM, the stunning growth of Lucent, and much of the steamrolling American recovery.
Focusing on the fast-moving communications-computer-electronics sector, Buderi profiles some of the world's leading thinkers on innovation, talks with top inventors, and describes the exciting technologies coming down the pike -- from information appliances to electronic security and quantum computing. In the process, he examines the vital strategic issues in which central labs play a determining role, including:
  • How IBM's eight labs around the world figure in Lou Gerstner's plans to achieve consistent double-digit growth -- and to join GE as a $100 billion concern.
  • Why Xerox's famed Palo Alto Research Center is vying to resuscitate its company's lagging fortunes by sending anthropologists into the field to study the hidden ways people really work.
  • What Hewlett-Packard will do without its original instrument business, recently spun off as Agilent Technologies. The business was central to HP Labs' MC2 philosophy of merging research expertise in measurement, computation, and communication -- and its departure removed a lot that was unique about HP.
  • How the November 1999 federal court finding that Microsoft operates a monopoly hinders the Redmond giant's acquisition plans and makes it increasingly vital for nine-year-old Microsoft Research to lead the way in innovating from within. Could this be the next great lab for the twenty-first century?

With authority and undaunted optimism about the underlying vitality of the research process, Buderi discusses these issues and reveals the future of some of the world's best and most powerful companies.

Product Details

ISBN-13: 9780743212489
Publisher: Simon & Schuster
Publication date: 07/14/2000
Sold by: SIMON & SCHUSTER
Format: NOOK Book
Pages: 448
File size: 2 MB

About the Author

Robert Buderi, a Fellow in MIT's Center for International Studies, is the author of two acclaimed books, Engines of Tomorrow, about corporate innovation, and The Invention That Changed the World, about a secret lab at MIT in World War II. He lives in Cambridge, Massachusetts.

Read an Excerpt


Introduction: Change

"The value and even the mark of true science consists, in my opinion, in the useful inventions which can be derived from it."

-- G. W. LEIBNIZ

"Science today is everybody's business."

-- I. BERNARD COHEN

The prolific rags-to-riches inventor Charles F. Kettering, backbone of the mighty General Motors research machine that gave the world four-wheel brakes and the refrigerant Freon, liked to call a company's research house its "change-making department." In a 1929 speech to members of the United States Chamber of Commerce, he put it this way: "I am not pleading with you to make changes. I am telling you you have got to make them -- not because I say so, but because old Father Time will take care of you if you don't change. Advancing waves of other people's progress sweep over the unchanging man and wash him out. Consequently, you need to organize a department of systematic change-making."

Corporate change-making, in the form of a company's central research operation, is the focus of this book. As a pivotal force behind a plethora of key industries, research can be crucial not only to winning in the marketplace but to national economic viability. Yet serious discussion of the subject -- examining the nature and evolution of successful research, and who still manages to pull it off -- has been missing from virtually every popular management treatise in vogue today. This failure has become all the more glaring amidst the sweeping cutbacks in research spending that marked the early and mid-1990s. The upheaval has led to the widespread perception -- among Congress, the President, leading policymakers, academics, and the press -- that companies have almost unilaterally, and shortsightedly, pulled back from far-ranging and potentially revolutionary studies to concentrate on incremental improvements to existing products.

This book challenges that perception by throwing open the spotlight on corporate research -- laying out its history, identifying the real issues, and delving inside the labs of the very best practitioners to illuminate the various approaches to managing innovation in today's quicksilver global economy. The focus is on the rapidly blurring electronics-computer-telecommunications industries. It can be argued that especially with the rise of the Internet, which is bringing together voice, data, telephony, computers, and a variety of office technologies, these fields impact our daily lives more profoundly, or at least far more visibly, than other industries. But more than that, these sectors have undergone the most upheaval in recent years, in the process becoming the focal point of the vigorous public debate about research and development policy and the future of competitiveness. The hard choices companies have made -- about spending, targeting resources, and managing innovation across geographic and cultural boundaries -- hold crucial lessons for what lies ahead in a fast-changing world.

Industrial research first appeared in the 1870s and quickly took its place as a hallmark of the industrial age. Companies learned that, when successful, systematic research could provide a huge edge. It helped fight fires, protect core business lines in a myriad of small and often hidden ways, and even create powerful new industries -- like the radio, wireless communications, and television empires that grew out of early twentieth-century electrical investigations at General Electric and telephony studies at American Telephone & Telegraph.

Taken as a whole, these innovations helped forge a strong sense, not just from Charles Kettering, but many other scions of industry and government, that corporate research could play a vital role in competitiveness. But despite a widely accepted view that research was important to corporate well-being and vigor, it wasn't until the 1950s and early 1960s that things really took off. That was when young Massachusetts Institute of Technology assistant professor Robert Solow showed mathematically -- the basis for his 1987 Nobel Prize -- that in the long run growth in gross national product per worker is not due so much to capital investment, the traditional pillar of classic economic analyses. Instead, such growth is in very large part the result of technological progress. In a nutshell, advanced economies grow largely on the strength of what one can call New Knowledge -- in most cases the high-technology fruits of research and development.

Although it was not high in the minds of research leaders, Solow's work debuted amidst the Cold War and the resulting surge in government and industrial R&D spending that helped shift inventive activity away from straightforward mechanical engineering to an affair deeply rooted in science -- primarily physics and chemistry. With that transformation came a dramatic increase in the growth of corporate labs, to the point that the Washington-based Industrial Research Institute now counts close to 15,000 corporate labs in the United States alone. These organizations employ an estimated 750,000 scientists and engineers -- some 70 percent of the total number of these professionals -- and spend roughly $150 billion annually on research and development. All told, industry funds about 65 percent of the R&D conducted in the country, and performs closer to three-quarters. And this is only part of the story. In rough terms, the U.S. spending is matched by the rest of the world combined -- meaning the saga of corporate research today is truly a worldwide tale.

The research story goes far beyond corporations, of course. The drive for successful innovation encompasses government labs, universities, and even still the basement inventor -- with the outcome dependent on a staggering array of factors from grade school education to state and federal tax credits, national R&D spending policies, and patent laws. However, companies reside on the leading edge of this ongoing, global struggle. They are the prime venue where New Knowledge is converted into Useful Products, and where success and failure can be most plainly gauged in terms of patents, market share, sales, stock prices, and the like. By focusing on this front, I hope to provide deep insights into invention and innovation today -- from the trials and tribulations of management to what's coming down the pike, and who has the edge in key technology-driven industries that will shape the way we live in the twenty-first century. To do this, I've gone inside the labs of some of the biggest names in innovation on three continents: IBM, Siemens, NEC, General Electric, Lucent Technologies, Xerox, Hewlett-Packard, Intel, and Microsoft.

Of course, there is absolutely no assurance that the companies, or even the nations, whose labs generate key inventions are the ones who will reap the economic rewards. On the back of key discoveries, Britain served as the initial center of the chemical industry; but it was the Germans who rose to control global markets for the half century leading up to the First World War. Similarly, the United States became a leading economic power long before it achieved its present position of research and scientific dominance. The annals of industrial research history are rife with examples of usurpers riding to glory on the waves of other people's innovations.

Marketing, sales, service, packaging, distribution, manufacturing, and development, the close sibling of research, all remain vital cogs in the wheel of corporate fortune. Which is most important? Well, it's like the case of the drunken juggler, notes John A. Armstrong, a former IBM vice president for science and technology. "Let's say the juggler has two tennis balls and a flaming sword, and a club," he supposes. "Now you say, which one of those things is most important? Well the thing that's most important is the next one he's got to catch. And success is keeping them all up in the air."

The point is that a host of other factors can outstrip the benefits of even the strongest research program. Yet if wielded effectively, research can sharpen the vision of an otherwise all-too-hazy future, spurring innovation and raising the chances for long-term success. This book focuses on the management philosophies, funding paradigms, incentive programs, and all the rest employed by the best labs to do just that. There is no single formula; people have found many paths to creating and maintaining vital research organizations. Which one is most successful depends on such factors as the economic times, the industry, and the firm's role in it -- as well as individual corporate culture.

Indeed, fitting into that culture -- not just of the research organization but of the entire company -- and helping scientists and engineers see themselves as part of a larger whole, turns out to be one of the keys to successful industrial research. Some company research arms, like Hewlett-Packard's, possess a strong sense of history and a deep connection to the rest of the organization -- an enviable situation made easier because the corporation itself is only fifty years old, and by the fact that the founders and all the chief executives until Carly Fiorina took over in mid-1999 have been engineers. An outfit such as Bell Labs research, by contrast, nurtures an equally strong culture but became increasingly disconnected from the rest of the company, partly as a result of the sweeping success of its scientific investigations that transformed a corporate laboratory into a university-like institution. Combined with years of uncertainty over the future of AT&T itself, the result was a kind of slow death that was only averted by managers taking drastic action. Even then it wasn't until the core of Bell Labs found a new home in a separate company, Lucent Technologies, that it rekindled the old spirit of innovation and reasserted its standing as one of the world's great industrial laboratories.

In many ways, the book is an extension of my work as BusinessWeek's technology editor, where I oversaw coverage of corporate labs worldwide and planned and wrote major stories on innovation and research strategy, including those involving the now-defunct R&D Scoreboard, which tracked the research and development expenditures of individual firms in a variety of industries. But it is also greatly influenced by my research for a previous book, The Invention That Changed the World, which told the story of World War II radar development at M.I.T.'s top-secret Radiation Laboratory.

In a sense, the Rad Lab was the first Manhattan project. The nation's top physicists were recruited to the enterprise, mixing with engineers, mathematicians, and even biologists and astronomers, to develop radar systems that had a dramatic effect on the course of the war. Even more than that, though, the radar work played a critical role in the evolution of postwar science and technology. Besides having a direct impact on specific landmark creations and discoveries -- including the transistor, nuclear magnetic resonance, wireless communications, microwave ovens, radio astronomy, and the maser and laser -- the radar effort helped spark a whole new attitude about the management of research as a fast-paced, collaborative, and cross-disciplinary affair. Seeing this crucial aspect of the evolution of corporate research greatly increased my insight into what went on in corporate labs throughout the 1990s. Indeed, a lack of historical and evolutionary focus has far too often caused corporate research to be viewed simplistically. Many of the common perceptions are old, or wrong -- and myths and easy stereotypes abound about the innovation process, as well as the very role and importance of research in modern corporations.

One of the most fundamental of these misconceptions is the treatment of research and development as if they were one thing -- namely, an enterprise called R&D. This is an easy trap to fall into, because spending on these two variables is always tracked together, and the goal of nearly every corporation is to fuse them better, so that products flow more quickly out of the lab into the marketplace. Nevertheless, the two are vastly different -- in both substance and management approach.

Development is the stage where ideas or prototypes are taken and engineered into real-life products, ready to be affordably mass-produced and able to meet reliability standards and fit into the existing infrastructure. It is by far and away the bulk of the R&D equation -- typically somewhere around 90 percent in the industries examined here. It is a sprawling, sweeping mass of a subject almost impossible to get one's arms around in a comprehensible way.

Research, by contrast, is where ideas are investigated, refined, and shaped into the beginnings of a new product, system, or process. Though it is an extremely small part of R&D, I've homed in on it because in the companies with a long-standing reputation for innovation -- those that continually blaze trails and create new industries -- invariably it is the research side that lights the way into the future. Although they often work on small-scale improvements to specific products, central research arms also concern themselves with the farther-out problems that often apply to different businesses and product lines. They garner a plurality of patents -- usually the more important ones -- and win the occasional Nobel Prize. Even more than that, central research is the focal point of most debates surrounding R&D policy. So, perhaps, there is more about it that needs to be illuminated.

An additional widespread misperception lies in the idea that we progress linearly from science to technology, from research breakthrough to developing products and then the market. If this were true, then managing research might be far more straightforward than it really is. In actuality, science and technology constantly feed into each other on all sorts of levels. The semiconductors at the heart of modern computers trace their roots to advances in solid-state physics that gave birth to the transistor. Yet these breakthroughs owed a large debt to attempts to analyze and control the silicon and germanium crystals used as radar detectors during World War II. More recently, semiconductors have evolved technically -- becoming ever smaller, faster, and more powerful. But now these devices are becoming so small that researchers envision building chips on the atomic scale -- spurring places like Hewlett-Packard, IBM, and NEC to devote more resources to studies of basic quantum physics. Staying atop this interplay between science and technology, finding novel ways to channel the fruits of one into the other, and motivating people to do it, form key parts of the research challenge.

Still another common pitfall is that in considering and debating research issues we tend to tackle the subject in sweeping terms, as if the same rules and conditions apply to every industry. But the uses of science and technology actually vary widely. In chemicals, one basic patent -- say for nylon or Kevlar -- can spawn an industry. But more usual is the situation found in the telecommunications industry. Breakthroughs do occur, but advances typically hinge on evolutionary change, so that it takes a multitude of piecemeal improvements to add up to a revolution. From the research and development standpoint, then, invention must be laid on top of invention -- oftentimes other people's inventions. And with companies today operating research labs on several continents, managing this give-and-take across geographic and cultural chasms can be a daunting challenge.

Not to beat a dead horse, but a final dangerous practice comes in reading too much into the numbers. The general feeling seems to be that more research is better research. Therefore, when corporate R&D budgets declined sharply in the early 1990s (they have risen dramatically since 1994), it was perceived as a crisis. However, in some regards this was simply a spending readjustment after the boom times of the 1960s and 1970s -- when it was easier to increase funding across the board -- and therefore better reflective of reality. It also illustrates a refocusing of some resources. In the computer industry, for example, as hardware lines mature software has emerged as the driving force in sales and profits. AT&T recognized the trend back in 1990, when it slashed research and development spending 8 percent. Part of those cuts stemmed from shifting its focus from the infrastructure-heavy physical sciences to more streamlined software research. But while the move made sense, the pundits didn't much care. The cuts were viewed almost as a national tragedy, even though in AT&T's view it was doing what it was supposed to do: adapting to reflect the real world.

Even if Bell Labs slashed its budget, so what? Merely spending money is no guarantee for success -- and can even be a sign of poor R&D planning. IBM's research and development outlays were once greater than the sum of the R&D budgets of its next dozen or so largest competitors -- including DEC, Hewlett-Packard, Hitachi, and Fujitsu -- yet Big Blue was far from a sure bet in the marketplace. Big budgets can mean the research pie is spread too thin, or locked on the wrong target, so slashing them actually can be a positive sign that a company is tightening its focus.

To provide a more accurate portrait of the research endeavor, and to unearth the secrets of standout management, I have profiled labs in Europe, Japan, and the United States. Along the way, I have visited with everyone from top policymakers to individual scientists, from long-time veterans to fresh hires. And I have looked at projects running the gamut from creating a better electric range to fashioning transistors from individual atoms. In this way, I hope to illustrate a range of actions and strategies relevant to the debate, while also bringing readers inside the labs and showcasing the often exciting technologies making their way, ever faster, toward the market, the office, and the living room.

The nine companies profiled have been selected on the basis of extensive background reading, site visits, and scores of interviews as being among the world's best in the computer-telecommunications-electronics sectors. Although Siemens and NEC mark the only corporations examined that are based outside the United States, nearly all the companies operate labs on multiple continents. All told, I visited more than two dozen facilities in Asia, Europe, and the U.S., including several run by firms that were not profiled but still contributed greatly to my overall understanding of how research operates.

In many cases, I try to illustrate management philosophies through descriptions of individual projects. I look mainly at success stories. However, the research organization that does not have a fairly significant portion of failed projects is probably not pushing the envelope enough. Therefore, I have also included examples of some seeming failures -- including IBM's well-over-$100-million effort to develop a revolutionary class of computers based on Josephson junction principles and Microsoft's Talisman technology for rendering high-quality PC graphics -- as well as a fresh look at the lessons learned from Xerox's legendary "fumble" of the pioneering advances in personal computing made at its famous Palo Alto Research Center (PARC).

Because central research arms are so big -- often more than a thousand people -- I have focused on projects and organizational issues that stand out as unique or especially evocative of the management philosophy. General Electric's research arm, for example, is unusual for leading one aspect of the company's Six Sigma quality initiative: a program called Design for Six Sigma. The idea is that achieving the highest quality products cannot be guaranteed only through traditional manufacturing initiatives, but must be built into products from the get-go -- and that often means starting in the research lab. At Lucent Technologies, the focus is on Bell Labs' famous Physical Research Laboratory, home to its farthest-out projects -- everything from new kinds of lasers to mapping the dark matter of the universe. The Xerox profile concentrates largely on the role of anthropology at PARC.

The book is divided into three main sections. The first sets the table, surveying the global situation and then laying out the basically optimistic theme -- that industrial innovation continues with more vigor than is typically suspected. To more fully develop this picture, I have tried to place modern corporate research in its rarely understood historical context -- a vital framework for understanding change and distinguishing fundamental shifts from periodic swings of the pendulum. This task involves tracing the rise of industrial research, from its origins in the German dye industry of the late 1800s through the establishment of GE's pioneering facility in 1900 and the Bell Telephone Laboratories on New Year's Day 1925. It chronicles the tremendous influence of World War II on today's labs and management style, and shows how the first two postwar decades -- a time of unprecedented American hegemony -- warped the common view of what corporate research should be, a view that ended abruptly with the resurgence of European firms and the rise of the Japanese. The section concludes with a detailed examination of the reasons behind what former HP Labs director Joel Birnbaum once termed the "research bloodbath" that hit labs in the late 1980s and early 1990s, with an eye on newly evolving strategies for harnessing and encouraging research in the twenty-first century.

The second part consists of detailed looks inside the research operations of three electronics and computer giants. One is IBM, arguably the world's most powerful corporate research house when it comes to the physical sciences. Of Big Blue's eight research arms, I have visited four -- the main Thomas J. Watson Research Center in Yorktown Heights, New York, and three satellites -- in Zurich, Tokyo, and San Jose. And I have discussed the work at the other four, in Delhi, Tel Aviv, Beijing, and Austin, Texas.

The remaining two firms are staunch, longtime competitors from different continents -- Siemens in Germany, the largest research spender outside the United States, and NEC in Japan, which has shown the most dramatic gains of all Japanese firms in its patent portfolio during the 1990s while also swimming against the tide by pursuing an exciting array of basic studies -- both at home and at its unique lab in Princeton, New Jersey. Like IBM, both Siemens and NEC maintain facilities and laboratories around the world -- and I have visited the key sites and spoken to all the top research leaders.

The book's final section consists of three additional chapters, each containing shorter profiles of two companies paired together either for their historical connection, intense competition, or some combination of these factors.

The Pioneers chapter looks at General Electric and Bell Labs. Their venerable laboratories helped define the institution of corporate research early in the century and competed heavily for decades in the infant days of radio and telephony, but have since drifted onto separate paths as their companies evolved apart. Both have endured major trauma and make-overs in recent years, but have emerged as leading centers of innovation. The next chapter, Children of the Sixties, examines the central research arms of Xerox and Hewlett-Packard, two much newer organizations that arose during the heyday of science in the 1960s. These two enterprises followed a path diametrically opposed to that taken by GE and Bell Labs: they did not compete for more than two decades but have recently become fierce antagonists in copiers, facsimile machines, laser printers, and general office technology. Finally, The New Pioneers offers the first detailed looks inside the upstart research labs of Intel and Microsoft. These modern-day giants followed the path of the original Pioneers by waiting until achieving dominance in their industries before creating long-range research ventures: both opened labs in the 1990s that are seeking to shape the future of personal computing.

I am not, nor have I ever been while researching and writing this book, an investor in any of the firms I have profiled. I have not invested in or been associated with any of their direct competitors -- nor have I ever served as an employee or consultant to any of them. In July 1999, I participated in a Bell Labs panel for its Global Science Scholars Summit, for which I received an honorarium. Otherwise, the extent of my financial indebtedness to them runs to a few meals and one tee-shirt. However, I am greatly beholden to all these companies for their open cooperation, as well as their willingness to discuss complicated issues and patiently answer all my follow-up questions in an effort to cultivate a deeper understanding of managing innovation in modern times.

Although I have covered business issues and technology for the better part of two decades, I have never been a particular fan of corporate management or practices. I have always approached the companies I cover with a somewhat wary, even adversarial, eye -- one that recognizes they are striving to put their best foot forward and are not about to tell an outsider complete details of their future projects and plans. All that aside, however, I learned a lot. Writing a book is far different from banging out a magazine or newspaper article. Because of my project's long time frame -- nearly three years all told -- I was able to visit top managers time and time again and really listen to and test what they were saying, luxuries virtually impossible with normal news deadlines and a job that requires constant flitting between subjects. The same held true when dealing with the researchers. In some cases, I followed their work for two years, checking things periodically to stay on top of developments.

The overall picture is one of optimism -- not in the sense that research operations are wonderful entities that don't make mistakes, or even that those profiled here will stay atop their fields. Rather, I'm optimistic in the broader sense -- that no matter the state of any individual firm, the enterprise as a whole is advancing. Progress is not always pretty, or even steady. But as a rule, the best corporations learn from their mistakes and strive continuously to improve the innovation machine, whipping inventions into shape faster and tailoring them to better meet customer needs -- while still maintaining the balance between so-called incremental improvements in existing products and more pathbreaking work that can create new lines of business, even entire industries.

This last issue, concerning the balance between short-term and longer-range studies, basic or strategic research and applied projects, lies at the heart of much of the debate over the direction of industrial research today. It comes up repeatedly throughout the book. Basic research has always been a small part of corporate labs, despite all the hullabaloo. Nevertheless, my finding that the enterprise remains healthy flies in the face of conventional perception.

This is not to deny that corporate labs have undergone a major transformation in recent years. Arno Penzias, who led Bell Labs through its darkest days in the early 1990s, says the heyday of basic industrial research -- at least in the physical sciences -- actually came in the 1960s and 1970s, around the time "U.S. cars lost their fins." Since then the field has undergone a series of ups and downs that include the dramatic cutbacks of the late 1980s and early 1990s, followed by a slow and cautious rebirth in recent years. It's hardly like the old days, though. Research arms have become a lot more focused on corporate needs and goals. As with all other aspects of business today, they are judged increasingly by productivity and quality measures -- and must be accountable for their actions.

All this is necessary and good. But while facing this reality, the best companies -- and all those profiled in this book -- remain keenly aware that without a program of bold, longer-term research they run the risk of falling dramatically behind and never catching up. As Charles Kettering said when talking about his change-making department: "The Lord has given a fellow the right to choose the kind of troubles he will have. He can have either those that go with being a pioneer or those that go with being a trailer."

This book is mainly about the pioneers -- old and new -- who chose the troubles that go with managing innovation on a variety of time frames, including those far beyond the horizon. They provide the framework and engines of tomorrow.

Copyright © 2000 by Robert Buderi

Table of Contents

Introduction: Change

ONE A Matter of Death and Life

TWO The Invention of Invention

THREE Houses of Magic

FOUR Out of the Plush-lined Rut

FIVE IBM: Taking the Asylum

SIX House of Siemens

SEVEN NEC: Balancing East and West

EIGHT The Pioneers: General Electric and Bell Labs

NINE Children of the Sixties: Xerox and Hewlett-Packard

TEN The New Pioneers: Intel and Microsoft

Conclusion: The Innovation Marathon

Appendix

Notes

Interviews

Bibliography

Index

Introduction: Change

"The value and even the mark of true science consists, in my opinion, in the useful inventions which can be derived from it."

— G. W. LEIBNIZ


"Science today is everybody's business."

— I. BERNARD COHEN


The prolific rags-to-riches inventor Charles F. Kettering, backbone of the mighty General Motors research machine that gave the world four-wheel brakes and the refrigerant Freon, liked to call a company's research house its "change-making department." In a 1929 speech to members of the United States Chamber of Commerce, he put it this way: "I am not pleading with you to make changes. I am telling you you have got to make them — not because I say so, but because old Father Time will take care of you if you don't change. Advancing waves of other people's progress sweep over the unchanging man and wash him out. Consequently, you need to organize a department of systematic change-making."

Corporate change-making, in the form of a company's central research operation, is the focus of this book. As a pivotal force behind a plethora of key industries, research can be crucial not only to winning in the marketplace but to national economic viability. Yet serious discussion of the subject — examining the nature and evolution of successful research, and who still manages to pull it off — has been missing from virtually every popular management treatise in vogue today. This failure has become all the more glaring amidst the sweeping cutbacks in research spending that marked the early and mid-1990s. The upheaval has led to the widespread perception — among Congress, the President, leading policymakers, academics, and the press — that companies have almost unilaterally, and shortsightedly, pulled back from far-ranging and potentially revolutionary studies to concentrate on incremental improvements to existing products.

This book challenges that perception by turning the spotlight on corporate research — laying out its history, identifying the real issues, and delving inside the labs of the very best practitioners to illuminate the various approaches to managing innovation in today's quicksilver global economy. The focus is on the rapidly blurring electronics-computer-telecommunications industries. It can be argued that especially with the rise of the Internet, which is bringing together voice, data, telephony, computers, and a variety of office technologies, these fields impact our daily lives more profoundly, or at least far more visibly, than other industries. But more than that, these sectors have undergone the most upheaval in recent years, in the process becoming the focal point of the vigorous public debate about research and development policy and the future of competitiveness. The hard choices companies have made — about spending, targeting resources, and managing innovation across geographic and cultural boundaries — hold crucial lessons for what lies ahead in a fast-changing world.

Industrial research first appeared in the 1870s and quickly took its place as a hallmark of the industrial age. Companies learned that, when successful, systematic research could provide a huge edge. It helped fight fires, protect core business lines in a myriad of small and often hidden ways, and even create powerful new industries — like the radio, wireless communications, and television empires that grew out of early twentieth-century electrical investigations at General Electric and telephony studies at American Telephone & Telegraph.

Taken as a whole, these innovations helped forge a strong sense, not just in Charles Kettering but in many other scions of industry and government, that corporate research could play a vital role in competitiveness. But despite a widely accepted view that research was important to corporate well-being and vigor, it wasn't until the 1950s and early 1960s that things really took off. That was when young Massachusetts Institute of Technology assistant professor Robert Solow showed mathematically — the basis for his 1987 Nobel Prize — that in the long run growth in gross national product per worker is not due so much to capital investment, the traditional pillar of classic economic analyses. Instead, such growth is in very large part the result of technological progress. In a nutshell, advanced economies grow largely on the strength of what one can call New Knowledge — in most cases the high-technology fruits of research and development.
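Solow's insight can be sketched numerically. The snippet below is an illustrative aside, not material from the book: the `solow_residual` function and every growth figure in it are hypothetical, using the standard Cobb-Douglas growth-accounting decomposition to show how the portion of growth attributable to technological progress is backed out.

```python
def solow_residual(g_output, g_capital, g_labor, alpha=0.3):
    """Technology's contribution to growth (the 'Solow residual').

    Assumes a Cobb-Douglas production function Y = A * K**alpha * L**(1-alpha),
    so output growth decomposes as g_Y = g_A + alpha*g_K + (1-alpha)*g_L.
    The residual g_A is what capital and labor growth cannot explain.
    """
    return g_output - alpha * g_capital - (1 - alpha) * g_labor

# Hypothetical economy: output grows 4%/yr, capital stock 5%/yr, labor 1%/yr,
# with capital's income share set to a common textbook value of 0.3.
g_a = solow_residual(0.04, 0.05, 0.01)
share = g_a / 0.04  # fraction of growth attributable to technological progress
print(f"residual: {g_a:.3f}, share of output growth: {share:.0%}")
```

With these made-up numbers, nearly half of output growth is left unexplained by capital and labor — the residual that Solow identified with technological progress.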

Although it was not foremost in the minds of research leaders, Solow's work debuted amidst the Cold War and the resulting surge in government and industrial R&D spending that helped shift inventive activity away from straightforward mechanical engineering to an affair deeply rooted in science — primarily physics and chemistry. With that transformation came dramatic growth in corporate labs, to the point that the Washington-based Industrial Research Institute now counts close to 15,000 corporate labs in the United States alone. These organizations employ an estimated 750,000 scientists and engineers — some 70 percent of the total number of these professionals — and spend roughly $150 billion annually on research and development. All told, industry funds about 65 percent of the R&D conducted in the country, and performs closer to three-quarters. And this is only part of the story. In rough terms, the U.S. spending is matched by the rest of the world combined — meaning the saga of corporate research today is truly a worldwide tale.

The research story goes far beyond corporations, of course. The drive for successful innovation encompasses government labs, universities, and even still the basement inventor — with the outcome dependent on a staggering array of factors from grade school education to state and federal tax credits, national R&D spending policies, and patent laws. However, companies reside on the leading edge of this ongoing, global struggle. They are the prime venue where New Knowledge is converted into Useful Products, and where success and failure can be most plainly gauged in terms of patents, market share, sales, stock prices, and the like. By focusing on this front, I hope to provide deep insights into invention and innovation today — from the trials and tribulations of management to what's coming down the pike, and who has the edge in key technology-driven industries that will shape the way we live in the twenty-first century. To do this, I've gone inside the labs of some of the biggest names in innovation on three continents: IBM, Siemens, NEC, General Electric, Lucent Technologies, Xerox, Hewlett-Packard, Intel, and Microsoft.

Of course, there is absolutely no assurance that the companies, or even the nations, whose labs generate key inventions are the ones who will reap the economic rewards. On the back of key discoveries, Britain served as the initial center of the chemical industry; but it was the Germans who rose to control global markets for the half century leading up to the First World War. Similarly, the United States became a leading economic power long before it achieved its present position of research and scientific dominance. The annals of industrial research history are rife with examples of usurpers riding to glory on the waves of other people's innovations.

Marketing, sales, service, packaging, distribution, manufacturing, and development, the close sibling of research, all remain vital cogs in the wheel of corporate fortune. Which is most important? Well, it's like the case of the drunken juggler, notes John A. Armstrong, a former IBM vice president for science and technology. "Let's say the juggler has two tennis balls and a flaming sword, and a club," he supposes. "Now you say, which one of those things is most important? Well the thing that's most important is the next one he's got to catch. And success is keeping them all up in the air."

The point is that a host of other factors can outstrip the benefits of even the strongest research program. Yet if wielded effectively, research can sharpen the vision of an otherwise all-too-hazy future, spurring innovation and raising the chances for long-term success. This book focuses on the management philosophies, funding paradigms, incentive programs, and all the rest employed by the best labs to do just that. There is no single formula; people have found many paths to creating and maintaining vital research organizations. Which one is most successful depends on such factors as the economic times, the industry, and the firm's role in it — as well as individual corporate culture.

Indeed, fitting into that culture — not just of the research organization but of the entire company — and helping scientists and engineers see themselves as part of a larger whole, turns out to be one of the keys to successful industrial research. Some company research arms, like Hewlett-Packard's, possess a strong sense of history and a deep connection to the rest of the organization — an enviable situation made easier because the corporation itself is only fifty years old, and by the fact that the founders and all the chief executives until Carly Fiorina took over in mid-1999 have been engineers. An outfit such as Bell Labs research, by contrast, nurtures an equally strong culture but became increasingly disconnected from the rest of the company, partly as a result of the sweeping success of its scientific investigations that transformed a corporate laboratory into a university-like institution. Combined with years of uncertainty over the future of AT&T itself, the result was a kind of slow death that was only averted by managers taking drastic action. Even then it wasn't until the core of Bell Labs found a new home in a separate company, Lucent Technologies, that it rekindled the old spirit of innovation and reasserted its standing as one of the world's great industrial laboratories.

In many ways, the book is an extension of my work as BusinessWeek's technology editor, where I oversaw coverage of corporate labs worldwide and planned and wrote major stories on innovation and research strategy, including those involving the now-defunct R&D Scoreboard, which tracked the research and development expenditures of individual firms in a variety of industries. But it is also greatly influenced by my research for a previous book, The Invention That Changed the World, which told the story of World War II radar development at M.I.T.'s top-secret Radiation Laboratory.

In a sense, the Rad Lab was the first Manhattan project. The nation's top physicists were recruited to the enterprise, mixing with engineers, mathematicians, and even biologists and astronomers, to develop radar systems that had a dramatic effect on the course of the war. Even more than that, though, the radar work played a critical role in the evolution of postwar science and technology. Besides having a direct impact on specific landmark creations and discoveries — including the transistor, nuclear magnetic resonance, wireless communications, microwave ovens, radio astronomy, and the maser and laser — the radar effort helped spark a whole new attitude about the management of research as a fast-paced, collaborative, and cross-disciplinary affair. Seeing this crucial aspect of the evolution of corporate research greatly increased my insight into what went on in corporate labs throughout the 1990s. Indeed, a lack of historical and evolutionary focus has far too often caused corporate research to be viewed simplistically. Many of the common perceptions are old, or wrong — and myths and easy stereotypes abound about the innovation process, as well as the very role and importance of research in modern corporations.

One of the most fundamental of these misconceptions is the treatment of research and development as if they were one thing — namely, an enterprise called R&D. This is an easy trap to fall into, because spending on these two variables is always tracked together, and the goal of nearly every corporation is to fuse them better, so that products flow more quickly out of the lab into the marketplace. Nevertheless, the two are vastly different — in both substance and management approach.

Development is the stage where ideas or prototypes are taken and engineered into real-life products, ready to be affordably mass-produced and able to meet reliability standards and fit into the existing infrastructure. It is far and away the bulk of the R&D equation — typically somewhere around 90 percent in the industries examined here. It is a sprawling, sweeping mass of a subject almost impossible to get one's arms around in a comprehensible way.

Research, by contrast, is where ideas are investigated, refined, and shaped into the beginnings of a new product, system, or process. Though it is an extremely small part of R&D, I've homed in on it because in the companies with a long-standing reputation for innovation — those that continually blaze trails and create new industries — invariably it is the research side that lights the way into the future. Although they often work on small-scale improvements to specific products, central research arms also concern themselves with the farther-out problems that often apply to different businesses and product lines. They garner a plurality of patents — usually the more important ones — and win the occasional Nobel Prize. Even more than that, central research is the focal point of most debates surrounding R&D policy. So, perhaps, there is more about it that needs to be illuminated.

An additional widespread misperception lies in the idea that we progress linearly from science to technology, from research breakthrough to developing products and then the market. If this were true, then managing research might be far more straightforward than it really is. In actuality, science and technology constantly feed into each other on all sorts of levels. The semiconductors at the heart of modern computers trace their roots to advances in solid-state physics that gave birth to the transistor. Yet these breakthroughs owed a large debt to attempts to analyze and control the silicon and germanium crystals used as radar detectors during World War II. More recently, semiconductors have evolved technically — becoming ever smaller, faster, and more powerful. But now these devices are becoming so small that researchers envision building chips on the atomic scale — spurring places like Hewlett-Packard, IBM, and NEC to devote more resources to studies of basic quantum physics. Staying atop this interplay between science and technology, finding novel ways to channel the fruits of one into the other, and motivating people to do it, form key parts of the research challenge.

Still another common pitfall is that in considering and debating research issues we tend to tackle the subject in sweeping terms, as if the same rules and conditions apply to every industry. But the uses of science and technology actually vary widely. In chemicals, one basic patent — say for nylon or Kevlar — can spawn an industry. But more usual is the situation found in the telecommunications industry. Breakthroughs do occur, but advances typically hinge on evolutionary change, so that it takes a multitude of piecemeal improvements to add up to a revolution. From the research and development standpoint, then, invention must be laid on top of invention — oftentimes other people's inventions. And with companies today operating research labs on several continents, managing this give-and-take across geographic and cultural chasms can be a daunting challenge.

Not to beat a dead horse, but a final dangerous practice comes in reading too much into the numbers. The general feeling seems to be that more research is better research. Therefore, when corporate R&D budgets declined sharply in the early 1990s (they have risen dramatically since 1994), it was perceived as a crisis. However, in some regards this was simply a spending readjustment after the boom times of the 1960s and 1970s — when it was easier to increase funding across the board — and therefore a better reflection of reality. It also illustrates a refocusing of some resources. In the computer industry, for example, as hardware lines mature software has emerged as the driving force in sales and profits. AT&T recognized the trend back in 1990, when it slashed research and development spending 8 percent. Part of those cuts stemmed from shifting its focus from the infrastructure-heavy physical sciences to more streamlined software research. But while the move made sense, the pundits didn't much care. The cuts were viewed almost as a national tragedy, even though in AT&T's view it was doing what it was supposed to do: adapting to reflect the real world.

Even if Bell Labs slashed its budget, so what? Merely spending money is no guarantee for success — and can even be a sign of poor R&D planning. IBM's research and development outlays were once greater than the sum of the R&D budgets of its next dozen or so largest competitors — including DEC, Hewlett-Packard, Hitachi, and Fujitsu — yet Big Blue was far from a sure bet in the marketplace. Big budgets can mean the research pie is spread too thin, or locked on the wrong target, so slashing them actually can be a positive sign that a company is tightening its focus.

To provide a more accurate portrait of the research endeavor, and to unearth the secrets of standout management, I have profiled labs in Europe, Japan, and the United States. Along the way, I have visited with everyone from top policymakers to individual scientists, from long-time veterans to fresh hires. And I have looked at projects running the gamut from creating a better electric range to fashioning transistors from individual atoms. In this way, I hope to illustrate a range of actions and strategies relevant to the debate, while also bringing readers inside the labs and showcasing the often exciting technologies making their way, ever faster, toward the market, the office, and the living room.

The nine companies profiled have been selected on the basis of extensive background reading, site visits, and scores of interviews as being among the world's best in the computer-telecommunications-electronics sectors. Although Siemens and NEC are the only corporations examined that are based outside the United States, nearly all the companies operate labs on multiple continents. All told, I visited more than two dozen facilities in Asia, Europe, and the U.S., including several run by firms that were not profiled but still contributed greatly to my overall understanding of how research operates.

In many cases, I try to illustrate management philosophies through descriptions of individual projects. I look mainly at success stories. However, the research organization that does not have a fairly significant portion of failed projects is probably not pushing the envelope enough. Therefore, I have also included examples of some seeming failures — including IBM's well-over-$100-million effort to develop a revolutionary class of computers based on Josephson junction principles and Microsoft's Talisman technology for rendering high-quality PC graphics — as well as a fresh look at the lessons learned from Xerox's legendary "fumble" of the pioneering advances in personal computing made at its famous Palo Alto Research Center (PARC).

Because central research arms are so big — often more than a thousand people — I have focused on projects and organizational issues that stand out as unique or especially evocative of the management philosophy. General Electric's research arm, for example, is unusual for leading one aspect of the company's Six Sigma quality initiative: a program called Design for Six Sigma. The idea is that achieving the highest quality products cannot be guaranteed only through traditional manufacturing initiatives, but must be built into products from the get-go — and that often means starting in the research lab. At Lucent Technologies, the focus is on Bell Labs' famous Physical Research Laboratory, home to its farthest-out projects — everything from new kinds of lasers to mapping the dark matter of the universe. The Xerox profile concentrates largely on the role of anthropology at PARC.

The book is divided into three main sections. The first sets the table, surveying the global situation and then laying out the basically optimistic theme — that industrial innovation continues with more vigor than is typically suspected. To more fully develop this picture, I have tried to place modern corporate research in its rarely understood historical context — a vital framework for understanding change and distinguishing fundamental shifts from periodic swings of the pendulum. This task involves tracing the rise of industrial research, from its origins in the German dye industry of the late 1800s through the establishment of GE's pioneering facility in 1900 and the Bell Telephone Laboratories on New Year's Day 1925. It chronicles the tremendous influence of World War II on today's labs and management style, and shows how the first two postwar decades — a time of unprecedented American hegemony — warped the common view of what corporate research should be, a view that ended abruptly with the resurgence of European firms and the rise of the Japanese. The section concludes with a detailed examination of the reasons behind what former HP Labs director Joel Birnbaum once termed the "research bloodbath" that hit labs in the late 1980s and early 1990s, with an eye on newly evolving strategies for harnessing and encouraging research in the twenty-first century.

The second part consists of detailed looks inside the research operations of three electronics and computer giants. One is IBM, arguably the world's most powerful corporate research house when it comes to the physical sciences. Of Big Blue's eight research arms, I have visited four — the main Thomas J. Watson Research Center in Yorktown Heights, New York, and three satellites — in Zurich, Tokyo, and San Jose. And I have discussed the work at the other four, in Delhi, Tel Aviv, Beijing, and Austin, Texas.

The remaining two firms are staunch, longtime competitors from different continents — Siemens in Germany, the largest research spender outside the United States, and NEC in Japan, which has shown the most dramatic gains of all Japanese firms in its patent portfolio during the 1990s while also swimming against the tide by pursuing an exciting array of basic studies — both at home and at its unique lab in Princeton, New Jersey. Like IBM, both Siemens and NEC maintain facilities and laboratories around the world — and I have visited the key sites and spoken to all the top research leaders.

The book's final section consists of three additional chapters, each containing shorter profiles of two companies paired together for their historical connection, their intense competition, or some combination of the two.

The Pioneers chapter looks at General Electric and Bell Labs. Their venerable laboratories helped define the institution of corporate research early in the century and competed heavily for decades in the infant days of radio and telephony, but have since drifted onto separate paths as their companies evolved apart. Both have endured major trauma and make-overs in recent years, but have emerged as leading centers of innovation. The next chapter, Children of the Sixties, examines the central research arms of Xerox and Hewlett-Packard, two much newer organizations that arose during the heyday of science in the 1960s. These two enterprises followed a path diametrically opposed to that taken by GE and Bell Labs: they did not compete for more than two decades but have recently become fierce antagonists in copiers, facsimile machines, laser printers, and general office technology. Finally, The New Pioneers offers the first detailed looks inside the upstart research labs of Intel and Microsoft. These modern-day giants followed the path of the original Pioneers by waiting until achieving dominance in their industries before creating long-range research ventures: both opened labs in the 1990s that are seeking to shape the future of personal computing.

I am not, nor have I ever been while researching and writing this book, an investor in any of the firms I have profiled. I have not invested in or been associated with any of their direct competitors — nor have I ever served as an employee or consultant to any of them. In July 1999, I participated in a Bell Labs panel for its Global Science Scholars Summit, for which I received an honorarium. Otherwise, the extent of my financial indebtedness to them runs to a few meals and one tee-shirt. However, I am greatly beholden to all these companies for their open cooperation, as well as their willingness to discuss complicated issues and patiently answer all my follow-up questions in an effort to cultivate a deeper understanding of managing innovation in modern times.

Although I have covered business issues and technology for the better part of two decades, I have never been a particular fan of corporate management or practices. I have always approached the companies I cover with a somewhat wary, even adversarial, eye — one that recognizes they are striving to put their best foot forward and are not about to tell an outsider complete details of their future projects and plans. All that aside, however, I learned a lot. Writing a book is far different from banging out a magazine or newspaper article. Because of my project's long time frame — nearly three years all told — I was able to visit top managers time and time again and really listen to and test what they were saying, luxuries virtually impossible with normal news deadlines and a job that requires constant flitting between subjects. The same held true when dealing with the researchers. In some cases, I followed their work for two years, checking things periodically to stay on top of developments.

The overall picture is one of optimism — not in the sense that research operations are wonderful entities that don't make mistakes, or even that those profiled here will stay atop their fields. Rather, I'm optimistic in the broader sense — that no matter the state of any individual firm, the enterprise as a whole is advancing. Progress is not always pretty, or even steady. But as a rule, the best corporations learn from their mistakes and strive continuously to improve the innovation machine, whipping inventions into shape faster and tailoring them to better meet customer needs — while still maintaining the balance between so-called incremental improvements in existing products and more pathbreaking work that can create new lines of business, even entire industries.

This last issue, concerning the balance between short-term and longer-range studies and between basic or strategic research and applied projects, lies at the heart of much of the debate over the direction of industrial research today. It comes up repeatedly throughout the book. Basic research has always been a small part of corporate labs, despite all the hullabaloo. Nevertheless, my finding that research remains fundamentally healthy flies in the face of conventional perception.

This is not to deny that corporate labs have undergone a major transformation in recent years. Arno Penzias, who led Bell Labs through its darkest days in the early 1990s, says the heyday of basic industrial research — at least in the physical sciences — actually came in the 1960s and 1970s, around the time "U.S. cars lost their fins." Since then the field has undergone a series of ups and downs that include the dramatic cutbacks of the late 1980s and early 1990s, followed by a slow and cautious rebirth in recent years. It's hardly like the old days, though. Research arms have become a lot more focused on corporate needs and goals. As with all other aspects of business today, they are judged increasingly by productivity and quality measures — and must be accountable for their actions.

All this is necessary and good. But while facing this reality, the best companies — and all those profiled in this book — remain keenly aware that without a program of bold, longer-term research they run the risk of falling dramatically behind and never catching up. As Charles Kettering said when talking about his change-making department: "The Lord has given a fellow the right to choose the kind of troubles he will have. He can have either those that go with being a pioneer or those that go with being a trailer."

This book is mainly about the pioneers — old and new — who chose the troubles that go with managing innovation on a variety of time frames, including those far beyond the horizon. They provide the framework and engines of tomorrow.

Copyright © 2000 by Robert Buderi
