In this completely revised and updated edition of The Chip, T.R. Reid tells the gripping adventure story of the microchip's invention and of its growth into a global information industry. This is the story of how the digital age began.
THE MONOLITHIC IDEA
The idea occurred to Jack Kilby at the height of summer, when everyone else was on vacation and he had the lab to himself. It was an idea, as events would prove, of literally cosmic dimensions, an idea that would be honored in the textbooks with a name of its own: the monolithic idea. The idea would eventually win Kilby the Nobel Prize in Physics. This was slightly anomalous, because Jack had no training whatsoever in physics; the Royal Swedish Academy of Sciences was willing to overlook that minor detail because Jack's idea did, after all, change the daily life of almost everyone on earth for the better. But all that was in the future. At the time Kilby hit on the monolithic idea-it was July 1958-he only hoped that his boss would let him build a model and give the new idea a try.
The boss was still an unknown quantity. It had been less than two months since Jack Kilby arrived in Dallas to begin work at Texas Instruments, and the new employee did not yet have a firm sense of where he stood. Jack had been delighted and flattered when Willis Adcock, the famous silicon pioneer, had offered him a job at TI's semiconductor research group. It was just about the first lucky break of Jack Kilby's career; he would be working for one of the most prominent firms in electronics, with the kind of colleagues and facilities that could help a hard-working young engineer solve important problems. Still, the pleasure was tempered with some misgivings. Jack's wife, Barbara, and their two young daughters had been happy in Milwaukee, and Jack's career had blossomed there. In a decade working at a small electronics firm called Centralab, Kilby had made twelve patentable inventions (including the reduced titanate capacitor and the steatite-packaged transistor). Each patent brought a small financial bonus from the firm and a huge feeling of satisfaction. Indeed, Jack said later that the most important discovery he made at Centralab was the sheer joy of inventing. It was problem solving, really: you identified the problem, worked through 5 or 50 or 500 possible approaches, found ways to circumvent the limits that nature had built into materials and forces, and perfected the one solution that worked. It was an intense, creative process, and Jack loved it with a passion. It was that infatuation with problem solving that had lured him, at the age of thirty-four, to take a chance on the new job in Dallas. Texas Instruments was an important company, and it was putting him to work on the most important problem in electronics.
By the late 1950s, the problem-the technical journals called it "the interconnections problem" or "the numbers barrier" or, more poetically, "the tyranny of numbers"-was a familiar one to the physicists and engineers who made up the electronics community. But it was still a secret to the rest of the world. In the 1950s, before Chernobyl, before the Challenger rocket blew up, before the advent of Internet porn or cell phones that ring in the middle of the opera, the notion of "technological progress" still had only positive connotations. Americans were looking ahead with happy anticipation to a near future when all the creations of science fiction, from Dick Tracy's wrist radio to Buck Rogers's air base on Mars, would become facts of daily life. Already in 1958 you could pull a transistor radio out of your pocket-a radio in your pocket!-and hear news of a giant electronic computer that was receiving signals beamed at the speed of light from a miniaturized transmitter in a man-made satellite orbiting the earth at 18,000 miles per hour. Who could blame people for expecting new miracles tomorrow?
There was an enormous appetite for news about the future, an appetite that magazines and newspapers were happy to feed. The major breakthroughs in biology, genetics, and medicine were still a few years away, but in electronics, the late fifties saw some marvelous innovation almost every month. First came the transistor, the invention that gave birth to the new electronic age-and then there was the tecnetron, the spacistor, the nuvistor, the thyristor. It hardly seemed remarkable when the venerable British journal New Scientist predicted the imminent development of a new device, the "neuristor," which would perform all the functions of a human neuron and so make possible the ultimate prosthetic device-the artificial brain. Late in 1956 a Life magazine reporter dug out a secret Pentagon plan for a new kind of missile-a troop-carrying missile that could pick up a platoon at a base in the United States and then "loop through outer space and land the troops 500 miles behind enemy lines in less than 30 minutes." A computer in the missile's nose cone would assure the pinpoint accuracy required to make such flights possible. A computer in a nose cone? That was a flight of fancy in itself. The computers of the 1950s were enormous contraptions that filled whole rooms-in some cases, whole buildings-and consumed the power of a locomotive. But that, too, would give way to progress. Sperry-Rand, the maker of UNIVAC, the computer that had leaped to overnight fame on November 4, 1952, when it predicted Dwight Eisenhower's electoral victory one hour after the polls closed, was said to be working on computers that would fit on a desktop. And that would be just the beginning. Soon enough there would be computers in a briefcase, computers in a wristwatch, computers on the head of a pin.
Jack Kilby and his colleagues in the electronics business-the people who were supposed to make all these miracles come true-read the articles with a rueful sense of amusement. There actually were plans on paper to implement just about every fantasy the popular press reported; there were, indeed, preliminary blueprints that went far beyond the popular imagination. Engineers were already making their first rough plans for high-capacity computers that could steer a rocket to the moon or connect every library in the world to a single worldwide web accessible from any desk. But it was all on paper. It was all impossible to produce because of the limitation posed by the tyranny of numbers. The interconnections problem stood as an impassable barrier blocking all future progress in electronics.
And now, on a muggy summer's day in Dallas, Jack Kilby had an idea that might break down the barrier. Right from the start, he thought he might be on to something revolutionary, but he did his best to retain a professional caution. A lot of revolutionary ideas, after all, turn out to have fatal flaws. Day after day, working alone in the empty lab, he went over the idea, scratching pictures in his lab notebook, sketching circuits, planning how he might build a model. As an inventor, Jack knew that a lot of spectacular ideas fall to pieces if you look at them too hard. But this one was different: the more he studied it, the more he looked for flaws, the better it looked.
When his colleagues came back from vacation, Jack showed his notebook to Willis Adcock. "He was enthused," Jack wrote later, "but skeptical." Adcock remembers it the same way. "I was very interested," he recalled afterward. "But what Jack was saying, it was pretty damn cumbersome; you would have had a terrible time trying to produce it." Jack kept pushing for a test of the new idea. But a test would require a model; that could cost $10,000, maybe more. There were other projects around, and Adcock was supposed to move ahead on them.
Jack Kilby is a gentle soul, easygoing and unhurried. A lanky, casual, down-home type with a big leathery face that wraps around an enormous smile, he talks slowly, in a quiet voice that has never lost the soft country twang of Great Bend, Kansas, where he grew up. That deliberate mode of speech reflects a careful, deliberate way of thinking. Adcock, in contrast, is a zesty sprite who talks a mile a minute and still can't keep up with his racing train of thought. That summer, though, it was Kilby who was pushing to race ahead. After all, if they didn't develop this new idea, somebody else might hit on it. Texas Instruments, after all, was hardly the only place in the world where people were trying to overcome the tyranny of numbers.
The monolithic idea occurred to Robert Noyce in the depth of winter-or at least in the mildly chilly season that passes for winter in the sunny valley of San Francisco Bay that is known today, because of that idea, as Silicon Valley. Unlike Kilby, Bob Noyce did not have to check with the boss when he got an idea; at the age of thirty-one, Noyce was the boss.
It was January 1959, and the valley was still largely an agricultural domain, with only a handful of electronics firms sprouting amid the endless peach and prune orchards. One of those pioneering firms, Fairchild Semiconductor, had been started late in 1957 by a group of physicists and engineers who guessed-correctly, as it turned out-that they could become fantastically rich by producing improved versions of transistors and other semiconductor devices. The group was long on technical talent and short on managerial skills, but one of the founders turned out to have both: Bob Noyce. A slender, square-jawed man who exuded the easy self-assurance of a jet pilot, Noyce had an unbounded curiosity that led him, at one time or another, to take up hobbies ranging from madrigal singing to flying seaplanes. His doctorate was in physics, and his technical specialty was photolithography, an exotic process for printing circuit boards that required state-of-the-art knowledge of photography, chemistry, and circuit design. Like Jack Kilby, Noyce preferred to direct his powerful intelligence at specific problems that needed solving, and he shared with Kilby an intense sense of exhilaration when he found a way to leap over some difficult technical obstacle. At Fairchild, though, he also became fascinated with the discipline of management, and gravitated to the position of director of research and development. In that job, Noyce spent most of his time searching for profitable solutions to the problems facing the electronics industry. In the second half of the 1950s, that meant he was puzzling over things like the optimum alloy to use for base and emitter contacts in double-diffuse transistors, or efficient ways to passivate junctions within the silicon wafer. Those were specific issues involving the precise components Fairchild was producing at the time. But Noyce also gave some thought during the winter of 1958-59 to a much broader concern: the tyranny of numbers.
Unlike the quiet, introverted Kilby, who does his best work alone, thinking carefully through a problem, Noyce was an outgoing, loquacious, impulsive inventor who needed somebody to listen to his ideas and point out the ones that couldn't possibly work. That winter, Noyce's main sounding board was his friend Gordon Moore, a thoughtful, cautious physical chemist who was another cofounder of Fairchild Semiconductor. Noyce would barge into Moore's cubicle, full of energy and excitement, and start scrawling on the blackboard: "If we built a resistor here, and the transistor over here, then maybe you could . . ."
Contents
1. The Monolithic Idea
2. The Will to Think
3. A Nonobvious Solution
4. Leap of Insight
5. Kilby v. Noyce
6. The Real Miracle
A Note about Sources
Posted September 3, 2012
Subjects like this are potentially very dull. This one is anything but. The author does an excellent job of explaining the history of the integrated circuit, and the people who invented it, in very clear language.