Digital Revolutionaries: The Men and Women Who Brought Computing to Life

by Steve Lohr

eBook

$11.99 

Overview

In the beginning, there was the computer. And it was big. As big as a room. Sometimes as big as a house. Early computers required teams of white-coated scientists to keep them running, yet one of those behemoths could not match the computing power of a single microchip today. From the first massive computers to today's nanotechnology, DIGITAL REVOLUTIONARIES offers a guided tour of the history of computers and a celebration of the human ingenuity that led the world from ENIAC to iMac.


Product Details

ISBN-13: 9781429946285
Publisher: Roaring Brook Press
Publication date: 09/29/2009
Series: New York Times
Sold by: Macmillan
Format: eBook
Pages: 176
File size: 1 MB
Age Range: 10 - 14 Years

About the Author

STEVE LOHR, a senior writer and technology reporter for the NEW YORK TIMES, is a leading expert on the corporations that dominate computing today. In 1998, he was nominated for a Pulitzer Prize for his coverage of Microsoft. He lives in New York City.

Read an Excerpt

Digital Revolutionaries

The Men and Women Who Brought Computing to Life


By Steve Lohr

Roaring Brook Press

Copyright © 2009 The New York Times
All rights reserved.
ISBN: 978-1-59643-532-2



CHAPTER 1

New Geeks and Nemo


Ever wonder how computers work? What goes on behind the screen and the keyboard? If so, you share important traits with the pioneers of the digital revolution. Curiosity and the determination to push beyond the limits of current technology are qualities these computer revolutionaries embody. Without their questioning minds, we might still have the room-sized computing machines of the 1940s instead of today's microcomputers. Cars might not have become the computers on wheels they are now, and we might not be able to download an MP3 file to a laptop and transfer it to an iPod. Would the iPod even exist? For every technological puzzle that begs to be solved, there is a computing "geek" who will try to solve it simply because it is there.

Long before people were called computer nerds, Charles Simonyi was one, as a teenager in Budapest in the 1960s. Things were very different back then. Computers didn't play music or movies. They were big machines, and they were scarce. In Communist Hungary, the government owned most computers and, through a family friend, Simonyi got an unpaid job in the computer center of a government agency. As a child, he had always been interested in mechanical things. He loved Erector sets, but parts were often hard to get in Hungary, so he built imaginary machines with the pieces he could get.

The government computer Simonyi worked on had no mouse and no screen with easy-to-use graphic icons like the kind we point to and then click to make today's personal computers perform tasks. To instruct the Russian-made mainframe to do anything, Simonyi had to master the arcane art of speaking to the computer in a language understood by the machine: programming code. It was frustrating at first, but Simonyi had a real knack for it, and he was thrilled that he could program the computer to do everything from calculating statistical plans for the government to playing games of tic-tac-toe and chess for fun. "In a way, I found my ultimate Erector set — an Erector set without limits," he recalled.

Simonyi was a strong-willed young man and an independent thinker. He despised the restraints on individual freedom under communism, and his computer skills would serve as his passport to the West. At seventeen, he left for a temporary job in Denmark, but he had no plans to return. His parents knew of his intention, and the consequences. He would not see his family again for more than twenty years. Simonyi arrived in Denmark with no money and few possessions. Still, he had a job and talents that would grow and become increasingly valuable over the years.

Simonyi immigrated to California as a student, attending the University of California at Berkeley and then earning a doctorate at Stanford University. He paid his way by working computer jobs on the side. In the 1970s, he was a researcher at the Xerox Palo Alto Research Center, a hothouse of ideas that later found their way into successful products.

In 1980, Simonyi showed up at a fledgling company outside Seattle: Microsoft. His reputation and work at the Xerox research center got him an immediate job offer from Bill Gates, and Simonyi became one of the early employees at Microsoft. At the Xerox lab, Simonyi had developed an innovative program for creating documents that made it possible for words to be typed in all sorts of sizes and styles, and for graphics to be inserted on the screen. At Microsoft, Simonyi's ideas were put into Microsoft Word, the writing and editing program that is one of the most widely used software products in the world.

When I visited Simonyi a few years ago, we met at his house, a striking modern home of glass, steel, and wood that sweeps down a hillside to the edge of Lake Washington, near Seattle. His house has its own library, fitness center, swimming pool, and computer lab. On the walls hang original works by modern artists like Roy Lichtenstein and Jasper Johns. Besides art, Simonyi collects jets. He has two, including a NATO fighter jet, which he flies. He's even flown to the International Space Station as a space tourist. He has made multimillion-dollar philanthropic donations to Oxford University, the Institute for Advanced Study in Princeton, and elsewhere. He is a billionaire, and he owes it all to his extraordinary ability to talk to computers in the language of programming code — along with good timing and good fortune.

His life story personifies the rise of modern computing over the past fifty years, as it evolved from a lab experiment to a huge, rich industry, and from a technology for an elite few to one used by everyone. Even today, Simonyi thinks computers are still far too difficult for ordinary people to program. The computer revolution, he insists, is just getting underway.

Still, Simonyi cannot help but marvel at the pace of progress in a few decades, a very short time in the sweep of history. "Even with the primitive tools we use, look at how much computing can do," he observed. "It's amazing."

It's ironic that computers, developed by independent thinkers and revolutionaries, have become so commonplace, so much a part of the fabric of everyday life, that they often go unnoticed, as if hiding in plain sight. You may not see them, but chances are, they're there. They surround us: not just the personal computers at home and school, but also cell phones, handheld music players, televisions, kitchen appliances, and supermarket cash registers, all computing devices of one kind or another. Cars these days rely as much on software and semiconductors as on gears and pistons. The list goes on and on.

Not so long ago, however, computers were rare things, indeed. At the dawn of modern computing in the 1940s and 1950s, computers were hulking room-sized machines. They required teams of white-suited scientists to keep them up and running. The scientists fixed them by prowling around inside the machine, seeing where some wire had come loose and plugging it into the correct circuit by hand. Yet one of those room-sized mechanical behemoths could not match the computing power of a single microchip no larger than the tip of your finger today.

As computers have become smaller, their influence has become greater and greater. Almost anywhere you look, the impact of computing is being felt, more so all the time, and often in ways we don't fully appreciate. We take e-mail, instant messages, and Web-based social networks, such as Facebook, for granted. Yet these computing tools subtly shape how we communicate and deal with each other.

In one field after another, computing plays an increasing role. In medicine, for example, if you get a potentially serious injury playing soccer or basketball, chances are the doctor will take a picture of the inside of the injured arm or leg. Those images are captured, refined, and analyzed using computer technology. Only in the last decade has the use of that medical imaging technology become routine. In biology, another science transformed by computing, human genome research seeks to explore the mysteries of life based on the understanding that DNA, which carries our genetic information, is much like computer code. The machine code that a computer processes is digital, ones and zeros, while DNA has four chemical building blocks, designated A, G, C, and T. Humans, to be sure, are not computers. But there are some profound similarities: Humans process coded genetic information, while computers process digital information in code.

Computer-generated simulations are vital tools in the hunt for new drugs, in oil exploration, in economic modeling, and in weather prediction. These simulations are much like "The Sims" computer game, where assumptions about "cyberpeople" on the screen — what they do and how they spend their time — determine how a virtual city unfolds.

In drug discovery, the molecular structure of experimental compounds and how they might interact in the body are simulated. In oil exploration, the known geologic information about rock formations, heat, and pressure is fed into computers to create an underground picture that suggests where, millions of years ago, oil might have formed — and thus where to drill. In economics, computer simulations are used to predict how people and markets might behave if the price or supply of goods is changed. Early warnings of global warming came from a NASA climate scientist, based on his computer simulations.

Our popular culture runs on computers, from music to movies. In Hollywood, when a director needs a killer storm, the Roman Colosseum, King Kong, flesh-eating raptors, or a superhero-caliber stunt, it's no problem. All those and much more have been conjured up to frighten and thrill thanks to the magic of computer-generated special effects.

Computing has also permanently changed the practice of politics. The Web is a democratic, bottom-up technology that resists control. A Web log, or blog, can be created by anyone with a personal computer, access to the Internet, and a little skill. In a democracy, such as the United States, bloggers represent a new voice in the nation's political debate. But the authorities in places like China play a constant cat-and-mouse game with political bloggers, shutting down blogs that then pop up again a short time later. Blog posts can be mean-spirited or untrue, but they can also be insightful and uncover new information. Politicians, newspapers, and television networks all must pay attention to what is being said in the "blogosphere" — another computing innovation.

The benefits are many. When a friend moves to another state or another country, thanks to the Internet we can easily stay in touch in a way that was unimaginable in the days of paper letters sent by regular mail, delivered days or weeks after they were written. But the convenience and immediacy of Internet communications don't come free. There is a cost, or at least a social risk. A thoughtless remark, an insult, or a lie can last a long time in an e-mail message, on a blog, or on a Web site posting, souring a friendship, tarnishing someone's reputation, or worse. The same words, if spoken, might be missed or quickly forgotten.

All tools can be used for good or ill. A hammer can be used to build a house or as a weapon. It depends not on the tool but on the person using it. The same is true, of course, of computer tools. The Internet, for example, makes it easy to reach out and touch someone. But when we do reveal ourselves to others, we become increasingly accessible to strangers. Still, the computer and the global computer network, the Internet, are universal tools — their uses limited mainly by the imagination of the people using and programming them. Luckily for us, those computer scientists whose imagination propels technology forward are eager to envision the possible uses for these tools and work to develop the hardware and software even before they have an exact map of how those tools will be used. More often than not, they lead technology rather than follow it.

Computer science is not just for engineers anymore. Today's technologists can be found in medicine, law, media, and the arts as well as all the sciences. These "new geeks" are a world apart from the old stereotype of the computer nerd as a cubicle drone more comfortable with machines than with people.

Kira Lehtomaki is one of these young technologists. For her, the appeal of computing was art, not science. Art projects, not computers, were her childhood passion. And when, as a three-year-old, she first saw a videotape of the Disney film Sleeping Beauty, she decided what she wanted to do when she grew up. She wanted to be one of those artists who stirred life into the animated characters in movies using pencils, paper, and imagination — an animator.

The dreams of most three-year-olds are soon forgotten, but not Kira's. Growing up, she drew constantly, studied art, and read all the books she could find on animation and how it was done at the Walt Disney Company. Disney created animated movies, starting with Snow White and the Seven Dwarfs in 1937, and defined the art form for the next sixty years with classics ranging from Pinocchio and Dumbo to Beauty and the Beast and The Lion King. One summer, Kira even took a job at Disneyland in southern California as a "cookie artist" — painting designs and Mickey Mouse faces — because that job allowed her to spend a couple of days watching how things were done at Disney's animation studio.

The Disney tradition was the pencil-and-paper way. But by the late 1990s, hand-drawn animation was being challenged by something new — computer-generated animation. Kira resisted the new technology at first, but was won over when, as a nineteen-year-old college student, she saw Monsters, Inc. in 2001, the fourth feature film from the pioneer of computer animation, Pixar Animation Studios.

"I was just blown away," she recalled, by the range of expression and movement in the buddy film, as the hulking, hairy Sulley and his one-eyed sidekick, Mike, led a benevolent revolt at their scream factory.

Kira took up computer graphics in college. And at twenty-three, she was a researcher at the University of Washington's animation labs in Seattle and traveling regularly to Northern California to train with the animators at Pixar. "These two worlds of art and computing are really merging, and, if anything, they will blend even more," she observed. (Kira later became an animator for Disney, and her first credit was for the movie Bolt in 2008.)

Some years ago, I spent several days at Pixar's waterfront offices along the San Francisco Bay. The people and corporate culture of Pixar show how computer technology is spreading and evolving into hybrids of computing and something else — in this case, art and entertainment. Pixar was founded and built by people who deeply understood that computers were powerful creative tools instead of merely big number-crunching calculators. They sustained their belief for years, when many others doubted. And they were proven right, again and again. For its part, Disney was so impressed by Pixar's track record that it bought the company in early 2006 for more than seven billion dollars. Yet the Pixar sale was essentially a reverse takeover: Pixar's leaders were put in charge of overseeing Disney's animation studio, which had fallen behind.

When I visited Pixar, it was clear the place was designed to encourage creativity and fun. The hallways were marked with blue plastic signs bearing names like Toy Boulevard and Scooter Run. In fact, workers periodically zipped by on white scooters, ringing bicycle bells. Animators and artists worked in partitioned cubicles, each decorated differently and each lit by the glow from a computer screen.

Old couches, lava lamps, Art Deco mirrors, and Japanese screens were all part of the mix. Playful artillery fire — from Nerf ball bazookas — whizzed by. Humor was in the air. For his telephone message, Andrew Stanton, the co-director of A Bug's Life, who would later direct Finding Nemo and WALL-E, spoke in a mock Eastern European accent, with eerie music playing in the background: "This is Andrew Stanton's psychic hotline. Since I already know why you're calling, you can hang up at the sound of the beep."

The head of Pixar's creative team was John Lasseter, a former Disney animator. He grew up in Whittier, California, a short ride from the Disneyland theme park in Anaheim. As a teenager, he worked as a boat skipper on the Jungle Cruise. Disney blood, Lasseter often says, runs through his veins.

Lasseter's office had the look of a cluttered child's room. A gumball machine stood beside his desk. Big boxes of Cheerios™ and Lucky Charms™ cluttered the desktop. A collection of vintage toys — original versions of Mr. Potato Head™, Slinky™, and others — rested on a shelf. Lasseter, a round-faced man with oval glasses, wore a short-sleeved shirt adorned with cartoon characters. He took a child's delight in showing off his toys. "Isn't that one great?" he said, pointing to a favorite.


(Continues...)

Excerpted from Digital Revolutionaries by Steve Lohr. Copyright © 2009 The New York Times. Excerpted by permission of Roaring Brook Press.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Excerpts are provided by Dial-A-Book Inc. solely for the personal use of visitors to this web site.

Table of Contents

1 New Geeks and Nemo
2 Poppa and Peanut Butter Sandwiches
3 From BASIC to Billionaires
4 Icons and iPods
5 "Some Amazing Things Are Going to Happen"
Timeline
Articles
Source Notes
Photo Credits
Further Reading
Acknowledgments
Index
