How Computers Work with CDROM, Millennium Edition

by Ron White, Timothy Edward Downs, Stephen Adams

Overview

How Computers Work, Millennium Edition, shows you how chips, software, memory, and hardware work using detailed four-color drawings. An animated, full-color CD-ROM takes you directly inside your computer. This book is completely updated and revised to include the latest technology developments, including the Internet, multimedia sound and video, Pentium III, Xeon, and Celeron processors, DVD drives, digital cameras, color printing, Y2K, and ebooks.

Product Details

ISBN-13: 9780789721129
Publisher: Que
Publication date: 09/22/1999
Series: Computers Series
Pages: 432
Product dimensions: 8.00(w) x 10.02(h) x 0.93(d)

Read an Excerpt


Part 1: Boot-Up Process

Overview

...How Computers Used to Work

At the end of the 20th century, computers are such complex beasts, despite their relative youth, that it's difficult to imagine how such elaborate contraptions could have sprung fully grown from the brows of their creators. The explanation is, of course, that they didn't. The development of computers has been an evolutionary process, and often it's well nigh impossible to figure out which came first, the chicken of software or the egg of hardware.

Human attempts to create tools to manipulate data date back at least as far as 2600 B.C., when the Chinese came up with the abacus. Leonardo da Vinci sketched a mechanical calculator. The slide rule, invented in 1621, remained the mathematician's tool of choice until the electronic calculator took over in the early 1970s.

All the early efforts to juggle numbers had two things in common: They were mechanical, and they were on a human scale, machines made of parts big enough to assemble by hand. Blaise Pascal's Arithmetic Machine used a system of gears turned by hand to do addition and subtraction. Punch cards, another early method of storing data, survived well into the 20th century.

In 1830 Charles Babbage invented, on paper, the Analytical Engine, which was different from its predecessors because, based on the results of its own computations, it could make decisions, employing sequential control, branching, and looping. But Babbage's machine was so complex that he died in 1871 without finishing it; a working version was finally built between 1989 and 1991 by dedicated members of the Science Museum in London. It was the physical size and complex mechanics of these mechanisms that limited their usefulness; they were good for only a few tasks, and they were not something that could be mass produced.

Mechanical devices of all types flourished modestly during the first half of the 20th century. Herman Hollerith invented a mechanized system of paper cards with holes in them to tabulate the U.S. census. Later, in 1924, Hollerith's Computing-Tabulating-Recording Company changed its name to International Business Machines.

Although no one could have known it at the time, the first breakthrough toward the modern computer came in 1904, when John Ambrose Fleming created the first commercial diode vacuum tube, something Thomas Edison had discovered and discarded as worthless. The significance of the vacuum tube is that it was the first step beyond the human scale of machines. Until it came along, computations were made first by gears and then by switches. The vacuum tube could act as a switch, turning on and off thousands of times faster than mechanical contraptions.

Vacuum tubes were at the heart of Colossus, a computer created by the British during World War II to break the codes produced by the Germans' Enigma encrypting machine. The Germans themselves reportedly came up with a general-purpose computer, one not limited to a specific task as Colossus was, but their invention was lost or destroyed in the war.

The war also gave birth to ENIAC (Electronic Numerical Integrator Analyzer and Computer), built between 1943 and 1945 by the U.S. Army to produce missile trajectory tables. ENIAC performed 5,000 additions a second, although a problem that took two seconds to solve required two days to set up. ENIAC cost $500,000, weighed 30 tons, and was 100 feet long and 8 feet high. It contained 1,500 relays and 17,468 vacuum tubes.

The same tubes that made ENIAC possible in the first place were also its Achilles' heel. Consuming some 200 kilowatts of power, the tubes turned the computer into an oven, constantly cooking its own components. Breakdowns were frequent. What was needed was something that did the job of the tubes without the heat, bulk, and fragility. And that something had been around since 1926.

In 1926 the first semiconductor transistor was invented, but it wasn't until 1947, when Bell Labs' William Shockley patented the modern, reliable solid-state transistor, that a new era in computing dawned. The transistor did essentially the same thing a vacuum tube did, controlling the flow of electricity, but it was the size of a pea and generated little heat. Even with the transistor, the few computers built then still used tubes. It wasn't until 1954, when Texas Instruments created a way to produce silicon transistors commercially, that the modern computer took off. That same year IBM introduced the 650, the first mass-produced computer. Business and the government bought 120 of them in the first year.

Four years later, Texas Instruments built the first integrated circuit by combining five separate components and the circuitry connecting them on a piece of germanium half an inch long. The integrated circuit led to the modern processor and has made a never-ending contribution to smaller and smaller computers.

The computer grew increasingly smaller and more powerful, but its cost, complexity, and unswerving unfriendliness kept it the tool of a technological elite. It wasn't until 1975 that something resembling a personal computer appeared. The January issue of Popular Electronics featured on its cover something called the Altair 8800, made by Micro Instrumentation and Telemetry Systems (MITS). For $367, customers got a kit that included an Intel 8080 microprocessor and 256 bytes of memory. There was no keyboard; programs and data were both entered by flipping switches on the front of the Altair. There was no monitor. Results were read by interpreting a pattern of small red lights. But it was a real computer cheap enough for anyone to afford. MITS got orders for 4,000 within a few weeks.

The new computer was at first a toy for hobbyists and hackers. They devised clever ways to expand the Altair and similar microcomputers with keyboards, video displays, magnetic tape, and then diskette storage. Two hackers, Steve Jobs and Steve Wozniak, created a personal computer that came complete with display, built-in keyboard, and disk storage, and began hawking it at computer clubs in California. They called it the Apple, and it was the first personal computer that was powerful enough, and friendly enough, to be more than a toy. The Apple, along with computers from Radio Shack and Commodore, began appearing in businesses, sometimes brought in behind the backs of the people in white lab coats who ran the "real" mainframe computers in a sealed room down the hall. The information services staff, or IS, as the professional computer tenders came to be called, disparaged the new computers as toys, yet at the same time saw microcomputers as a threat to their turf.

The development that finally broke the dam, unleashing microcomputers on a society that would forever after be different, was not a technical invention. It was a marketing decision IBM made when creating its first personal computer, the IBM PC.

IBM wanted to keep the price down, and so it decided to build the computer from components that were already available, off the shelf, from several suppliers. IBM also made the overall design of the PC freely available to competitors. The only part of the machine IBM copyrighted was the BIOS, the crucial basic input/output system, computer code residing in a single chip that defined how software was to interact with the PC's hardware. The competition could create their own PCs as long as they duplicated the operations of IBM's BIOS without directly copying it.
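
To make the BIOS's role concrete, here is a minimal sketch, not taken from the book, of how a DOS-era program asked the BIOS, rather than the video hardware directly, to display a character. It assumes an old real-mode C compiler in the Turbo C mold, whose <dos.h> header supplies union REGS and the int86() call for raising software interrupts; the helper name bios_putchar is made up for illustration. Because programs reached the hardware through interrupts like this one, a clone maker's BIOS had to answer the same interrupt numbers with the same register conventions for PC software to run unchanged.

/* Illustrative sketch (assumes a DOS-era compiler such as Turbo C):
   printing one character through the BIOS video services, interrupt 10h,
   function 0Eh ("teletype output"). The program only fills in CPU
   registers and raises the interrupt; the BIOS code in ROM does the
   hardware-specific work. */
#include <dos.h>

void bios_putchar(char c)              /* hypothetical helper name */
{
    union REGS regs;
    regs.h.ah = 0x0E;                  /* BIOS video function 0Eh: teletype output */
    regs.h.al = c;                     /* character to display */
    regs.h.bh = 0;                     /* display page 0 */
    int86(0x10, &regs, &regs);         /* software interrupt into the BIOS */
}

int main(void)
{
    bios_putchar('A');                 /* writes 'A' via the BIOS, not the video card */
    return 0;
}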

While Apple continued to keep its design proprietary, IBM's openness encouraged the creation of IBM clones that could use the same software and hardware add-ons the PC used. And the clones, while competing with IBM, at the same time helped establish the IBM architecture as the machine that software and add-on hardware developers would design for. Precisely because the IBM PC was an evolutionary rather than revolutionary creation, it was able to create the critical mass needed to bring personal computers into every office and home...
