Human attempts to create tools to manipulate data date back at least as far as 2600 B.C., when the Chinese came up with the abacus. Leonardo da Vinci sketched a mechanical calculator. The slide rule, invented in 1621, remained the mathematician's tool of choice until the electronic calculator took over in the early 1970s.
All the early efforts to juggle numbers had two things in common: They were mechanical, and they were on a human scale, machines made of parts big enough to assemble by hand. Blaise Pascal's Arithmetic Machine used a system of hand-turned gears to do addition and subtraction. Punched cards for storing data came later, a method that survived well into the 20th century.
In 1830 Charles Babbage invented, on paper, the Analytical Engine, which was different from its predecessors because, based on the results of its own computations, it could make decisions, supporting sequential control, branching, and looping. But Babbage's machine was so complex that he died in 1871 without finishing it; a working Babbage engine was not completed until 1989-1991, when dedicated members of the Science Museum in London built one from his drawings. It was the physical size and mechanical complexity of these machines that limited their usefulness: they were good for only a few tasks, and they were not something that could be mass produced.
Mechanical devices of all types flourished modestly during the first half of the 20th century. Herman Hollerith invented a mechanized system of punched paper cards to tabulate the 1890 U.S. census. Later, in 1924, Hollerith's Computing-Tabulating-Recording Company changed its name to International Business Machines.
Although no one could have known it at the time, the first breakthrough toward the modern computer came in 1904, when John Ambrose Fleming created the first commercial diode vacuum tube, exploiting an effect Thomas Edison had discovered and discarded as worthless. The significance of the vacuum tube is that it was the first step beyond the human scale of machines. Until it came along, computations were made first by gears and then by switches. The vacuum tube could act as a switch, turning on and off thousands of times faster than mechanical contraptions.
Vacuum tubes were at the heart of Colossus, a computer created by the British during World War II to break the codes produced by the Germans' Enigma encrypting machine. The Germans reportedly came up with a general-purpose computer, one not limited to a specific task as Colossus was, but the German invention was lost or destroyed in the war.
The war also gave birth to ENIAC (Electronic Numerical Integrator Analyzer and Computer), built between 1943 and 1945 for the U.S. Army to produce trajectory tables. ENIAC performed 5,000 additions a second, although a problem that took two seconds to solve required two days to set up. ENIAC cost $500,000, weighed 30 tons, and was 100 feet long and 8 feet high. It contained 1,500 relays and 17,468 vacuum tubes.
Those same tubes that made ENIAC possible in the first place were also its Achilles' heel. Drawing some 200 kilowatts of power, the tubes turned the computer into an oven, constantly cooking its own components. Breakdowns were frequent. What was needed was something that did the job of the tubes without the heat, bulk, and fragility. And that something had been around, at least on paper, since 1926.
The semiconductor transistor was first patented in 1926, but it wasn't until 1947, when Bell Labs' John Bardeen, Walter Brattain, and William Shockley demonstrated a working, reliable solid-state transistor, that a new era in computing dawned. The transistor did essentially the same thing a vacuum tube did, control the flow of electricity, but it was the size of a pea and generated little heat. Even with the transistor, the few computers built then still used tubes. It wasn't until 1954, when Texas Instruments created a way to produce silicon transistors commercially, that the modern computer took off. That same year IBM introduced the 650, the first mass-produced computer. Business and government bought 120 of them in the first year.
Four years later, Texas Instruments built the first integrated circuit by combining five separate components and the circuitry connecting them on a piece of germanium half an inch long. The integrated circuit led to the modern processor and has made a never-ending contribution to smaller and smaller computers.
The computer grew increasingly smaller and more powerful, but its cost, complexity, and unswerving unfriendliness kept it the tool of a technological elite. It wasn't until 1975 that something resembling a personal computer appeared. The January issue of Popular Electronics featured on its cover something called the Altair 8800, made by Micro Instrumentation and Telemetry Systems (MITS). For $397, customers got a kit that included an Intel 8080 microprocessor and 256 bytes of memory. There was no keyboard; programs and data were both entered by flipping switches on the front of the Altair. There was no monitor; results were read by interpreting a pattern of small red lights. But it was a real computer cheap enough for anyone to afford. MITS got orders for 4,000 within a few weeks.
The new computer was at first a toy for hobbyists and hackers. They devised clever ways to expand the Altair and similar microcomputers with keyboards, video displays, magnetic tape, and then diskette storage. Two hackers, Steven Jobs and Steve Wozniak, created a personal computer that came complete with display, built-in keyboard, and disk storage, and began hawking it at computer clubs in California. They called it the Apple, and it was the first personal computer powerful enough, and friendly enough, to be more than a toy. The Apple, along with computers from Radio Shack and Commodore, began appearing in businesses, sometimes brought in behind the backs of the people in white lab coats who ran the "real" mainframe computers in a sealed room down the hall. Information services, or IS, as the professional computer tenders came to be called, disparaged the new computers as toys, even as they saw microcomputers as a threat to their turf.
The development that finally broke the dam, unleashing microcomputers on a society that would forever after be different, was not a technical invention. It was a marketing decision IBM made when creating its first personal computer, the IBM PC.
IBM wanted to keep the price down, and so it decided to build the computer from components that were already available, off the shelf, from several suppliers. IBM also made the overall design of the PC freely available to competitors. The only part of the machine IBM copyrighted was the BIOS, the crucial basic input/output system, computer code residing in a single chip that defined how software was to interact with the PC's hardware. The competition could create their own PCs as long as they duplicated the operations of IBM's BIOS without directly copying it.
While Apple continued to keep its design proprietary, IBM's openness encouraged the creation of IBM clones that could use the same software and hardware add-ons the PC used. And the clones, while competing with IBM, at the same time helped establish the IBM architecture as the machine that software and add-on hardware developers would design for. Precisely because the IBM PC was an evolutionary rather than revolutionary creation, it was able to create the critical mass needed to bring personal computers into every office and home...
I. BOOT-UP PROCESS. 1. Getting to Know the Hardware.
II. HOW SOFTWARE WORKS. 6. How Programming Languages Work.
III. MICROCHIPS. 10. How a Transistor Works.
IV. DATA STORAGE. 14. How Disk Storage Works.
V. INPUT/OUTPUT DEVICES. 20. How Energy Turns Into Data.
VI. MULTIMEDIA. 32. How CD-ROM and DVD Work.
VII. HOW THE INTERNET WORKS. 37. How Local Area Networks Work.
VIII. HOW PRINTERS WORK. 43. How Basic Printing Works.
We have our personal computers.
PCs, too, are powerful creations that often seem to have a life of their own. Usually they respond to a seemingly magical incantation typed at a C:> prompt or to a wave of a mouse by performing tasks we couldn't imagine doing ourselves without some sort of preternatural help. But even as computers successfully carry out our commands, it's often difficult to quell the feeling that there's some wizardry at work here.
And then there are the times when our PCs, like malevolent spirits, rebel and open the gates of chaos onto our neatly ordered columns of numbers, our carefully wrought sentences, and our beautifully crafted graphics. When that happens, we're often convinced that we are, indeed, playing with power not entirely under our control. We become sorcerers' apprentices, whose every attempt to right things leads to deeper trouble.
Whether our personal computers are faithful servants or imps, most of us soon realize there's much more going on inside those silent boxes than we really understand. PCs are secretive. Open their tightly sealed cases and you're confronted with poker-faced components. Few give any clues as to what they're about. Most of them consist of sphinx-like microchips that offer no more information about themselves than some obscure code printed on their impenetrable surfaces. The maze of circuit tracings etched on the boards is fascinating but meaningless hieroglyphics. Some crucial parts, such as the hard drive and power supply, are sealed, with printed omens about the dangers of peeking inside, omens that put to shame the warnings on a pharaoh's tomb.
This book is based on two ideas. One is that the magic we understand is safer and more powerful than the magic we don't. This is not a hands-on how-to book. Don't look for any instructions for taking a screwdriver to this part or the other. But perhaps knowing more about what's going on inside all those stoic components will make them a little less formidable when something does go awry. The second idea behind this book is that knowledge, in itself, is a worthwhile and enjoyable goal. This book is written to respond to your random musings about the goings-on inside that box you sit in front of several hours a day. If this book puts your questions to rest, or raises new ones, it will have done its job.
At the same time, however, I'm trusting that knowing the secrets behind the magician's legerdemain won't spoil the show. This is a real danger. Mystery is often as compelling as knowledge. I'd hate to think that anything you read in this book takes away that sense of wonder you have when you manage to make your PC do some grand, new trick. I hope that, instead, this book makes you a more confident sorcerer.
Posted August 13, 2000
I have used this book for the last year as a textbook for computer technology classes I taught for a continuing education program. This book goes into the right depth and gives excellent overviews for such an abbreviated course. An excellent study guide for anyone who wants to know how a PC works.
Posted February 11, 2000
I like this book and give 5 stars to Ron White. It is very informative! This book has taught me the ins and outs of computers. I think the book is great for students in school. It's the BEST book I ever read!!
Posted January 5, 2000
This is a must for the new computer owner. It covers all aspects of the computer in our lives today, from what happens when you turn it on to how email and Internet search engines work. It explains things in simple terms and gives lots of good diagrams. The CD is great for kids. If you ever wondered just what was inside the box, this is the book for you.