Beth Porter's The Net Effect explores not just how the Net evolved and what it does, but how it relates to the way we live. Most writing about the Net focuses on a particular aspect: its use for business, its driving technology, etc. This book aims at a broader target. It does contain some useful information about the How of the Internet, but it is more concerned with the Why of it.
Publisher: Intellect Books Ltd
Sold by: Barnes & Noble
File size: 1 MB
Read an Excerpt
The Net Effect
By Beth Porter
Intellect Ltd
Copyright © 2001 Intellect Ltd
All rights reserved.
section 1: the long & winding slip-road
Unlike the entrance to a real-life motorway, the slip-road to the Information SuperHighway meanders across surprising landscapes, accumulating helpful roadsigns along the way. Over the centuries systems of distance message relay were supplemented by smoke signals, drums, fire beacons, semaphore, the telegraph, Morse code, and the telephone. They all used symbolic means to send complex data transmitted as small sequential parcels, and not all served the purpose for which they were originally intended. Add the elements of electricity, transistors, and miniaturisation, and it's easy to fit the development of modern computers into the evolving story of connecting our species. Navigation tip: keep a lookout for how often we hitch a lift with the Navy.
As we begin the 21st century, most people recognise a computer in the form of a laptop or personal desk-top at home or at work. It sure didn't start like that!
Anatomy of the Elegant Box
The computer box contains a central processing unit [CPU] which connects to a display device [monitor], pointing device [mouse], various input devices [keyboard, scanner], and various output devices [printer, fax]. Let's trace how its origins as a calculating device resulted in this electronic box of delights. For that we have to re-visit Europe in the mid-17th century.
Various engineers and mathematicians had been developing analogue mechanisms allowing ever more comprehensive ranges of calculation. Notable among them was a gear-driven prototype for the first mechanical calculator built in 1623 by University of Tübingen Professor Wilhelm Schickard, a friend of the noted astronomer Johannes Kepler. The principles of Schickard's so-called 'calculating clock' [which automated addition and subtraction and provided hints for multiplication] would probably have speeded up the development of more sophisticated machines had the plans not been twice misplaced: once during the Thirty Years War, after which they remained undiscovered until 1935, and again during WWII until they reappeared in 1956.
In the 1640s, Blaise Pascal, the French mathematical prodigy and philosopher, improved on Schickard's model using a system of rising and falling weights, but the gears on his 'Pascaline' tended to stick, affecting its accuracy. In 1654 Robert Bissaker invented the mechanical slide rule, which was used extensively for hundreds of years as the key calculation device.
By 1674 Gottfried von Leibnitz, primarily remembered for inventing calculus, had devised the first semi-automated multiplication system, called the 'Stepped Reckoner,' constructed for him by a Monsieur Oliver in Paris. It could handle up to 16 digits, though it wasn't entirely reliable. During the following century, others improved somewhat on Leibnitz's design, including the Third Earl of Stanhope in England, Mathieus Hahn, and J. H. Mueller, the latter two from Germany.
Then in 1820, a Frenchman Charles de Colmar completed the first mass-produced calculator, which he dubbed the 'Arithmometer.' A commercial success, it remained in use until the turn of the century. Of course it could add and subtract. It also improved on Leibnitz's multipliers and even allowed elementary division.
All these mathematical functions were incorporated into the early analogue computation systems of the late 19th century, first using elaborate rotating gears and shafts, then electrical and hydraulic systems rather than numerical input. They were put to a variety of civil and military uses. For example, British physicist Lord Kelvin devised such a mechanical system to predict tide times, and by WWI similar systems could variously foretell submarine torpedo courses or control aircraft bombsights.
It wasn't until the 1830s that Charles Babbage and the Countess of Lovelace theorised about automating the process with the Analytical Machine or Engine, which could theoretically handle up to 40 digits.
The idea was to devise a more automated mechanism than the slide rule, one which could solve any mathematical problem. They hoped to use thin boards punched through with holes in strategic places, similar to the pattern boards of the Jacquard loom, then employed in the weaving industry. These punch-boards stored data which could then be used in various mathematical calculations. Babbage estimated addition speeds of a few seconds, more complex calculations in under 5 minutes.
The Countess, whom Babbage referred to as "The Enchantress of Numbers," published an analysis of the Engine, declaring it would "weave algebraic patterns just as the Jacquard loom weaves flowers and leaves." She went on to theorise many of the basics still employed in computer programming, including memory storage, data analysis, sub-routines and looping [which she compared to a "snake biting its tail"]. The Countess, Augusta Ada King, was the daughter of the great Romantic poet Lord Byron, and it is after her that the computer language Ada is named. Sadly the Analytical Machine would have required the power of five steam engines and taken up an entire football pitch, so Babbage and the Countess never saw it built.
The principles, however, had been outlined, and it was left to the 29-year-old American Herman Hollerith to patent a mechanism in 1889 which calculated census data by passing a series of punch-cards through an electrical contact. Seven years later Hollerith founded the Tabulating Machine Company; its German licensee, the Deutsche Hollerith Maschinen Gesellschaft [Dehomag], became a subsidiary of the conglomerate renamed the International Business Machines Corporation in 1924. IBM was born.
Another vital contribution had already been introduced by British mathematician George Boole, who, in the 1850s, devised an algebraic system using only two digits, one and zero, which is why it's known as the binary system. Reducing the digits to two allows symbolic representation via switches in either the on or off position. Such switches can be connected to a variety of storage, transmission and/or recording devices.
However complex the combination of numbers, they're represented by ones and zeros, commanding the position of switches to relay the information. And, since the information can represent any kind of data [words, numbers, images, etc], this Boolean logic is what still drives data transfer in today's computers.
The millions of minute on/off switches mounted on a microchip are now called bits, eight of which compose each byte. The speed of data transfer is measured in bits per second. This forms the basis of representing and manipulating information in small groups or packets, much like the smoke signal or telegraph principle. Complementary to the switching system was the invention in 1907 of the triode vacuum tube by Lee De Forest. The tube or valve was essentially an amplification device for radio, telegraphy, and telephony. All automated computing devices relied on such valves prior to transistors.
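The Boolean principle described above can be made concrete in a few lines. This is only an illustrative sketch (the message and the function name are invented for the example): any data, here two letters of text, reduces to rows of on/off switch positions, eight to a byte.

```python
def to_bits(data: bytes) -> str:
    """Render each byte as its eight on/off switch positions (1 = on, 0 = off)."""
    return " ".join(f"{byte:08b}" for byte in data)

message = "Hi".encode("ascii")   # two characters, hence two bytes
print(to_bits(message))          # prints: 01001000 01101001
```

Whether those sixteen switch positions represent letters, part of a number, or a fragment of an image depends entirely on how the receiving device is told to interpret them, which is exactly why Boolean logic can carry any kind of data.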
After serving in the US Navy as a research engineer during WWI, a scientist and educator named Vannevar Bush invented the differential analyser, a more sophisticated automated solution for complex maths problems. He later chaired the National Defense Research Committee for the US government, which as we'll see contributed to the development of much more than a range of calculation systems.
So, given the long and cumulative evolution of its various functional components, instead of approaching the Net with trepidation as something alien and impenetrable, we can assign this electronic delivery system its place among all the other communications networks which our ancestors devised. Unlike all the others, however, this one allows us to engage in a multiplicity of simultaneous data transfer often referred to as multi-media, since it encompasses various combinations of text, speech, music, and both still and moving images. It's worth highlighting how it evolved to manage all that, if only to underline how far it's come from its original purpose.
* * *
The War That Changed Everything
As well as all the urgent and military-inspired technological advance documented below, World War II effected significant social change. This, more than any other organised conflict since pre-historical tribal clashes, saw merit, expertise and intellect valued more highly than rank, class and privilege. How people behaved and the results of assignments began to matter more than their ancestry, what school they'd attended, or their bank balance. Particularly for the Allies, though not universally, the axiom proved truer the longer the war raged. When peace finally came, these seeds of social change sprouted unexpectedly in the field of computer and Internet development.
In 1936 a brilliant mathematician at England's Cambridge University, Dr Alan Turing, as fascinated by the process of problem-solving as Vannevar Bush, published a paper setting out the principles of a universal computing device, eponymously known as Turing's Universal Machine.
For his biographer Andrew Hodges, Turing's "total originality lay in seeing the relevance of mathematical logic to a problem originally seen as one of physics.
In this paper, as in so many aspects of his life, Turing made a bridge between the logical and the physical worlds, thought and action, which crossed conventional boundaries." Both Bush and Turing figure prominently in our story. Each also illustrates the very different support systems of their respective countries.
As the conflict of 1939 escalated fatefully into World War II, communications over vast distances became vital to coordinate activities for both sides. A safer, more sophisticated system was required than had served during the previous War, when 80,000 homing pigeons became military messengers for the British Air Force. The new military command units required a method of closed communications between high level field reporting and the collective HQs. Capitalising on Marconi's transmission of wireless messages between Italy and the UK at the start of the century, both the Allies and the Axis set up closed telephone networks to allow such private conversations. In effect, this was the prototype for the principle of the Internet.
Though precise information about developments in relevant countries is still protected under Official Secrets legislation, we know several European military agencies had long been funding research on various forms of encryption devices, decoders, calculators, and processors. Particularly in America, much of the research was assigned to university departments, cementing a relationship between the military and academia which has stimulated decades of ethical debate tabulating the moral price of technological progress.
With the war in 1914 had come an imperative to create, intercept and interpret military coding systems. Instead of the ad hoc employment of espionage networks used by governments for centuries, Military Intelligence became official. Of course several European countries had already set up top-secret codebreaking units, with a watchful eye on German re-armament after their WWI defeat. The British operation, established by Naval Intelligence in 1919, housed a cryptography team of over 50 personnel in Room 40 at the Admiralty. The unit went by the almost cosy-sounding title of the Government Code and Cypher School, or GCCS, which was rapidly given the sobriquet Golf, Cheese and Chess Society. Three years later the Foreign Office took control of its activities. We'll soon see how valuable it became.
The US State Department, too, established MI-8. Referred to as the Black Chamber, it was based in New York and headed by Herbert Yardley. The unit, under the aegis of the Army, successfully decrypted various foreign diplomatic codes, including the Japanese, until it was suspended in 1929. After Yardley's departure, William Friedman led the team of cryptanalysts; he is now honoured as the Father of American Cryptology.
Meanwhile, the US Navy had addressed similar problems since the 1920s. Cryptanalysts Laurance Safford and his civilian associate Agnes Driscoll focused on deciphering naval encryption systems, recruiting a support team to crack WWII Japanese naval codes.
For encryption purposes, the American Army had been working with the Choctaw tribe to produce secure voice communications on the battlefields of WWI. By the time the US entered the Second World War, many Native Americans were recruited for similar purposes, including Cherokees, Choctaws, Comanches, Hopis, Kiowas, Navahos, Seminoles, and Winnebagos.
The Army's sister-force the Marines also used what they called codetalkers, relying exclusively on members of the Navaho tribe. It's worth noting these tribesmen had a 100% record in transmitting messages which were indecipherable to the enemy. Their invulnerability was primarily due to the Navahos' phenomenal ability to keep all the coding templates in their heads – since nothing was written down, the Axis had no means of deciphering.
In 1922 Arthur Scherbius had developed a highly complex mechanical coding system called the Enigma Machine for use by banks and railways, which German Military Intelligence bought for the Navy in 1926. Its various stages of development are crucial to the story of how computers evolved into devices sophisticated enough to service the Internet.
The fundamental principle of the Enigma is a series of code letters, sequentially displayed on notched wheels called rotors. These are linked to another series of letter equivalents or cribs, with the proviso that no letter can ever equal itself. The greater the number of rotors, the greater the number of coding permutations. Once again, I want to concentrate on the inspirational process rather than the intricacies of the hardware.
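The "no letter can ever equal itself" proviso is worth pausing on, since it was the very property Allied codebreakers later exploited. A toy sketch of the principle (this is a simple reciprocal substitution invented for illustration, not a model of Enigma's rotors):

```python
import string

ABC = string.ascii_uppercase
# Pair each letter with the one 13 places along (A<->N, B<->O, ...):
# a reciprocal substitution in which no letter can map to itself.
PAIRED = ABC[13:] + ABC[:13]
TABLE = str.maketrans(ABC, PAIRED)

def substitute(text: str) -> str:
    """Apply the paired-letter substitution."""
    return text.translate(TABLE)

ciphertext = substitute("ATTACK")
print(ciphertext)  # prints: NGGNPX

# The two defining properties of such a pairing:
assert all(a != b for a, b in zip("ATTACK", ciphertext))  # no letter equals itself
assert substitute(ciphertext) == "ATTACK"                 # applying it twice decodes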
With customary efficiency, extensive documentation was prepared to accompany Enigma, including set-up instructions and coding examples. Whether by fair means or foul, a set of these papers was obtained by an ambitious clerk in the German army. In 1931 he actually sold them to the French Secret Service, where their significance was totally ignored by the cryptography unit. When he later approached the GCCS, the clerk didn't even make a sale, because the received wisdom in the UK was that Enigma simply couldn't be cracked. This was an error which added years to the ensuing war.
Both civilian and military versions of Enigma were manufactured and some were sold abroad – in fact one was bought by the Polish government. Meanwhile a quite different German machine, the V1 [later renamed the Z1], was the brainchild of an engineer named Konrad Zuse and his assistant Helmut Schreyer, credited by many as the inventors of the first mechanical binary programmable calculator; it read its instructions from holes punched in redundant 35mm celluloid.
In Poland [already in possession of a commercial Enigma machine] Marian Rejewski led a small decryption unit which broke the German code in 1932 with a system they called a Bomba, allegedly because it made a ticking sound. Though their contribution to the story cannot be overestimated, their success served only to encourage even more complexities of Enigma.
As war approached in 1939 the Poles, with ever increasing reasons to foil German efforts, passed on their deciphering work to the British during a secret meeting near Pyry. But the Allies had their work cut out. Throughout WWII the Germans employed different versions of Enigma for each of their military services; each version depended on its own codes. Cracking any one code wouldn't get you very far, because the decryption keys were changed every day. These keys depended on a manual setting of which ciphers represented which letters, itself a complex and random process. Subsequent Enigma incarnations boasted even more intricate rotor and cable mechanisms. By 1941 Zuse, backed by the German Aeronautical Research Institute, had improved on his earlier civil prototype, applying the same concepts of binary mathematics and Boolean logic to produce a programmable electronic calculator.
At its most complex, the Enigma produced 150 trillion coding permutations! The Axis Powers were thrilled – Hitler's war machine enjoyed every strategic advantage. Archived memos of the time reveal the German command was convinced that Enigma rendered them invincible. It certainly was an elegant system, albeit not in appearance.
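The headline figure can be checked with a few lines of arithmetic. This sketch assumes the commonly documented Army configuration of three rotors chosen from five plus a ten-lead plugboard; the numbers are standard textbook values rather than anything drawn from the excerpt itself:

```python
from math import factorial, perm

rotor_orders = perm(5, 3)    # choose and order 3 of 5 rotors: 60 ways
rotor_positions = 26 ** 3    # starting position of each rotor: 17,576 settings

# Plugboard: pick 20 of 26 letters and join them into 10 unordered pairs.
plugboard = factorial(26) // (factorial(6) * factorial(10) * 2**10)

print(f"{plugboard:,}")
# prints: 150,738,274,937,250 -- roughly the "150 trillion" cited in the text,
# which corresponds to the plugboard pairings alone

print(f"{rotor_orders * rotor_positions * plugboard:.3e}")
# prints: 1.590e+20 -- the full key space once rotor choices are included
```

In other words, the "150 trillion" permutations were contributed by the plugboard by itself; multiplying in the rotor orderings and starting positions pushes the total key space to roughly 1.6 × 10²⁰, which goes some way to explaining the German command's confidence.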
Excerpted from The Net Effect by Beth Porter. Copyright © 2001 Intellect Ltd. Excerpted by permission of Intellect Ltd.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Table of Contents
introduction: the Janus approach
section 1: the long & winding slip-road
section 2: clear-eyed acumen & blind dreams
section 3: the da Vinci syndrome
section 4: the internet experience
coda: quo vadis