Understanding Information Transmission
Overview

Understanding Information Transmission introduces you to the entire field of information technology. In this consumer handbook and introductory student resource, seven chapters span the gamut of the field—the nature, storage, transmission, networking, and protection of information. In addition to the science and technology, this book brings the subject alive by presenting the amazing history of information technology, profiling incredible inventions and fascinating inventors, and their dramatic impact on society. Features include problem sets, key points, suggested reading, review appendices, and a full chapter on mathematical methods. Private and public funding of information technology continues to grow at staggering rates. Learn what’s behind this race to be the biggest, brightest, and fastest in the field with Understanding Information Transmission.

Product Details

ISBN-13: 9780471679103
Publisher: Wiley
Publication date: 03/04/2005
Series: IEEE Press Understanding Science & Technology Series, #18
Pages: 320
Product dimensions: 6.10(w) x 9.20(h) x 0.60(d)

About the Author

JOHN B. ANDERSON holds the Ericsson Chair in Digital Communications at Lund University, Sweden. He was formerly a professor at Rensselaer Polytechnic Institute. He is a Fellow of the IEEE and is a recipient of the Humboldt Research Prize and the IEEE Third Millennium Medal. His research and consulting practice concentrates on communication algorithms and bandwidth-efficient coding.

ROLF JOHANNESSON is Professor of Information Theory at Lund University, Sweden, and a Fellow of the IEEE. He was awarded the honor of Professor, honoris causa, from the Institute for Information Transmission Problems, Russian Academy of Sciences. His research interests include information theory, error-correcting codes, and cryptography.

Read an Excerpt

Understanding Information Transmission


By John B. Anderson and Rolf Johannesson

John Wiley & Sons

Copyright © 2005 Institute of Electrical and Electronics Engineers, Inc.
All rights reserved.

ISBN: 0-471-67910-0


Chapter One

Introduction: First Ideas and Some History

This book is about the transmission of information through space and time. We call the first communication. We call the second storage. How are these done, and what resources do they take?

These days, most information transmission is digital, meaning that information is represented by a set of symbols, such as 1 and 0. How many symbols are needed? Why has this conversion to digital form occurred? How is it related to another great change in our culture, the computer?

Information transmission today is a major human activity, attracting investment and infrastructure on a par with health care and transport, and exceeding investment in food production. How did this come about? Why are people so interested in communicating?

Our aim in this book is to answer these questions, as much as time and space permit. In this chapter we will start by introducing some first ideas about information, how it occurs, and how to think about it. We will also look at the history of communication. Here lie examples of the kinds of information we live with, and some ideas about why communication is so important. The later chapters will then go into detail about communication systems and the tools we use in working with them. Some of these chapters are about engineering and invention. Some are about scientific ideas and mathematics. Others are about social events and large-scale systems. All of these play important roles in information technology.

1.1 WHAT IS COMMUNICATION?

Communication is the transfer of information.

Information occurs in many ways. It can take the form of postal letters, email, bank statements, parking tickets, voice, music, pictures, or moving video. It is easy to name many more. The medium that carries information can be electromagnetic waves, electricity through wires, writing on paper, or smoke signals in the air. The delay of transfer can vary. A short letter can be delivered in tenths of a second in an online chat room, seconds or minutes by email, a day or two by postal priority mail, and perhaps months by low-priority mail. The quality of transfer can also vary. Voice and music can be compact disk quality, ordinary radio quality, telephone quality, or something worse, like a military or taxi radio. The form of information, and the medium, delay, and quality of its transmission, are all things that engineers must deal with and quantify.

Sometimes information appears in forms that are hard to quantify. A letter is just words, but a voice speaking the same words can be happy, sad, threatening, and so on, and carry more information. Information of this sort can be abstract and almost impossible to describe in words; examples are the feelings and impressions that are present in music or art. Philosophers and mathematicians alike have posed the question, "What is information?" We can give at least three answers. Information can be data, in the sense of a bank statement, a computer file, or a telephone number. Data in the narrowest sense can be just a string of binary symbols. Information can also be meaning. The meaning of the bank statement might be that your account is overdrawn and will now be closed, and the meaning of the symbols in the computer file may be that you have won the lottery and should quit your job. Another idea was proposed by the mathematicians Hartley and Shannon in the middle of the last century. They said that information was the degree to which uncertainty is reduced by knowing the symbols, and they gave a formula to measure it. Shannon's ideas lie at the heart of the modern engineering idea of information, and we will look at them in detail in Chapter 5. Until then, we will measure information of the symbolic kind by simply counting the symbols.

Many forms of information, such as text and data, are inherently symbolic. Others, such as voice and video, are originally analog waveforms, and can be transmitted or stored in this form or converted to a symbolic form. There are many engineering reasons to do this, and in theory there is no loss in doing so. We will call this process source conversion. It is often called by more technical names, such as source coding, data compression, change of format, or analog-to-digital (A to D) conversion; all of these amount to nothing more than a change in form. There are laws that govern it, and state, for example, how many symbols are needed for voice or video. We will look at that in Chapters 2 and 5. Sometimes information is converted from one symbolic form to another. Emails, for example, are converted from the text symbols A, B, C, ... to the binary symbols 1 and 0. When information is converted to a simple, standard form such as these binary symbols, both the public and engineers alike say that it is in digital form.
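
As a small illustration of this kind of text-to-binary source conversion, here is a minimal Python sketch (ours, not the book's); the 8-bits-per-character ASCII mapping and the function names are our illustrative choices.

```python
# Minimal sketch (not from the book): converting text symbols to binary symbols.
# Each ASCII character is mapped to 8 bits, so "hi" becomes 16 binary symbols.

def text_to_bits(message: str) -> str:
    """Return the message as a string of '0'/'1' symbols, 8 bits per character."""
    return "".join(format(byte, "08b") for byte in message.encode("ascii"))

def bits_to_text(bits: str) -> str:
    """Invert text_to_bits: group the symbols into bytes and decode them."""
    chars = [int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)]
    return bytes(chars).decode("ascii")

bits = text_to_bits("hi")
print(bits)                # 0110100001101001
print(bits_to_text(bits))  # hi
```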

A conversion to a form intended for transmission or storage is called modulation. In this process, analog or digital forms are converted to electromagnetic waves as in radio, magnetized regions as in a hard disk, or pits in a film as in a compact disk (CD), to name a few. Within radio and wire modulation, many methods exist. These include modulating the amplitude, frequency, or phase of a radio or electrical waveform. As an example of the proper use of language, we might say that "voice was converted to digital form and the resulting bits are phase modulated for transmission." Waveform information such as voice and video may be modulated directly in its analog form or converted to digital form. We will take a close look at modulation in Chapter 4.
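
To make "the resulting bits are phase modulated" concrete, here is a minimal sketch of binary phase modulation (BPSK), where each bit selects a carrier phase of 0 or 180 degrees; the carrier frequency, bit duration, and sample rate are arbitrary illustrative values, not figures from the book.

```python
# Minimal sketch (ours, not the book's): binary phase-shift keying (BPSK).
# Each bit is sent as one carrier pulse whose phase is 0 (for a 1) or 180 degrees (for a 0).
import numpy as np

def bpsk_modulate(bits, carrier_hz=1000.0, bit_duration_s=0.01, sample_rate_hz=48000.0):
    """Return the BPSK waveform for a sequence of bits as a NumPy array of samples."""
    samples_per_bit = int(round(bit_duration_s * sample_rate_hz))
    t = np.arange(samples_per_bit) / sample_rate_hz   # sample times within one bit
    pulses = []
    for bit in bits:
        phase = 0.0 if bit == 1 else np.pi            # the information rides on the phase
        pulses.append(np.cos(2 * np.pi * carrier_hz * t + phase))
    return np.concatenate(pulses)

waveform = bpsk_modulate([1, 0, 1, 1])
print(waveform.shape)   # (1920,): 4 bits x 480 samples per bit
```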

The need for modulation seems obvious, but it is a good idea to review why it is needed. A good modulation method is one that fits the medium. For example, an efficient means of modulating magnetic domains on a disk may not be the best one on a CD or over a radio link. Radio channels are especially interesting because the physics of antennas and transmission depend very much on frequency. For all sorts of subtle reasons, microwaves are required in space; 20 MHz is suitable for transcontinental transmission during the daytime, 6 MHz is suitable at night, and 20 kHz is needed for submarines at any time. Another reason for modulation is to keep signals from interfering; if two signals are modulated to different radio frequencies, they will not disturb each other.

An interesting philosophical point about all of this is that modulation is inherently an analog process. The last circuit in radio transmitters and disk drives is always an analog one. Symbols do not have length or weight or field intensity, but magnetic domains and electromagnetic waves do. We need to measure this physical reality, and buy and sell and control it.

1.1.1 Noise

A proverb says "Death and taxes are always with us." The equivalent of these in communication is noise. Except at absolute zero (not a very interesting temperature), noise is always with us.

Noise takes many forms. For two people having a conversation, traffic noise may be the problem. For people sending smoke signals, it might be fog. A definition of noise might be any unrelated signal in the same time or place or frequency band. Because most communication is, one way or another, electrical, it is electrical noise that is most interesting for us, and within this, thermal, or "white" noise. This noise stems from the motions of molecules, something that is present in all materials and at all temperatures above absolute zero. It is true that there is a cosmic background noise in the universe, but most often the dominant source of white noise is the receiver itself, in the front part where the signal is first applied. Sometimes it is this unavoidable noise level that limits communication. Other times a nonthermal source dominates. Everyday examples are interference from switched currents in neighboring wires, electrical activity in the radio medium, for example, the ionosphere, and other signals in the same channel. It is also possible that a signal can interfere with itself in such a way that versions of the signal are hard to separate and act like noise to each other. In video transmission the interfering version is called a ghost; in audio, it is an echo. If there are several interferers and they lie close in time, the result is a smear in video and a garble in audio.

There are several general rules about noise. First, signals decay as they propagate, so that sooner or later a fixed noise level, however small, becomes significant. Second, only the ratio of signal energy E to noise energy N matters, not the absolute value of either. The important ratio is called the signal-to-noise ratio, abbreviated SNR. In communication, this ratio E/N is expressed in decibels (abbreviated dB), which by definition is computed as 10 log₁₀(E/N). We can observe this rule in everyday life. If two people are talking in a room and a party starts up, they must now talk louder; the ratio of their speech to the background noise must stay the same. Almost always, the same rule applies to electrical communication. The implications are that if signals are weak, as they are from outer space, the receiver must be very good, with low noise; conversely, if noise is high, it may be overcome, but only by large signal power.
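
As a quick numerical check of the decibel definition above (our example, not the book's), the sketch below converts energy ratios to dB and back, and shows why only the ratio matters.

```python
# Minimal sketch (ours): the decibel definition SNR_dB = 10 * log10(E/N).
import math

def snr_db(signal_energy, noise_energy):
    return 10 * math.log10(signal_energy / noise_energy)

def snr_ratio(snr_in_db):
    return 10 ** (snr_in_db / 10)

print(snr_db(1000, 1))   # 30.0  -> a factor of 1000 is 30 dB
print(snr_ratio(20))     # 100.0 -> 20 dB is a factor of 100
# Doubling both signal and noise leaves the ratio, and hence the dB value, unchanged:
print(snr_db(2000, 2))   # 30.0
```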

1.1.2 Measuring Communication

With symbols, we can simply count them. The situation is actually more subtle, because some symbols can contribute more to the message than others. But more on this must await Chapter 5. Here we want to gain an overall impression of how to measure analog signals, both analog sources of information and the analog signals that transmit digital information.

Important measures of a communication system are its energy, bandwidth, processor requirement, distortion, and delay. In the real world, these all cost money. In general, they all trade off against each other; that is, reducing the cost of one increases the cost of the others. Distortion, for example, can be reduced by consuming more energy or bandwidth, or by more expensive processing or longer transmission delay. There are many other tradeoffs and some are less intuitive; for example, choosing a transmission method with wider bandwidth can lead to a lower signal energy requirement or a lower distortion.

We can look at energy and bandwidth in a little more detail now, saving the full treatment for Chapters 2 and 4. For the moment, think of processing, distortion, and delay as fixed at some convenient level. Energy is a familiar quantity. With digital transmission, it is convenient to think of energy per bit, denoted E_b (in joules). If T seconds are devoted to the transmission of each bit, then the power is E_b/T watts. With analog waveforms, the power of the waveform is easy to think about. Bandwidth is more difficult. To start, we should point out that there are two meanings of the word. Often in engineering and almost always in the general public, bandwidth refers to the bit rate of a service in bits/second. For example, one might say a video system has higher bandwidth than a voice system, meaning that it requires more bits/s. To communication engineers this is an abuse of notation; bandwidth is the width of frequencies that is required to send the signal. The two meanings are related, but in a complicated way that varies strongly with the modulation method and the quality desired. The simplest digital modulation methods take, very roughly, 1 Hz of bandwidth for each bit per second transmitted. As an example, standard low-tech digitized telephone speech runs at 64 kbits/s, and this therefore would need roughly 64 kHz of bandwidth. As a second example, a short email might convert to 1000 bits, and to send it in 20 s (certainly competitive with the post office!) would imply 50 bits/s for 20 s, and a bandwidth consumption of 50 Hz. The bandwidth of a brief letter is thus very much smaller than sending the same spoken words. As the saying goes, a picture is worth a thousand words. In fact, the engineering reality is that a picture costs the bandwidth of a thousand spoken words, but more like 100,000 written words.
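
The arithmetic in these examples is easy to reproduce. The sketch below (ours) applies the rough 1 Hz per bit/s rule stated above to the email and telephone-speech figures; the energy value in the last line is a made-up illustrative number, not one from the text.

```python
# Minimal sketch (ours): the rough relations used above.
# Power = E_b / T, bit rate = bits / time, and bandwidth ~ 1 Hz per bit/s for simple modulation.

def power_watts(energy_per_bit_joules, seconds_per_bit):
    return energy_per_bit_joules / seconds_per_bit

def rough_bandwidth_hz(total_bits, total_seconds):
    bit_rate_bps = total_bits / total_seconds
    return bit_rate_bps * 1.0              # roughly 1 Hz per bit/s

print(rough_bandwidth_hz(1000, 20))        # 50.0 Hz for the 1000-bit email sent in 20 s
print(rough_bandwidth_hz(64_000, 1))       # 64000.0 Hz for 64 kbit/s telephone speech
print(power_watts(1e-6, 20 / 1000))        # 5e-05 W if each bit carried 1 microjoule (made-up figure)
```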

Communication engineers compute bandwidth by the Fourier transform, which is described in Chapter 2. A view that gives some simple intuition is the following water pipe analogy. Transmitting information is like transmitting water. Signal energy E corresponds roughly to the initial pressure of the water. Signal bandwidth W is the size of the pipe. Signal noise N is the friction of the pipe. The total water carried per second is the product of pipe size and pressure, reduced by the effect of friction. Actually, information transfer obeys a relation more like (bandwidth) x log (signal-to-noise ratio), but the water analogy gives the right feel.
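
The "(bandwidth) x log (signal-to-noise ratio)" relation is taken up properly in Chapter 5; as a preview, this sketch uses the standard capacity form W log2(1 + S/N) (our choice of formula, not yet introduced in the text) to show how bandwidth and SNR trade off.

```python
# Minimal sketch (ours): a capacity-style relation C = W * log2(1 + S/N),
# the precise version of "(bandwidth) x log (signal-to-noise ratio)".
import math

def capacity_bits_per_s(bandwidth_hz, snr_linear):
    return bandwidth_hz * math.log2(1 + snr_linear)

# Doubling the bandwidth doubles the rate; doubling the SNR helps only logarithmically.
print(capacity_bits_per_s(3500, 1000))   # ~34.9 kbit/s: telephone-like bandwidth at 30 dB SNR
print(capacity_bits_per_s(7000, 1000))   # ~69.8 kbit/s: twice the bandwidth
print(capacity_bits_per_s(3500, 2000))   # ~38.4 kbit/s: twice the SNR
```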

It can be seen from the email example above that there are tremendous variations in the parameters of different information sources. Before going further, it would be useful to give rough measures for some of them. These will serve temporarily before we study sources in more detail in Chapter 3.

Messages

These include short emails, paging calls, and orders for a taxi service, for example. Length is perhaps 1000-10,000 bits (200-2000 ASCII characters). Delay of several seconds to several minutes is acceptable. Error rate should be very low, less than 10⁻⁹. Transmission speed can be as low as 100s of bits/s.

Telephone Speech

As an analog signal, speech has a bandwidth of about 3500 Hz and requires an SNR of 30-40 dB (this is the ratio of signal power to noise power; 30 dB is a factor of 1000). Transmission must be nearly real-time. As a digital signal converted from analog, speech has a bit rate of 64 kbits/s if converted in a simple way, and perhaps 5 kbits/s if converted in a complex way. Bit error rate needs to be in the range 10⁻² to 10⁻⁵.

CD-Quality Music

As a high-quality analog signal, music has a bandwidth of about 22 kHz and needs an SNR of about 90 dB. Delays of up to several seconds can be tolerated in reproduction. As a digital signal converted from analog, standard stereo CD music requires 1.41 Mbits/s. Error rate must be low, less than 10⁻⁹.

Television

As an analog signal, ordinary television has a bandwidth of about 4 MHz and requires an SNR of 20-30 dB. As a digital signal converted from analog, video requires about 40 Mbits/s if converted in a simple way, and perhaps 0.5-2 Mbits/s if converted in a complex way. Delays of up to several seconds can be tolerated in broadcasting and much more in tape playback. An error rate of 10⁻⁵ is tolerable. High-definition television (HDTV) requires 5-10 times the bandwidth and bit rate.
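
The rough digital bit rates just quoted are easier to compare side by side. The sketch below simply tabulates them (values copied from the text) and computes the storage one minute of each source would need.

```python
# Minimal sketch (ours): the rough digital bit rates quoted above, and the storage
# one minute of each source needs (8 bits per byte).
SOURCES_BITS_PER_S = {
    "telephone speech (simple conversion)": 64_000,
    "telephone speech (complex conversion)": 5_000,
    "stereo CD music": 1_410_000,
    "television (simple conversion)": 40_000_000,
}

for name, rate in SOURCES_BITS_PER_S.items():
    megabytes_per_minute = rate * 60 / 8 / 1e6
    print(f"{name}: {rate} bit/s, about {megabytes_per_minute:.1f} MB per minute")
```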

1.2 WHY DIGITAL COMMUNICATION?

The last 30 years has seen a conversion of the entire communication business from analog to digital transmission, that is, from waveforms to symbols. This revolution has been so complete that most of our study in this book is devoted to digital transmission. Why is this?

(Continues...)



Excerpted from Understanding Information Transmission by John B. Anderson and Rolf Johannesson. Copyright © 2005 by Institute of Electrical and Electronics Engineers, Inc. Excerpted by permission.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.

Table of Contents

Preface vii

1. Introduction: First Ideas and Some History 1

1.1 What is communication? 2

1.2 Why digital communication? 6

1.3 Some history 8

1.4 A few remarks on intellectual history 27

1.5 Conclusions 28

References 29

2. Mathematical Methods of Information Transmission: Why Sinusoids? 30

2.1 Linear, time-invariant (LTI) systems 31

2.2 On the importance of being sinusoidal 43

2.3 The Fourier transform 48

2.4 What is bandwidth? 58

2.5 Discrete-time systems 66

2.6 Conclusions 69

References 70

Problems 70

3. Information Sources: What is Out There to be Sent? 77

3.1 What is text? 78

3.2 What is speech? 81

3.3 What is music? 88

3.4 What is an image? 94

3.5 What is video? 98

3.6 Conclusion 102

References 102

Problems 103

4. Transmission Methods: How is Information Sent? 105

4.1 Communication channels 105

4.2 Analog modulation 117

4.3 Digital modulation 127

4.4 FM stereo, television and a little about electronics 139

4.5 Conclusions 146

References 147

Problems 147

5. Information Theory and Coding: What did Shannon Promise? 150

5.1 Information theory—a primer 152

5.2 Methods of source coding 179

5.3 Methods of channel coding 189

5.4 Trellis coded modulation 199

5.5 Conclusions 205

References 206

Problems 207

6. Cryptology 211

6.1 Fundamentals of cryptosystems 211

6.2 Caesar and Vigenere ciphers 215

6.3 The Vernam cipher and perfect secrecy 219

6.4 Stream ciphers 220

6.5 Block ciphers 223

6.6 Cryptomachines during World War II 224

6.7 Two-key cryptography 228

6.8 Conclusions 239

References 239

Problems 239

7. Communication Networks: Let's Get Connected 241

7.1 An overview of information networks 241

7.2 Circuit switching: The telephone net 252

7.3 Mobile telephony 260

7.4 The Internet 265

References 275

Appendix A: Complex Numbers 276

Appendix B: Sinusoids and Circuit Theory 281

Appendix C: Probability Theory: A Primer 297

Index 306
