Elements of Information Theory / Edition 2

Hardcover (Print)

  • Rent from BN.com: $36.45 (Save 70%). Est. Return Date: 06/20/2014
  • Buy New from BN.com: $87.15
  • Buy Used from BN.com: $70.58 (Save 41%). Condition: Used – Good. Item is in good condition, but packaging may have signs of shelf wear/aging or torn packaging.
  • Used and New from Other Sellers: from $62.39 (Save 48%). Usually ships in 1-2 business days.

Other sellers (Hardcover)

  • All (17) from $62.39
  • New (11) from $87.14
  • Used (6) from $62.39

Overview

Elements of Information Theory, Second Edition, by Thomas M. Cover and Joy A. Thomas, is the classic graduate-level introduction to information theory. The book develops the basic quantities of entropy, relative entropy, and mutual information and shows how they arise as natural answers to questions of data compression, channel capacity, rate distortion, hypothesis testing, network information theory, gambling, and portfolio theory. It is accessible to students of communication theory, computer science, and statistics.

Editorial Reviews

Booknews
The authors introduce the basic quantities of entropy, relative entropy, and mutual information and show how they arise as natural answers to questions of data compression, channel capacity, rate distortion, hypothesis testing, information flow in networks, and gambling. Accessible to students of communication theory, computer science, and statistics. Annotation © Book News, Inc., Portland, OR (booknews.com)
From the Publisher
"As expected, the quality of exposition continues to be a high point of the book. Clear explanations, nice graphical illustrations, and illuminating mathematical derivations make the book particularly useful as a textbook on information theory." (Journal of the American Statistical Association, March 2008)

"This book is recommended reading, both as a textbook and as a reference." (Computing Reviews.com, December 28, 2006)


Product Details

  • ISBN-13: 9780471241959
  • Publisher: John Wiley & Sons, Incorporated
  • Publication date: 7/18/2006
  • Series: Wiley Series in Telecommunications and Signal Processing, #40
  • Edition description: Revised Edition
  • Edition number: 2
  • Pages: 776
  • Sales rank: 728,177
  • Product dimensions: 6.16 (w) x 9.49 (h) x 1.70 (d) inches

Meet the Author

Thomas M. Cover is Professor jointly in the Departments of Electrical Engineering and Statistics at Stanford University. He is a past President of the IEEE Information Theory Society and a Fellow of the Institute of Mathematical Statistics and of the IEEE. In 1972 he received the Outstanding Paper Award in Information Theory for his paper "Broadcast Channels," and in 1990 he was selected as the Shannon Lecturer, regarded as the highest honor in information theory. The author of over 90 technical papers, he is coeditor of the book Open Problems in Communication and Computation. Professor Cover has devoted the last twenty years to developing the relationship between information theory and statistics. He received his PhD in electrical engineering from Stanford University.

Joy A. Thomas works at the IBM T. J. Watson Research Center in Hawthorne, New York. He received the IEEE Charles LeGeyt Fortescue Fellowship for 1984-85 and the IBM Graduate Fellowship for 1987-90. He received his BTech in electrical engineering from the Indian Institute of Technology, Madras, India, and his PhD in electrical engineering from Stanford University.


Read an Excerpt

Chapter 1: Introduction and Preview

This "first and last lecture" chapter goes backwards and forwards through information theory and its naturally related ideas. The full definitions and study of the subject begin in Chapter 2.

Information theory answers two fundamental questions in communication theory: what is the ultimate data compression (answer: the entropy H), and what is the ultimate transmission rate of communication (answer: the channel capacity C). For this reason some consider information theory to be a subset of communication theory. We will argue that it is much more. Indeed, it has fundamental contributions to make in statistical physics (thermodynamics), computer science (Kolmogorov complexity or algorithmic complexity), statistical inference (Occam's Razor: "The simplest explanation is best") and to probability and statistics (error rates for optimal hypothesis testing and estimation).
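
To make the first of these quantities concrete, here is a minimal Python sketch (ours, not the book's; the function name and example distributions are illustrative) that computes the entropy H of a discrete distribution:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A fair coin carries one bit per flip; a biased coin carries less,
# so its outcomes can on average be compressed below one bit per symbol.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```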

Figure 1.1 illustrates the relationship of information theory to other fields. As the figure suggests, information theory intersects physics (statistical mechanics), mathematics (probability theory), electrical engineering (communication theory) and computer science (algorithmic complexity). We now describe the areas of intersection in greater detail:

Electrical Engineering (Communication Theory). In the early 1940s, it was thought that increasing the transmission rate of information over a communication channel increased the probability of error. Shannon surprised the communication theory community by proving that this was not true as long as the communication rate was below channel capacity. The capacity can be simply computed from the noise characteristics of the channel. Shannon further argued that random processes such as music and speech have an irreducible complexity below which the signal cannot be compressed. This he named the entropy, in deference to the parallel use of this word in thermodynamics, and argued that if the entropy of the source is less than the capacity of the channel, then asymptotically error free communication can be achieved.
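
As a concrete instance of "capacity computed from the noise characteristics of the channel": the binary symmetric channel, which flips each transmitted bit with probability p, has capacity C = 1 - H(p), where H is the binary entropy function. The sketch below is an illustration added here, not part of the excerpt; the function names are our own.

```python
import math

def binary_entropy(p):
    """H(p) = -p log2(p) - (1-p) log2(1-p), the entropy of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity in bits per channel use of a binary symmetric channel
    with crossover probability p."""
    return 1.0 - binary_entropy(p)

# A noiseless channel (p = 0) carries one bit per use; at p = 0.5 the
# output is independent of the input and the capacity drops to zero.
for p in (0.0, 0.1, 0.5):
    print(p, bsc_capacity(p))
```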

Information theory today represents the extreme points of the set of all possible communication schemes, as shown in the fanciful Figure 1.2. The data compression minimum I(X; X̂) lies at one extreme of the set of communication ideas. All data compression schemes require description rates at least equal to this minimum. At the other extreme is the data transmission maximum I(X; Y), known as the channel capacity. Thus all modulation schemes and data compression schemes lie between these limits.
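
Both extremes are mutual informations, which can be computed directly from a joint distribution. The following sketch (our illustration; the helper name and example joint pmf are assumptions) computes I(X; Y) for a uniform input sent through the binary symmetric channel above:

```python
import math

def mutual_information(joint):
    """I(X; Y) in bits from a joint pmf given as a nested list joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

# X uniform on {0, 1}; Y equals X with probability 0.9 (a BSC with p = 0.1).
joint = [[0.45, 0.05],
         [0.05, 0.45]]
print(mutual_information(joint))  # about 0.531 bits, matching 1 - H(0.1)
```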

Information theory also suggests means of achieving these ultimate limits of communication. However, these theoretically optimal communication schemes, beautiful as they are, may turn out to be computationally impractical. It is only because of the computational feasibility of simple modulation and demodulation schemes that we use them rather than the random coding and nearest neighbor decoding rule suggested by Shannon's proof of the channel capacity theorem. Progress in integrated circuits and code design has enabled us to reap some of the gains suggested by Shannon's theory. A good example of an application of the ideas of information theory is the use of error correcting codes on compact discs.

Modern work on the communication aspects of information theory has concentrated on network information theory: the theory of the simultaneous rates of communication from many senders to many receivers in a communication network. Some of the trade-offs of rates between senders and receivers are unexpected, and all have a certain mathematical simplicity. A unifying theory, however, remains to be found.

Computer Science (Kolmogorov Complexity). Kolmogorov, Chaitin and Solomonoff put forth the idea that the complexity of a string of data can be defined by the length of the shortest binary program for computing the string. Thus the complexity is the minimal description length. This definition of complexity turns out to be universal, that is, computer independent, and is of fundamental importance. Thus Kolmogorov complexity lays the foundation for the theory of descriptive complexity. Gratifyingly, the Kolmogorov complexity K is approximately equal to the Shannon entropy H if the sequence is drawn at random from a distribution that has entropy H. So the tie-in between information theory and Kolmogorov complexity is perfect. Indeed, we consider Kolmogorov complexity to be more fundamental than Shannon entropy. It is the ultimate data compression and leads to a logically consistent procedure for inference.
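
Kolmogorov complexity itself is uncomputable, but a general-purpose compressor gives a crude, computable upper bound on description length, which is enough to see the structured-versus-random contrast described above. The sketch below is our illustration, not the book's; zlib is only a stand-in for "shortest program."

```python
import os
import zlib

def description_length(s: bytes) -> int:
    """Length of zlib's compressed encoding of s: an upper bound (relative
    to the fixed decompressor) on s's descriptive complexity."""
    return len(zlib.compress(s, 9))

structured = b"01" * 5000         # highly regular: admits a very short description
random_bytes = os.urandom(10000)  # incompressible with high probability

print(description_length(structured))    # a few dozen bytes
print(description_length(random_bytes))  # close to the raw length of 10000
```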

There is a pleasing complementary relationship between algorithmic complexity and computational complexity. One can think about computational complexity (time complexity) and Kolmogorov complexity (program length or descriptive complexity) as two axes corresponding to program running time and program length. Kolmogorov complexity focuses on minimizing along the second axis, and computational complexity focuses on minimizing along the first axis. Little work has been done on the simultaneous minimization of the two...


Table of Contents

1 Introduction and preview 1
2 Entropy, relative entropy, and mutual information 13
3 Asymptotic equipartition property 57
4 Entropy rates of a stochastic process 71
5 Data compression 103
6 Gambling and data compression 159
7 Channel capacity 183
8 Differential entropy 243
9 Gaussian channel 261
10 Rate distortion theory 301
11 Information theory and statistics 347
12 Maximum entropy 409
13 Universal source coding 427
14 Kolmogorov complexity 463
15 Network information theory 509
16 Information theory and portfolio theory 613
17 Inequalities in information theory 657

Customer Reviews

Showing 1 customer review

  • Anonymous (posted February 4, 2007): "terrible"

    This book is horrible. Theories are given, but no examples are provided. Please do not make students' lives worse than they are now.