Entropy and Information Theory

Overview

This book is an updated version of the information theory classic, first published in 1990. About one-third of the book is devoted to Shannon source and channel coding theorems; the remainder addresses sources, channels, and codes, and information and distortion measures and their properties.

New in this edition:

  • Expanded treatment of stationary or sliding-block codes and their relations to traditional block codes
  • Expanded discussion of results from ergodic theory relevant to information theory
  • Expanded treatment of B-processes, processes formed by stationary coding of memoryless sources (see the sketch below)
  • New material on trading off information and distortion, including the Marton inequality
  • New material on the properties of optimal and asymptotically optimal source codes
  • New material on the relationships of source coding and rate-constrained simulation or modeling of random processes

Significant material not covered in other information theory texts includes stationary/sliding-block codes, a geometric view of information theory provided by process distance measures, and general Shannon coding theorems for asymptotically mean stationary sources, which may be neither ergodic nor stationary, and d-bar continuous channels.
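To make the notions of entropy, sliding-block coding, and B-processes above concrete, here is a minimal numerical sketch. It is not taken from the book; the source probability, window length, and majority-vote rule are hypothetical choices for illustration. It computes the entropy of a memoryless binary source, applies a stationary sliding-block code to it so that the output is a B-process, and estimates the per-symbol k-block entropy of the coded output, which for a stationary process is non-increasing in k and converges to the entropy rate.

# Illustrative sketch (not from the book): entropy of a memoryless source
# and a hypothetical sliding-block code applied to it.
import numpy as np

def entropy(p):
    """Shannon entropy -sum p_i log2 p_i, in bits per symbol."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)

# Memoryless (i.i.d.) binary source with P(1) = 0.3.
p1 = 0.3
x = (rng.random(200_000) < p1).astype(int)
print("source entropy rate:", entropy([1 - p1, p1]), "bits/symbol")

# Sliding-block code with window length 3: the output at time n depends only
# on (x[n-1], x[n], x[n+1]); here, a majority vote.  Applying such a
# stationary code to an i.i.d. source yields a B-process.
y = ((x[:-2] + x[1:-1] + x[2:]) >= 2).astype(int)

def block_entropy_rate(seq, k):
    """Empirical k-block entropy per symbol of a binary sequence."""
    blocks = np.lib.stride_tricks.sliding_window_view(seq, k)
    codes = blocks @ (2 ** np.arange(k))   # integer label for each k-block
    counts = np.bincount(codes)
    return entropy(counts / counts.sum()) / k

# For a stationary process these values are non-increasing in k and
# converge to the entropy rate of the coded process.
for k in (1, 4, 8):
    print(f"H_{k}/{k} of coded process:", round(block_entropy_rate(y, k), 4))

Because the encoder is a fixed, time-invariant map of a window of the input, stationarity and ergodicity of the source carry over to the coded output; this is one reason sliding-block codes are attractive relative to traditional block codes.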


Editorial Reviews

From the Publisher
From the reviews of the second edition:
“In Entropy and Information Theory Robert Gray offers an excellent text to stimulate research in this field. … Entropy and Information Theory is highly recommended as essential reading to academics and researchers in the field, especially to engineers interested in the mathematical aspects and mathematicians interested in the engineering applications. … it will contribute to further synergy between the two fields and the deepening of research efforts.” (Ina Fourie, Online Information Review, Vol. 36 (3), 2012)
“The book offers interesting and very important information about the theory of probabilistic information measures and their application to coding theorems for information sources and noisy channels. The main goal is a general development of Shannon’s mathematical theory of communication for single-user systems. … The author manages to balance the practice with the theory, every chapter is very well structured and has high-value content.” (Nicolae Constantinescu, Zentralblatt MATH, Vol. 1216, 2011)
Booknews
On the theory of probabilistic information measures and their application to coding theorems for general information sources, noisy channels, and block and sliding block codes. The goal is a general development of Shannon's mathematical theory of communication, but much of the book is devoted to the tools and methods required to prove the Shannon coding theorems. Annotation © Book News, Inc., Portland, OR (booknews.com)

Product Details

  • ISBN-13: 9781441979698
  • Publisher: Springer US
  • Publication date: 2/3/2011
  • Edition description: 2nd ed. 2011
  • Edition number: 2
  • Pages: 410
  • Product dimensions: 6.40 in (w) x 9.30 in (h) x 1.30 in (d)

Meet the Author

Robert M. Gray is the Alcatel-Lucent Technologies Professor of Communications and Networking in the School of Engineering and Professor of Electrical Engineering at Stanford University. For over four decades he has done research, taught, and published in the areas of information theory and statistical signal processing. He is a Fellow of the IEEE and the Institute of Mathematical Statistics. He has won several professional awards, including a Guggenheim Fellowship, the Society Award and Education Award of the IEEE Signal Processing Society, the Claude E. Shannon Award from the IEEE Information Theory Society, the Jack S. Kilby Signal Processing Medal, Centennial Medal, and Third Millennium Medal from the IEEE, and a Presidential Award for Excellence in Science, Mathematics and Engineering Mentoring (PAESMEM). He is a member of the National Academy of Engineering.


Table of Contents

  • Preface
  • Introduction
  • Information Sources
  • Pair Processes: Channels, Codes, and Couplings
  • Entropy
  • The Entropy Ergodic Theorem
  • Distortion and Approximation
  • Distortion and Entropy
  • Relative Entropy
  • Information Rates
  • Distortion vs. Rate
  • Relative Entropy Rates
  • Ergodic Theorems for Densities
  • Source Coding Theorems
  • Coding for Noisy Channels
  • Bibliography
  • References
  • Index

