Information Theory: A Tutorial Introduction
by James V Stone

Paperback

$24.95 

Overview

Originally developed by Claude Shannon in the 1940s, information theory laid the foundations for the digital revolution, and is now an essential tool in telecommunications, genetics, linguistics, brain sciences, and deep space communication. In this richly illustrated book, accessible examples are used to introduce information theory in terms of everyday games like '20 questions' before more advanced topics are explored. Online MATLAB and Python computer programs provide hands-on experience of information theory in action, and PowerPoint slides support teaching. Written in an informal style, with a comprehensive glossary and tutorial appendices, this text is an ideal primer for novices who wish to learn the essential principles and applications of information theory.
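
The '20 questions' idea mentioned above is the book's entry point to Shannon's measure of information: each well-chosen yes/no question halves the set of remaining possibilities, so twenty questions can single out one answer among 2^20 = 1,048,576 equally likely candidates (the subject of Section 1.4). The sketch below illustrates this in Python; it is a minimal stand-alone example, not one of the book's companion programs, and the function names are invented here.

    # Minimal illustration of the '20 questions' view of information;
    # not taken from the book's companion MATLAB/Python code.
    import math

    def bits_needed(n_outcomes: int) -> float:
        """Entropy (in bits) of a uniform choice among n_outcomes:
        H = log2(n), the average number of ideal yes/no questions
        needed to identify one outcome."""
        return math.log2(n_outcomes)

    def twenty_questions(secret: int, low: int, high: int) -> int:
        """Locate `secret` in the range [low, high) by halving the
        interval with each yes/no question (binary search); return
        the number of questions asked."""
        questions = 0
        while high - low > 1:
            mid = (low + high) // 2
            questions += 1
            if secret >= mid:   # ask: "is the answer at least mid?"
                low = mid
            else:
                high = mid
        return questions

    n = 2 ** 20  # about a million equally likely answers
    print(f"entropy of one choice among {n:,}: {bits_needed(n):.0f} bits")
    print(f"questions needed to find 314159: {twenty_questions(314159, 0, n)}")

Run as written, this prints 20 bits and 20 questions: the number of binary digits needed to index the answers equals the number of optimal yes/no questions, which is exactly the correspondence Chapter 1 develops.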

Product Details

ISBN-13: 9780956372857
Publisher: Tutorial Introductions
Publication date: 02/01/2015
Series: Tutorial Introduction Book
Pages: 260
Product dimensions: 5.90(w) x 8.90(h) x 0.50(d) inches

About the Author

James V Stone is a Visiting Professor at the University of Sheffield, UK.

Table of Contents

Preface

1. What is Information?
1.1 Introduction
1.2 Information, Eyes and Evolution
1.3 Finding a Route, Bit By Bit
1.4 A Million Answers to Twenty Questions
1.5 Information, Bits and Binary Digits
1.6 Example 1: Telegraphy
1.7 Example 2: Binary Images
1.8 Example 3: Grey-Level Pictures
1.9 Summary

2. Entropy of Discrete Variables
2.1 Introduction
2.2 Ground Rules and Terminology
2.3 Shannon's Desiderata
2.4 Information, Surprise and Entropy
2.5 Evaluating Entropy
2.6 Properties of Entropy
2.7 Independent and Identically Distributed Values
2.8 Bits, Shannons, and Bans
2.9 Summary

3. The Source Coding Theorem
3.1 Introduction
3.2 Capacity of a Discrete Noiseless Channel
3.3 Shannon's Source Coding Theorem
3.4 Calculating Information Rates
3.5 Data Compression
3.6 The Entropy of English Text
3.7 Why the Theorem is True
3.8 Kolmogorov Complexity
3.9 Summary

4. The Noisy Channel Coding Theorem
4.1 Introduction
4.2 Joint Distributions
4.3 Mutual Information
4.4 Conditional Entropy
4.5 Noise and Cross-Talk
4.6 Noisy Pictures and Coding Efficiency
4.7 Error Correcting Codes
4.8 Capacity of a Noisy Channel
4.9 Shannon's Noisy Channel Coding Theorem
4.10 Why the Theorem is True
4.11 Summary

5. Entropy of Continuous Variables
5.1 Introduction
5.2 The Trouble With Entropy
5.3 Differential Entropy
5.4 Under-Estimating Entropy
5.5 Properties of Differential Entropy
5.6 Maximum Entropy Distributions
5.7 Making Sense of Differential Entropy
5.8 What is Half a Bit of Information?
5.9 Summary

6. Mutual Information: Continuous
6.1 Introduction
6.2 Joint Distributions
6.3 Conditional Distributions and Entropy
6.4 Mutual Information and Conditional Entropy
6.5 Mutual Information is Invariant
6.6 Kullback-Leibler Divergence
6.7 Summary

7. Channel Capacity: Continuous
7.1 Introduction
7.2 Channel Capacity
7.3 The Gaussian Channel
7.4 Error Rates of Noisy Channels
7.5 Using a Gaussian Channel
7.6 Mutual Information and Correlation
7.7 The Fixed Range Channel
7.8 Summary

8. Thermodynamic Entropy and Information
8.1 Introduction
8.2 Physics, Entropy and Disorder
8.3 Information and Thermodynamic Entropy
8.4 Ensembles, Macrostates and Microstates
8.5 Pricing Information: The Landauer Limit
8.6 The Second Law of Thermodynamics
8.7 Maxwell's Demon
8.8 Quantum Computation
8.9 Summary

9. Information As Nature's Currency
9.1 Introduction
9.2 Satellite TVs, MP3 and All That
9.3 Does Sex Accelerate Evolution?
9.4 The Human Genome: How Much Information?
9.5 Are Brains Good At Processing Information?
9.6 A Short History of Information Theory
9.7 Summary

Further Reading

Appendices
A. Glossary
B. Mathematical Symbols
C. Logarithms
D. Probability Density Functions
E. Averages From Distributions
F. The Rules of Probability
G. The Gaussian Distribution
H. Key Equations

References
Index