The Mathematics of Coding Theory / Edition 1

Paperback (Print)
Overview

This book provides a very accessible introduction to an important contemporary application of number theory, abstract algebra, and probability. It contains numerous computational examples throughout, giving learners the opportunity to apply, practice, and check their understanding of key concepts. KEY TOPICS: Coverage starts from scratch in treating probability, entropy, compression, Shannon's theorems, cyclic redundancy checks, and error correction. For enthusiasts of abstract algebra and number theory.


Product Details

  • ISBN-13: 9780131019676
  • Publisher: Pearson
  • Publication date: 11/24/2003
  • Edition description: New Edition
  • Edition number: 1
  • Pages: 408
  • Product dimensions: 7.04 (w) x 9.40 (h) x 0.85 (d) inches

Table of Contents

1. Probability.

Counting. Preliminary Ideas of Probability. More Formal View of Probability. Random Variables, Expected Values, Variance. Markov Inequality, Chebycheff Inequality. Law of Large Numbers.

2. Information and Entropy.

Uncertainty, Acquisition of Information. Entropy. Uniquely-Decipherable and Prefix Codes. Kraft and Macmillan Inequalities.

3. Noiseless Coding.

Noiseless Coding Theorem. Huffman Coding.

4. Noisy Coding.

Noisy Channels. Example: Parity Checks. Decoding from a Noisy Channel. Channel Capacity. Noisy Coding Theorem.

5. Cyclic Redundancy Checks.

The Finite Field GF(2) with 2 Elements. Polynomials over GF(2). Cyclic Redundancy Checks (CRC's). What Errors Does a CRC Catch?

6. The Integers.

Reduction Algorithm. Divisibility. Factorization into Primes. Euclidean Algorithm. Integers Modulo M. The Finite Field Z/P for P Prime. Fermat's Little Theorem. Primitive Roots. Euler's Criterion. Fast Modular Exponentiation.

7. Finite Fields.

Making Fields. Examples of Field Extensions. Addition Modulo P. Multiplication Modulo P. Multiplicative Inverses Modulo P. Primitive Roots.

8. Polynomials.

Polynomials with Coefficients in a Field. Divisibility. Factoring and Irreducibility. Euclidean Algorithm. Unique Factorization.

9. Introduction to Linear Codes.

An Ugly Example. The Hamming Binary [7,4] Code. Some Linear Algebra. A Review of Row Reduction. Definition: Linear Codes. Syndrome Decoding. Berlekamp's Algorithm.

10. Bounds for Codes.

Hamming (Sphere-Packing) Bound. Gilbert-Varshamov Bound. Singleton Bound.

11. Cyclic Codes.

Minimum Distance in Linear Codes. Cyclic Codes.

12. Primitive Roots.

Characteristics of Fields. Multiple Factors in Polynomials. Cyclotomic Polynomials. Primitive Roots in Finite Fields. Primitive Roots Modulo Prime Powers. Counting Primitive Roots. Non-Existence. An Algorithm to Find Primitive Roots.

13. Primitive Polynomials.

Definitions. Examples Modulo 2. Testing for Primitivity. Example: Periods of LFSR's. Example: Two-Bit Errors Detected by CRC's.

14. Basic Linear Codes.

Vandermonde Determinants. More Check Matrices for Cyclic Codes. RS Codes. Hamming Codes (Again). BCH Codes. Decoding BCH Codes.

15. Concatenated Codes.

Mirage Codes. Concatenated Codes. Justesen Codes. Some Explicit Irreducible Polynomials.

16. Curves and Codes.

Plane Curves. Singularities of Curves. Projective Plane Curves. Curves in Higher Dimensions. Genus, Divisors, Linear Systems. Geometric Goppa Codes. Tsfasman-Vladut-Zink Bound.

Appendices.

Sets and functions. Equivalence Relations. Stirling's Formula.

Bibliography.

Index.


Preface

This book is intended to be accessible to undergraduate students with two years of typical mathematics experience, most likely meaning calculus with a little linear algebra and differential equations. Thus, specifically, there is no assumption of a background in abstract algebra or number theory, nor of probability, nor of linear algebra. All these things are introduced and developed to a degree sufficient to address the issues at hand.

We will address the fundamental problem of transmitting information effectively and accurately. The specific mode of transmission does not really play a role in our discussion. On the other hand, we should mention that the importance of the issues of efficiency and accuracy has increased largely due to the advent of the internet and, even more so, due to the rapid development of wireless communications. For this reason it makes sense to think of networked computers or wireless devices as the archetypal practical examples.

The underlying concepts of information and information content of data make sense independently of computers, and are relevant in looking at the operation of natural languages such as English, and of other modes of operation by which people acquire and process data.

The issue of efficiency is the obvious one: transmitting information costs time, money, and bandwidth. It is important to use as little as possible of each of these resources. Data compression is one way to pursue this efficiency. Some well-known examples of compression schemes are commonly used for graphics: GIFs, JPEGs, and more recently PNGs. These clever file-format schemes are enormously more efficient in file size than straightforward bitmap descriptions of graphics. There are also general-purpose compression schemes, such as gzip, bzip2, and ZIP.
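
As a minimal illustration in Python (using the standard zlib module, which implements DEFLATE, the algorithm underlying gzip and ZIP), compressing a highly redundant string shows how much smaller the encoded form can be:

    import zlib

    # A highly redundant message compresses well; random-looking data would not.
    text = ("the quick brown fox jumps over the lazy dog " * 50).encode("utf-8")

    compressed = zlib.compress(text, 9)      # DEFLATE, the algorithm behind gzip and ZIP
    restored = zlib.decompress(compressed)

    print(len(text), "bytes raw,", len(compressed), "bytes compressed")
    assert restored == text                  # lossless: the original is recovered exactly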

The issue of accuracy is addressed by detection and correction of errors that occur during transmission or storage of data. The single most important practical example is the TCP/IP protocol, widely used on the internet: one basic aspect of this is that if any of the packets composing a message is discovered to be mangled or lost, the packet is simply retransmitted. The detection of lost packets is based on numbering the collection making up a given message. The detection of mangled packets is by use of 16-bit checksums in the headers of IP and TCP packets. We will not worry about the technical details of TCP/IP here, but only note that email and many other types of internet traffic depend upon this protocol, which makes essential use of rudimentary error-detection devices.
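
A simplified sketch of that checksum, ignoring protocol details such as the TCP pseudo-header: the 16-bit value is essentially the ones' complement of a ones'-complement sum of the data viewed as 16-bit words.

    def internet_checksum(data):
        """Ones'-complement sum of 16-bit words, in the spirit of RFC 1071 (simplified)."""
        if len(data) % 2:                             # pad odd-length data with a zero byte
            data += b"\x00"
        total = 0
        for i in range(0, len(data), 2):
            total += (data[i] << 8) | data[i + 1]     # next 16-bit word, big-endian
            total = (total & 0xFFFF) + (total >> 16)  # fold any carry back into 16 bits
        return ~total & 0xFFFF                        # final ones' complement

    packet = b"an example payload"
    original = internet_checksum(packet)
    corrupted = bytes([packet[0] ^ 0x01]) + packet[1:]    # flip a single bit
    assert internet_checksum(corrupted) != original        # the change is detected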

And it is a fact of life that dust settles on CD-ROMs, static permeates network lines, etc. That is, there is noise in all communication systems. Human natural languages have evolved to include sufficient redundancy so that usually much less than 100% of a message need be received to be properly understood. Such redundancy must be designed into CD-ROM and other data storage protocols to achieve similar robustness.
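
As a toy sketch of designed-in redundancy (real storage formats use far more efficient codes, such as the Reed-Solomon codes treated later in the book), a repetition code sends each bit several times and decodes by majority vote:

    import random

    def encode_repetition(bits, n=3):
        """Repeat each bit n times: the crudest way to design in redundancy."""
        return [b for bit in bits for b in [bit] * n]

    def decode_repetition(received, n=3):
        """Majority vote within each block of n copies."""
        return [int(sum(received[i:i + n]) > n // 2)
                for i in range(0, len(received), n)]

    message = [1, 0, 1, 1, 0, 0, 1, 0]
    sent = encode_repetition(message)
    noisy = [bit ^ (random.random() < 0.05) for bit in sent]  # flip each bit with prob. 0.05
    print(decode_repetition(noisy))   # usually equal to message, despite the noise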

There are other uses for detection of changes in data: if the data in question is the operating system of your computer, a change not initiated by you is probably a sign of something bad, either failure in hardware or software, or intrusion by hostile agents (whether software or wetware). Therefore, an important component of systems security is implementation of a suitable procedure to detect alterations in critical files.
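
A rough sketch of such a procedure, in the spirit of file-integrity checkers (the file paths are purely illustrative): record cryptographic digests of critical files on a known-good system, then re-hash and compare later.

    import hashlib

    def file_digest(path):
        """SHA-256 digest of a file's contents, read in chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    # Record baseline digests once, on a known-good system (paths are illustrative) ...
    baseline = {p: file_digest(p) for p in ["/bin/ls", "/bin/sh"]}
    # ... and later re-hash and compare: a changed digest flags an altered file.
    altered = [p for p, d in baseline.items() if file_digest(p) != d]
    print(altered if altered else "no changes detected")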

In pre-internet times, various schemes were used to reduce the bulk of communication without losing the content: this influenced the design of the telegraphic alphabet, traffic lights, shorthand, etc. With the advent of the telephone and radio, these matters became even more significant. Communication with exploratory spacecraft having very limited resources available in deep space is a dramatic example of how the need for efficient and accurate transmission of information has increased in our recent history.

In this course we will begin with the model of communication and information made explicit by Claude Shannon in the 1940s, after some preliminary forays by Hartley and others in the preceding decades.

Many things are omitted due to lack of space and time. In spite of their tremendous importance, we do not mention convolutional codes at all, partly because less is known about them mathematically. Concatenated codes are mentioned only briefly. Finally, we also omit any discussion of the so-called turbo codes, which have recently been developed experimentally. Their remarkably good behavior, seemingly approaching the Shannon bound, has led to the conjecture that they are explicit solutions to the fifty-year-old existence results of Shannon. However, at this time there is insufficient understanding of the reasons for their good behavior, so we will not attempt to study them here. We do give a very brief introduction to geometric Goppa codes, attached to algebraic curves, which are a natural generalization of the Reed-Solomon codes we discuss, and which exceed the Gilbert-Varshamov lower bound on performance.

The exercises at the ends of the chapters are mostly routine, with a few more difficult exercises indicated by single or double asterisks. Short answers are given at the end of the book for a good fraction of the exercises, indicated by '(ans.)' following the exercise.

I offer my sincere thanks to the reviewers of the notes that became this volume. They found many unfortunate errors and offered many good ideas about improvements to the text. While I did not choose to take absolutely all the advice given, I greatly appreciate the thought and energy these people put into their reviews: John Bowman, University of Alberta; Sergio Lopez, Ohio University; Navin Kashyap, University of California, San Diego; James Osterburg, University of Cincinnati; LeRoy Bearnson, Brigham Young University; David Grant, University of Colorado at Boulder; Jose Voloch, University of Texas.

Paul Garrett
