Iterative Error Correction: Turbo, Low-Density Parity-Check and Repeat-Accumulate Codes

by Sarah J. Johnson

Overview

Presents all of the key ideas needed to understand, design, implement and analyse iterative error correction schemes.

Product Details

ISBN-13: 9780521871488
Publisher: Cambridge University Press
Publication date: 12/31/2009
Pages: 356
Product dimensions: 6.90(w) x 9.80(h) x 0.90(d)

Table of Contents

Preface xi

Notation xv

Commonly used abbreviations xix

1 Channels, codes and capacity 1

1.1 Binary input memoryless channels 2

1.2 Entropy, mutual information and capacity 8

1.2.1 A measure of information 8

1.2.2 Entropy 10

1.2.3 Capacity 13

1.3 Codes and decoding 16

1.3.1 Decoding 18

1.3.2 Performance measures 24

1.4 Bibliographic notes 29

1.5 Exercises 30

2 Low-density parity-check codes 34

2.1 Introduction 34

2.2 Error correction using parity checks 34

2.2.1 Low-density parity-check codes 37

2.3 Encoding 40

2.3.1 (Almost) linear-time encoding for LDPC codes 44

2.3.2 Repeat-accumulate codes 48

2.4 Decoding 48

2.4.1 Message passing on the binary erasure channel 50

2.4.2 Bit-flipping decoding 54

2.4.3 Sum-product decoding 61

2.5 Bibliographic notes 71

2.6 Exercises 72

3 Low-density parity-check codes: properties and constructions 75

3.1 Introduction 75

3.2 LDPC properties 75

3.2.1 Choosing the degree distribution 76

3.2.2 Girth and expansion 82

3.2.3 Codeword and pseudo-codeword weights 84

3.2.4 Cyclic and quasi-cyclic codes 89

3.2.5 Implementation and complexity 95

3.3 LDPC constructions 97

3.3.1 Pseudo-random constructions 97

3.3.2 Structured LDPC codes 105

3.4 LDPC design rules 114

3.5 Bibliographic notes 117

3.6 Exercises 118

4 Convolutional codes 121

4.1 Introduction 121

4.2 Convolutional encoders 121

4.2.1 Memory order and impulse response 125

4.2.2 Recursive convolutional encoders 127

4.2.3 Puncturing 131

4.2.4 Convolutional code trellises 132

4.2.5 Minimum free distance 135

4.3 Decoding convolutional codes 136

4.3.1 Maximum a posteriori (BCJR) decoding 136

4.3.2 Log MAP decoding 148

4.3.3 Maximum likelihood (Viterbi) decoding 156

4.4 Bibliographic notes 160

4.5 Exercises 161

5 Turbo codes 165

5.1 Introduction 165

5.2 Turbo encoders 165

5.3 Iterative turbo decoding 169

5.4 Turbo code design 176

5.4.1 Interleaving gain 176

5.4.2 Choosing the component codes 178

5.4.3 Interleaver design 182

5.4.4 Design rules 188

5.5 Factor graphs and implementation 190

5.5.1 Implementation and complexity 191

5.5.2 Stopping criteria 194

5.6 Bibliographic notes 196

5.7 Exercises 198

6 Serial concatenation and RA codes 201

6.1 Serial concatenation 201

6.1.1 Serially concatenated turbo codes 203

6.1.2 Turbo decoding of SC codes 203

6.1.3 Code design 207

6.2 Repeat-accumulate codes 209

6.2.1 Encoding RA codes 211

6.2.2 Turbo decoding of RA codes 212

6.2.3 Sum-product decoding of RA codes 217

6.2.4 Irregular RA codes 220

6.2.5 Weight-3 IRA codes 224

6.2.6 Accumulate-repeat-accumulate codes 226

6.2.7 Code design 227

6.3 Bibliographic notes 232

6.4 Exercises 234

7 Density evolution and EXIT charts 237

7.1 Introduction 237

7.2 Density evolution 238

7.2.1 Density evolution on the BEC 239

7.2.2 Ensemble thresholds 243

7.2.3 Density evolution and repeat-accumulate codes 247

7.2.4 Density evolution on general binary input memoryless channels 249

7.2.5 Density evolution and turbo codes 254

7.2.6 Designing ensembles with good thresholds 256

7.2.7 Approximations to density evolution 260

7.3 EXIT charts 261

7.3.1 Mutual information 262

7.3.2 EXIT charts for turbo codes 264

7.3.3 EXIT charts for RA codes 273

7.3.4 EXIT charts for LDPC codes 276

7.3.5 Code design and analysis using EXIT charts 279

7.4 Bibliographic notes 279

7.5 Exercises 281

8 Error floor analysis 283

8.1 Introduction 283

8.2 Maximum likelihood analysis 283

8.2.1 Input-output weight enumerating functions for convolutional codes 285

8.2.2 Parallel concatenated code ensembles 290

8.2.3 Serially concatenated code ensembles 296

8.2.4 The error floor of irregular block codes 301

8.2.5 Designing iterative codes to improve the interleaving gain and error floor 305

8.2.6 Asymptotic (in the code length) performance of iterative ensembles 306

8.3 Finite-length analysis 309

8.4 Bibliographic notes 318

8.5 Exercises 320

References 322

Index 331
