Codes and turbo codes (eBook)

Claude Berrou (Editor)

eBook Download: PDF
2011 | 2010
400 pages
Springer Paris (Publisher)
978-2-8178-0039-4 (ISBN)

€96.29 incl. VAT
  • Download available immediately
This book is devoted to one of the essential functions of modern telecommunications systems: channel coding, or error-correction coding. Its main topics are iteratively decoded algebraic codes, convolutional codes and concatenated codes.
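As a rough illustration of the idea of error-correction coding described above (not taken from the book itself, whose own first example appears in Section 1.2), the following sketch shows the simplest possible channel code, a rate-1/3 repetition code: each information bit is sent three times, and the decoder takes a majority vote, which corrects any single bit error per block.

```python
# Illustrative only: a (3,1) repetition code with majority-vote decoding.
def encode(bits):
    """Repeat every information bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority-vote each block of three received bits."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

msg = [1, 0, 1]
codeword = encode(msg)           # [1, 1, 1, 0, 0, 0, 1, 1, 1]
codeword[1] ^= 1                 # channel flips one bit in the first block
assert decode(codeword) == msg   # the single error is corrected
```

Real codes of the kind treated in this book (algebraic, convolutional, turbo, LDPC) achieve the same protection at far higher rates than 1/3, approaching the theoretical limits discussed in Chapter 3.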

Title Page 3
Copyright Page 4
Codes and Turbo Codes 5
Foreword 7
Table of Contents 10
Chapter 1 Introduction 15
1.1 Digital messages 17
1.2 A first code 18
1.3 Hard input decoding and soft input decoding 21
1.4 Hard output decoding and soft output decoding 25
1.5 The performance measure 25
1.6 What is a good code? 29
1.7 Families of codes 31
Chapter 2 Digital communications 33
2.1 Digital Modulations 33
2.1.1 Introduction 33
2.1.2 Linear Memoryless Modulations 36
Amplitude-shift keying with M states: M-ASK 36
Phase Shift Keying with M states (M-PSK) 39
Quadrature Amplitude Modulation using two quadrature carriers (M-QAM) 40
2.1.3 Memoryless modulation with M states (M-FSK) 43
2.1.4 Modulations with memory by continuous phase frequency shift keying (CPFSK) 45
Continuous phase-frequency-shift keying with modulation index h = 1/2: Minimum Shift Keying (MSK) 46
L-ary Raised Cosine modulation (L-RC) 48
Gaussian minimum shift keying modulation (GMSK) 49
2.2 Structure and performance of the optimal receiver on a Gaussian channel 51
2.2.1 Structure of the coherent receiver 51
2.2.2 Performance of the coherent receiver 56
Amplitude shift keying with M states 56
Phase shift keying with M states 60
Amplitude modulation on two quadrature carriers – M-QAM 63
Frequency shift keying – M-FSK 66
Continuous phase frequency shift keying – CPFSK 69
2.3 Transmission on a band-limited channel 73
2.3.1 Introduction 73
2.3.2 Intersymbol interference 74
2.3.3 Condition of absence of ISI: Nyquist criterion 77
Optimal distribution of filtering between transmission and reception 80
2.3.4 Expression of the error probability in presence of Nyquist filtering 82
2.4 Transmission on fading channels 83
2.4.1 Characterization of a fading channel 83
Coherence bandwidth 84
Coherence time 87
2.4.2 Transmission on non-frequency-selective slow-fading channels 87
Performance on a Rayleigh channel 87
Performance on a Rayleigh channel with diversity 89
Transmission on a slow-fading frequency-selective channel 92
Transmission with equalization at reception 95
Chapter 3 Theoretical limits 96
3.1 Information theory 96
3.1.1 Transmission channel 96
3.1.2 An example: the binary symmetric channel 97
Configurations of errors on the binary symmetric channel 97
Mutual information and capacity of the binary symmetric channel 98
3.1.3 Overview of the fundamental coding theorem 99
3.1.4 Geometrical interpretation 100
3.1.5 Random coding 101
Codes imitating random coding 102
3.2 Theoretical limits to performance 104
3.2.1 Binary input and real output channel 104
3.2.2 Capacity of a transmission channel 105
Shannon limit of a band-limited continuous input and output Gaussian channel 106
Capacity of a discrete input Gaussian channel 106
Capacity of the Rayleigh channel 108
3.3 Practical limits to performance 109
3.3.1 Gaussian binary input channel 109
3.3.2 Gaussian continuous input channel 110
3.3.3 Some examples of limits 112
3.4 Minimum distances required 113
3.4.1 MHD required with 4-PSK modulation 113
3.4.2 MHD required with 8-PSK modulation 115
3.4.3 MHD required with 16-QAM modulation 117
Bibliography 120
Chapter 4 Block codes 121
4.1 Block codes with binary symbols 122
4.1.1 Generator matrix of a binary block code 122
4.1.2 Dual code and parity check matrix 124
4.1.3 Minimum distance 125
4.1.4 Extended codes and shortened codes 126
4.1.5 Product codes 127
4.1.6 Examples of binary block codes 127
Parity check code 127
Repetition code 128
Hamming code 129
Maximum length code 130
Hadamard code 130
Reed-Muller codes 131
4.1.7 Cyclic codes 132
Definition and polynomial representation 132
Cyclic code in systematic form 134
Implementation of an encoder 136
BCH codes 137
4.2 Block codes with non-binary symbols 142
4.2.1 Reed-Solomon codes 142
4.2.2 Implementing the encoder 144
4.3 Decoding and performance of codes with binary symbols 144
4.3.1 Error detection 144
Detection capability 145
Probability of non-detection of errors 145
4.3.2 Error correction 146
Hard decoding 146
Soft decoding 149
4.4 Decoding and performance of codes with non-binary symbols 155
4.4.1 Hard input decoding of Reed-Solomon codes 155
4.2.2 Peterson’s direct method 156
Description of the algorithm for codes with non-binary symbols 156
Simplification of Peterson’s algorithm for binary codes 160
Chien algorithm 162
4.4.3 Iterative method 163
Berlekamp-Massey algorithm for codes with non-binary symbols 164
Euclid’s algorithm 166
Calculating coefficients ej by a transform 167
Berlekamp-Massey algorithm for binary cyclic codes 169
Euclid’s algorithm for binary codes 170
4.4.4 Hard input decoding performance of Reed-Solomon codes 171
Bibliography 172
Appendix: Notions about Galois fields and minimal polynomials 173
Definition 173
Primitive element of a Galois field 174
Minimal polynomial with coefficients in F2 associated with an element of a Galois field Fq 174
Minimal polynomial with coefficients in Fq associated with an element in a Galois field Fq 176
Primitive polynomials 176
Chapter 5 Convolutional codes and their decoding 179
5.1 History 179
5.2 Representations of convolutional codes 181
5.2.1 Generic representation of a convolutional encoder 181
5.2.2 Polynomial representation 184
5.2.3 Tree of a code 185
5.2.4 Trellis of a code 185
5.2.5 State machine of a code 188
5.3 Code distances and performance 190
5.3.1 Choosing a good code 190
5.3.2 RTZ sequences 190
5.3.3 Transfer function and distance spectrum 192
5.3.4 Performance 195
5.4 Decoding convolutional codes 198
5.4.1 Model of the transmission chain and notations 199
5.4.2 The Viterbi algorithm 199
Example of applying the Viterbi algorithm 202
5.4.3 The Maximum A Posteriori algorithm or MAP algorithm 204
5.5 Convolutional block codes 204
5.5.1 Trellis termination 205
Classical trellis termination 205
Tail-biting 206
5.5.2 Puncturing 208
Bibliography 210
Chapter 6 Concatenated codes 212
6.1 Parallel concatenation and serial concatenation 214
6.2 Parallel concatenation and LDPC codes 217
6.3 Permutations 219
6.4 Turbo crossword 219
Bibliography 222
Chapter 7 Convolutional turbo codes 223
7.1 The history of turbo codes 223
7.2 Multiple concatenation of RSC codes 225
7.3 Turbo codes 227
7.3.1 Termination of constituent codes 231
7.3.2 The permutation function 232
Regular permutation 233
Necessity for disorder 235
Intra-symbol disorder 237
Irregular permutations 239
Quadratic Permutation 241
7.4 Decoding turbo codes 245
7.4.1 Turbo decoding 245
7.4.2 SISO decoding and extrinsic information 248
Notations 249
Decoding following the Maximum A Posteriori (MAP) criterion 250
The simplified Max-Log-MAP or SubMAP algorithm 252
7.4.3 Practical considerations 255
7.5 m-binary turbo codes 259
7.5.1 m-binary RSC encoders 259
7.5.2 m-binary turbo codes 261
8-state double-binary turbo code 263
16-state double-binary turbo code 264
7.6 Analysis tools 266
7.6.1 Theoretical performance 266
7.6.2 Asymptotic behaviour 266
Error impulse method 267
Modified error impulse method 269
Double error impulse method 269
7.6.3 Convergence 269
Transfer function for a SISO decoder of extrinsic information 270
a. Definition of the mutual information (MI) 270
b. Definition of the a priori mutual information 271
c. Calculation of the output mutual information 272
d. Practical method to obtain the transfer function of the extrinsic information 273
e. An example 274
EXIT chart 274
Bibliography 276
Chapter 8 Turbo product codes 281
8.1 History 281
8.2 Product codes 281
Definition 282
Example 8.1 282
8.3 Hard input decoding of product codes 283
8.3.1 Row-column decoding 283
Property 283
Example 8.2 284
8.3.2 The Reddy-Robinson algorithm 284
Example 8.3 286
8.4 Soft input decoding of product codes 287
8.4.1 The Chase algorithm with weighted input 287
Example 8.4 289
8.4.2 Performance of the Chase-Pyndiah algorithm 290
8.4.3 The Fang-Battail algorithm 290
Example 8.5 292
8.4.4 The Hartmann-Nazarov algorithm 295
Fast Hadamard transform 297
Example 8.6 298
8.4.5 Other soft input decoding algorithms 299
8.5 Implementation of the Chase-Pyndiah algorithm 301
Bibliography 303
Chapter 9 LDPC codes 306
9.1 Principle of LDPC codes 306
9.1.1 Parity check code 307
Definition 307
Parity code with three bits 307
Parity check code with n bits 309
9.1.2 Definition of an LDPC code 310
Linear block codes 310
Low density parity check codes 311
Coding rate 312
9.1.3 Encoding 313
Generic encoding 313
Specific constructions 315
Summary 316
Analogy between an LDPC code and a turbo code 316
9.1.4 Decoding LDPC codes 317
Hard input algorithm 318
Belief propagation algorithm 318
9.1.5 Random construction of LDPC codes 321
Optimization of irregularity profiles 321
Optimization of cycle size 323
Selecting the code by the impulse method 323
Selecting the code by simulation 323
9.1.6 Some geometrical constructions of LDPC codes 324
Cayley / Ramanujan constructions 324
Kou, Lin and Fossorier’s Euclidean / Projective Geometry LDPC 325
Constructions based on permutation matrices 325
Matrices based on pseudo-random generators 326
Array-based LDPC 326
BIBDs, Latin rectangles 326
9.2 Architecture for decoding LDPC codes for the Gaussian channel 327
9.2.1 Analysis of the complexity 327
9.2.2 Architecture of a generic node processor (GNP) 328
Choice of a generic operator 331
9.2.3 Generic architecture for message propagation 331
Presentation of the model 331
Example of an implementation 333
9.2.4 Combining parameters of the architecture 334
9.2.5 Example of synthesis of an LDPC decoder architecture 337
Flooding schedule (according to parities) 337
Horizontal interleaving schedule 338
9.2.6 Sub-optimal decoding algorithm 339
Single message decoding algorithm (. = 1) 339
Sub-optimal PNP algorithms (. > 1)
9.2.7 Influence of quantization 342
9.2.8 State of the art of published LDPC decoder architectures 344
Bibliography 346
Chapter 10 Turbo codes and large spectral efficiency transmissions 352
10.1 Turbo trellis coded modulation (TTCM) 352
10.2 Pragmatic turbo coded modulation 356
Bibliography 366
Chapter 11 The turbo principle applied to equalization and detection 367
11.1 Turbo equalization 368
11.1.1 Multipath channels and intersymbol interference 368
11.1.2 The equalization function 370
11.1.3 Combining equalization and decoding 374
11.1.4 Principle of turbo equalization 377
11.1.5 MAP turbo equalization 380
Implementation of the BCJR-MAP equalizer 380
Example of performance 384
Complexity of the MAP turbo equalizer and alternative solutions 386
Architectures and applications 387
11.1.6 MMSE turbo equalization 389
Principle of soft-input soft-output linear MMSE equalization 390
Adaptive implementation of the equalizer 398
Examples of performance 400
Example of implementation and applications 403
11.2 Multi-user turbo detection and its application to CDMA systems 404
11.2.1 Introduction and some notations 404
11.2.2 Multi-user detection 405
Standard receiver 405
Optimal joint detection 406
Decorrelator detector 407
Linear MMSE detector 407
Iterative detector 408
11.2.3 Turbo CDMA 411
Turbo SIC detector 411
Some simulations 412
Turbo SIC/RAKE detector 412
11.3 Conclusions 413
Bibliography 415
Index 421

Publication date (per publisher) 27.1.2011
Series Collection IRIS
Additional info 400 p.
Place of publication Paris
Language English
Subject areas Computer Science • Theory / Studies • Cryptology
Mathematics / Computer Science • Mathematics • Applied Mathematics
Natural Sciences
Engineering • Electrical Engineering / Power Engineering
Engineering • Communications Engineering
Keywords Algebra • algorithm • algorithms • Coding • coding theory • Communication • detection • Information • Information and Communication, Circuits • Information Theory • Mobile telecommunications • Modulation • Satellite • Signal Processing • telecommunications • Telecommunications satellite • Turbocodes
ISBN-10 2-8178-0039-7 / 2817800397
ISBN-13 978-2-8178-0039-4 / 9782817800394
PDF (watermark)
Size: 12.8 MB

DRM: digital watermark
This eBook contains a digital watermark and is therefore personalized to you. If the eBook is improperly passed on to third parties, it can be traced back to its source.

File format: PDF (Portable Document Format)
With its fixed page layout, PDF is particularly well suited to technical books with columns, tables and figures. A PDF can be displayed on almost any device, but is only suitable to a limited extent for small displays (smartphone, eReader).

System requirements:
PC/Mac: You can read this eBook on a PC or Mac. You need a PDF viewer, e.g. Adobe Reader or Adobe Digital Editions.
eReader: This eBook can be read with (almost) all eBook readers. However, it is not compatible with the Amazon Kindle.
Smartphone/Tablet: Whether Apple or Android, you can read this eBook. You need a PDF viewer, e.g. the free Adobe Digital Editions app.

Buying eBooks from abroad
For tax law reasons we can sell eBooks only within Germany and Switzerland. Unfortunately, we cannot fulfil eBook orders from other countries.
