Turbo Codes
Submitted by: Sharad Kaushik (03714802809), Shashank Vaish (05714802809), Pratik Gauri (11014802809), Abhishek Jain (13314802809)
Objective
Error Correction Codes
Turbo Codes Technology
Turbo Decoding
Turbo Codes Performance
Turbo Coding Applications
Introduction
Motivation
Can we achieve error-free communication, or come as close to it as possible? Can we reach the Shannon limit?
Objectives
Studying channel coding
Understanding channel capacity
Ways to increase data rate
Providing a reliable communication link
Shannon Theory
For every combination of bandwidth (W), channel type, signal power (S) and received noise power (N), there is a theoretical upper limit on the data transmission rate R for which error-free data transmission is possible. This limit is called the channel capacity, or Shannon capacity.
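As a quick numerical illustration of this limit, the capacity C = W · log2(1 + S/N) can be evaluated directly; the bandwidth and SNR values below are arbitrary assumptions, not taken from the slides.

```python
import math

def shannon_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon channel capacity C = W * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Example: 1 MHz bandwidth at an SNR of 100 (20 dB) -> about 6.66 Mbit/s.
print(shannon_capacity(1e6, signal_power=100.0, noise_power=1.0))
```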
Shannon Theory
Sets a limit on the energy efficiency of a code
Started information theory
Stated the maximum data rate of a channel
Error rate
Power
Backward Error Correction: error detection capability, communication cost, real-time traffic
Forward Error Correction: detection and correction of errors, more complex receivers, DSP cost
Block codes: data split into blocks, checks are within the block
Convolutional codes: bit-streamed data, involves memory
Turbo codes: use convolutional codes with special properties
Coding advantages
[Figure: error probability (10^-3 to 10^-8) versus Eb/N0 (dB) for coded and uncoded transmission, illustrating the coding gain.]
Coding disadvantages
IEEE International Communications Conference, 1993, Geneva: Berrou, Glavieux, "Near Shannon Limit Error-Correcting Coding: Turbo Codes". It provided virtually error-free communication at data-rate/power efficiencies beyond what most experts thought possible.
Double the data throughput at a given power, or work with half the power. The two men were not well known, and most readers assumed their calculations were wrong. Once it was realized that the results were true, many companies adopted turbo codes and new companies such as turboconcept and iCoding were started. Performance came within 0.5 dB of the Shannon limit at Pe = 10^-6.
Turbo codes
Parallel concatenated
The k-bit block is encoded N times with different versions (orders). The probability that the sequence remains RTZ (return to zero) is 1/2^(Nv). With 2 encoders, this randomness gives an error probability of about 10^-5. The permutations are chosen to fix (improve) dmin.
Turbo Encoder
[Figure: turbo encoder block diagram. The input X is passed through as the systematic output and fed to the first RSC encoder, producing parity Y1; an interleaved copy of X is fed to the second RSC encoder, producing parity Y2.]
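A minimal sketch of this parallel concatenation, assuming a textbook rate-1/2 RSC constituent code with octal generators (7, 5); the slides do not specify the constituent code, block length, or trellis termination, so those details are illustrative only.

```python
def rsc_parity(bits):
    """Parity stream of a rate-1/2 RSC encoder with octal generators (7, 5):
    feedback 1 + D + D^2, feedforward 1 + D^2 (an illustrative textbook choice)."""
    s1 = s2 = 0
    parity = []
    for u in bits:
        a = u ^ s1 ^ s2          # recursive feedback
        parity.append(a ^ s2)    # feedforward tap 1 + D^2
        s1, s2 = a, s1           # shift-register update
    return parity

def turbo_encode(bits, perm):
    """Rate-1/3 parallel concatenation: systematic X, parity Y1 of X,
    and parity Y2 of the interleaved block."""
    x = list(bits)
    y1 = rsc_parity(x)
    y2 = rsc_parity([x[i] for i in perm])
    return x, y1, y2

# Example with an arbitrary length-8 permutation.
x, y1, y2 = turbo_encode([1, 0, 1, 1, 0, 0, 1, 0], perm=[3, 7, 1, 5, 0, 6, 2, 4])
```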
The interleaver's function is to permute low-weight code words in one encoder into high-weight code words for the other encoder.
A row-column interleaver: data is written row-wise and read column-wise. While very simple, it also provides little randomness (a sketch follows the example below).
A helical interleaver: data is written row-wise and read diagonally.
An odd-even interleaver: first, the bits are left uninterleaved and encoded, but only the odd-positioned coded bits are stored. Then, the bits are scrambled and encoded, but now only the even-positioned coded bits are stored. Odd-even encoders can be used when the second encoder produces one output bit per input bit.
Example (row-column interleaver, 2 rows by 5 columns): the input X1 X2 ... X10 is written row-wise and read column-wise, giving the interleaved order X1 X6 X2 X7 X3 X8 X4 X9 X5 X10.
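A short sketch of the row-column interleaver from this example; the 2 x 5 matrix size is taken from the example above and is otherwise arbitrary.

```python
def row_column_permutation(n_rows, n_cols):
    """Row-column interleaver indices: data written row-wise, read column-wise.
    The returned list maps each output position to an input index."""
    return [r * n_cols + c for c in range(n_cols) for r in range(n_rows)]

# A 2 x 5 matrix reproduces the example: X1 X6 X2 X7 X3 X8 X4 X9 X5 X10.
perm = row_column_permutation(2, 5)
data = [f"X{i + 1}" for i in range(10)]
print([data[i] for i in perm])
```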
Turbo Decoding
Criterion
For n probabilistic processors working together to estimate common symbols, all of them should eventually agree on those symbols with the same probabilities that a single decoder could produce.
Turbo Decoder
The inputs to the decoders are the log-likelihood ratios (LLR) for the individual symbols d. The LLR value for the symbol d is defined (Berrou) as L(d) = log [ P(d = 1 | observation) / P(d = 0 | observation) ].
Turbo Decoder
The SISO decoder re-evaluates the LLR utilizing the local redundancies Y1 and Y2 to improve the confidence. The value z is the extrinsic value determined by the same decoder; it is negative if d is 0 and positive if d is 1. The updated LLR is fed into the other decoder, which calculates its own z and updates the LLR. After several iterations, both decoders converge to a value for that symbol.
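The iteration described above can be sketched as follows; siso_decode is a placeholder for a full constituent decoder (MAP/Log-MAP or SOVA, discussed later), and the interleaver handling follows the permutation convention of the encoder sketch, so the structure is illustrative rather than a complete implementation.

```python
def turbo_decode(llr_sys, llr_p1, llr_p2, perm, inv_perm, siso_decode, n_iter=8):
    """Iterative exchange of extrinsic information (z) between two SISO decoders.

    llr_sys, llr_p1, llr_p2 : channel LLRs for X, Y1 and Y2.
    perm / inv_perm         : interleaver permutation and its inverse.
    siso_decode(sys, parity, a_priori) -> extrinsic LLRs, one per data bit.
    """
    n = len(llr_sys)
    ext2 = [0.0] * n  # extrinsic values from decoder 2 (natural order), initially unknown
    for _ in range(n_iter):
        # Decoder 1 works in natural order, using decoder 2's extrinsic values as a priori.
        ext1 = siso_decode(llr_sys, llr_p1, ext2)
        # Decoder 2 works in interleaved order on the same systematic bits.
        ext2_interleaved = siso_decode([llr_sys[i] for i in perm], llr_p2,
                                       [ext1[i] for i in perm])
        ext2 = [ext2_interleaved[j] for j in inv_perm]  # de-interleave its extrinsic output
    # Final LLR combines the channel value with both extrinsic terms; its sign gives the hard decision.
    llr_out = [llr_sys[k] + ext1[k] + ext2[k] for k in range(n)]
    return [1 if llr > 0 else 0 for llr in llr_out]
```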
Turbo Decoding
Assume
Ui : modulating bit, {0, 1}
Yi : received value, output of a correlator; can take any value (soft)
The turbo decoder input is the log-likelihood ratio.
For each data bit, calculate the LLR given that a sequence of bits was sent:
R(ui) = log [ P(Yi | Ui = 1) / P(Yi | Ui = 0) ]. For BPSK, R(ui) = 2·Yi / σ².
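A small sketch of that channel-LLR computation, assuming BPSK with the mapping 0 → -1, 1 → +1 over an AWGN channel with known noise variance σ².

```python
def channel_llrs(received, noise_var):
    """Channel LLRs R(ui) = log P(Yi|Ui=1)/P(Yi|Ui=0) = 2*Yi/sigma^2 for +/-1 BPSK in AWGN."""
    return [2.0 * y / noise_var for y in received]

# Samples near +1 give confidently positive LLRs (towards 1); samples near -1 give negative LLRs (towards 0).
print(channel_llrs([0.9, -1.1, 0.2], noise_var=0.5))
```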
Turbo Decoding
Compare the LLR output to see whether the estimate leans towards 0 or 1, then take a hard decision (HD).
At the heart of the decoder is the trellis, which represents all possible states of the encoder. The number of states at a particular clock instant is 2^n, where n is the number of flip-flops used in the shift register. The trellis shows the possible state transitions over time.
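To make the 2^n states concrete, here is a sketch that enumerates the trellis branches of the illustrative (7, 5) RSC code used in the encoder sketch above (n = 2 flip-flops, hence 4 states); the actual constituent code in the slides is not specified.

```python
def rsc_step(state, u):
    """One trellis branch of the (7, 5) RSC code: returns (next_state, parity bit)."""
    s1, s2 = (state >> 1) & 1, state & 1
    a = u ^ s1 ^ s2          # recursive feedback 1 + D + D^2
    p = a ^ s2               # feedforward 1 + D^2
    return (a << 1) | s1, p

# n = 2 flip-flops -> 2^2 = 4 states; each state has one outgoing branch per input bit,
# labelled with its (systematic, parity) output pair.
for state in range(4):
    for u in (0, 1):
        nxt, p = rsc_step(state, u)
        print(f"state {state:02b} --u={u}, out=({u},{p})--> state {nxt:02b}")
```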
SISO
Label all branches with a branch metric (a function of the processor inputs). Obtain the LLR for each data bit by traversing the trellis. Two algorithms:
Log MAP
Turbo codes approach the theoretical limits with only a small gap. They gave rise to new codes such as Low-Density Parity-Check (LDPC) codes. Improvements in decoding delay are still needed.
References
[1] University of South Australia, Institute for Telecommunications Research, Turbo Coding Research Group. http://www.itr.unisa.edu.au/~steven/turbo/.
[2] S. A. Barbulescu and S. S. Pietrobon. Turbo codes: A tutorial on a new class of powerful error correction coding schemes. Part I: Code structures and interleaver design. J. Elec. and Electron. Eng., Australia, 19:129-142, September 1999.