
Proceeding of the 2011 IEEE International Conference on Space Science and Communication (IconSpace)

12-13 July 2011, Penang, Malaysia

Inner and Outer Decoding Performance of Convolutional Codes

Melinda
Electrical Engineering, University of Syiah Kuala
Banda Aceh, Indonesia
[email protected]
Abstract—Convolutional codes are used extensively in numerous applications to achieve reliable data transfer. In this paper, the parameters of Convolutional codes are introduced and used to investigate their applications, characteristics and performance when employed as inner and outer codes in decoding. To evaluate the parameters, four schemes of Convolutional codes are applied: non-systematic-recursive, systematic-recursive, non-systematic-non-recursive and systematic-non-recursive. In addition, the implementation of the BCJR (Bahl, Cocke, Jelinek and Raviv) algorithm is discussed by applying APP-SISO (a posteriori probability Soft-Input Soft-Output) decoding to Convolutional codes. The proposed schemes employing inner and outer decoding of Convolutional codes are analysed and compared using the EXIT characteristics of the Convolutional decoders. Channel capacity measurements for inner Convolutional codes of different rates are described using BPSK modulation over an AWGN channel. The BER performance of outer decoding of Convolutional codes is observed for the different schemes, which is characterised by the area beneath the inverted EXIT function; this area is essential for obtaining a good BER plot. The results show that inner decoding of Convolutional codes gives the best EXIT-function performance, particularly for the non-systematic-recursive scheme. However, outer decoding of this code does not perform well, since the area beneath its inverted EXIT function is larger than that of the inner code.

Keywords—Convolutional codes; inner; outer; EXIT chart; BER

I. INTRODUCTION

Nowadays, the use of digital communication is in high demand. Everywhere and at every time, people apply advanced communication to ease their difficulties. Extraordinary efforts have been made by many scientists to achieve better performance in digital communication. Channel coding plays an essential role in digital communication, in which the throughput should stay below the channel capacity [1]. The history of channel coding was started in 1948 by Claude Shannon [2].

Research on channel coding continued into the 1960s with the development of algebraic block codes known as cyclic codes. Bose-Chaudhuri-Hocquenghem (BCH) and Reed-Solomon (RS) codes are cyclic codes that can be optimally decoded by algebraic decoding algorithms [3], [4].

The starting point, however, was the invention of the Convolutional code, presented by Elias in 1955 [5]. Based on his research, Convolutional coding achieves roughly the same average performance as random codes. In further research, Bahl, Cocke, Jelinek and Raviv [6] proposed an APP decoding algorithm for Convolutional codes in 1974, which became known as the BCJR algorithm.

There is a more recent method to measure the convergence behaviour of iterative decoding, known as the EXIT (Extrinsic Information Transfer) chart. It was first introduced by S. ten Brink [7]. Generally, the EXIT chart is useful for assessing how closely iterative decoding converges to ML (Maximum Likelihood) performance and to the channel capacity [8].

II. BACKGROUND INFORMATION OF CONVOLUTIONAL CODES

A. Convolutional Codes
Theoretically, a Convolutional code can be classified as systematic or non-systematic [9]. In a systematic code, the encoded codewords contain the original input bits, while a non-systematic code is defined in the opposite way.

B. Encoder Representation of Convolutional code

1) Encoder Circuit of Convolutional code

Figure 1. Encoder circuit of Convolutional code [9].

978-1-4577-0564-9/11/$26.00 ©2011 IEEE

2) State Transition Diagram of Convolutional code
The state diagram of a Convolutional code demonstrates the encoding process by providing all $2^N$ states, and it shows all the state transitions together with the output bits.

2) BCJR Process
First of all, the BCJR algorithm begins by taking the frame of a priori LLRs $L_a(\mathbf{u}) = \{L_a(u_i)\}_{i=1}^{I}$. A gamma value $\gamma(T)$ is then computed for each specific transition $T$ in the trellis, as expressed by:

$\gamma(T) = P(T \mid fr(T)) \cdot P_a(u_i = b_T)$  (4)

where $b_T \in \{0, 1\}$ is the value of the bit $u_i$ implied by transition $T$, and the a priori probability $P_a(u_i = b_T)$ is obtained from the a priori LLR $L_a(u_i)$, which is related to equation (3).

In the forward recursion, an alpha value $\alpha[S_{(i,n)}]$ is introduced for each state, starting from the initial state $S_{(0,0)}$ [10], which is set using the a priori LLRs. It is obtained by summing over all transitions $T$ merging into the state:

$\alpha[S_{(i,n)}] = \sum_{T \in to(S_{(i,n)})} \gamma(T)\,\alpha[fr(T)]$  (5)

Figure 2. State diagram of Convolutional code [9]

3) Trellis Diagram of Convolutional code
The trellis diagram is one of the possible ways to describe a Convolutional encoder. Figure 3 represents the systematic CC(2, 1, 2) code.

Moving on to the backward recursion [10], the process is quite similar to the forward recursion. However, each state now has a beta value $\beta[S_{(i,n)}]$, initialised at the final state $S_{(I,0)}$ [6]. The beta values are obtained by summing over all transitions $T$ emerging from the state:

$\beta[S_{(i,n)}] = \sum_{T \in fr(S_{(i,n)})} \gamma(T)\,\beta[to(T)]$  (6)

The a posteriori probability $P_p(T)$ can then be calculated for each transition $T$ in the trellis as:

$P_p(T) = \frac{1}{C_1}\,\alpha[fr(T)] \cdot \gamma(T) \cdot \beta[to(T)]$  (7)

where $C_1$ is a normalisation constant.

Figure 3. Trellis Diagram of Convolutional code

In the last step of the BCJR algorithm, the a posteriori LLRs are obtained, as shown by the a posteriori probability:

$P_p(u_i = b) = \sum_{T:\, i_T = i,\ b_T = b} P_p(T)$  (8)

C. Overview of decoding method
There are two common decoding methods usually implemented in a decoder, namely the Viterbi algorithm [5] and the BCJR algorithm [10].

1) BCJR Algorithm
BCJR stands for Bahl, Cocke, Jelinek and Raviv, the names of its inventors. It is also known as the MAP (Maximum A Posteriori) algorithm, and was published by Bahl, Cocke, Jelinek and Raviv in 1974. The LLR of a bit is defined as:

$L(\text{bit}) = \ln \dfrac{P(\text{bit} = 0)}{P(\text{bit} = 1)}$  (3)

where $L(\text{bit})$ represents the LLR, and $P(\text{bit} = 0)$ and $P(\text{bit} = 1)$ express the probabilities that the bit takes the logic value 0 or 1 respectively.

Figure 4. Transition Trellis of BCJR algorithm
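As a concrete illustration of the recursions (4)-(8), the following sketch runs the BCJR forward-backward algorithm on a toy two-state, rate-1 accumulator trellis with BPSK over AWGN. The trellis, noise variance and probability conventions are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal BCJR sketch on a hypothetical 2-state rate-1 accumulator trellis.
import math

def bcjr(received, La, sigma2=0.5):
    """received: BPSK channel samples (one per bit); La: a priori LLRs.
       Returns the a posteriori LLRs Lp(u_i) per equations (4)-(8)."""
    N_STATES = 2
    n = len(received)

    def trans(s, b):
        """Accumulator transition: next state = s XOR b, coded bit = next state."""
        s2 = s ^ b
        return s2, s2

    def gamma(i, s, b):
        """Equation (4): channel likelihood times a priori probability."""
        s2, c = trans(s, b)
        x = 1.0 - 2.0 * c                             # BPSK mapping 0 -> +1, 1 -> -1
        ch = math.exp(-(received[i] - x) ** 2 / (2 * sigma2))
        pa = 1.0 / (1.0 + math.exp(-La[i] if b == 0 else La[i]))  # from LLR, eq. (3)
        return ch * pa

    # Forward recursion, equation (5); the trellis starts in state 0.
    alpha = [[0.0] * N_STATES for _ in range(n + 1)]
    alpha[0][0] = 1.0
    for i in range(n):
        for s in range(N_STATES):
            for b in (0, 1):
                s2, _ = trans(s, b)
                alpha[i + 1][s2] += alpha[i][s] * gamma(i, s, b)

    # Backward recursion, equation (6); unknown final state -> uniform betas.
    beta = [[1.0] * N_STATES for _ in range(n + 1)]
    for i in range(n - 1, -1, -1):
        beta[i] = [0.0] * N_STATES
        for s in range(N_STATES):
            for b in (0, 1):
                s2, _ = trans(s, b)
                beta[i][s] += gamma(i, s, b) * beta[i + 1][s2]

    # A posteriori bit probabilities via equations (7)-(8), then LLRs.
    Lp = []
    for i in range(n):
        p = [0.0, 0.0]
        for s in range(N_STATES):
            for b in (0, 1):
                s2, _ = trans(s, b)
                p[b] += alpha[i][s] * gamma(i, s, b) * beta[i + 1][s2]
        Lp.append(math.log(p[0] / p[1]))
    return Lp

print(bcjr([-1.0, -1.0, 1.0], [0.0, 0.0, 0.0]))
```

The normalisation constant $C_1$ of equation (7) cancels in the LLR ratio, so this sketch omits it.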

D. Iterative decoding convergence

1) EXIT Analysis
The Extrinsic Information Transfer (EXIT) chart was first introduced by Stephan ten Brink [11]. It is applied to analyse the behaviour of iterative decoding, which is known as the Turbo technique.

2) EXIT Function
By implementing the BCJR algorithm, a priori information and a posteriori information become available. Basically, in the extrinsic information generation structure, the a priori information is fed into the BCJR decoder, while the a posteriori output goes directly to a modulo-2 adder. At the end of the structure, the extrinsic information is obtained by subtracting the a priori information from the a posteriori information. This process can be expressed as:

Extrinsic_information = a_posteriori_information − a_priori_information

Figure 5. Structure of Extrinsic Information Generation

3) The EXIT Chart
The EXIT chart is used to visualise the iterative exchange of extrinsic information between serially concatenated decoders [12]. In the chart, the mutual information of the first decoder is plotted against the mutual information of the second. In other words, the horizontal axis of the EXIT chart displays the a priori mutual information, while the vertical axis displays the extrinsic mutual information. The extrinsic mutual information $I(b_e; b) \in [0, 1]$ can be calculated from the output extrinsic LLR sequence. Here, an a priori LLR sequence having mutual information $I(b_a; b)$ is the input to the decoder.

III. DECODING PROCESS

A. Convolutional codes encoder

Figure 6. Convolutional codes encoder

1) Inner scheme of Convolutional codes

Figure 7. Inner decoding scheme of Convolutional code

2) Outer Scheme of Convolutional codes

Figure 8. Outer decoding scheme of Convolutional code

B. AREA PROPERTIES OF EXIT FUNCTION
In order to accommodate an open tunnel for the trajectory plot of the BCH outer and Convolutional inner codes, the area $A_{inner}$ beneath the inner EXIT function has to be larger than $A_{outer}$:

$A_{outer} \le A_{inner}$  (9)

Here, both sides should be multiplied by $R_{inner} \times N_t \times \log_2(m)$, which gives:

$A_{outer} \cdot R_{inner} \cdot N_t \cdot \log_2(m) \le A_{inner} \cdot R_{inner} \cdot N_t \cdot \log_2(m)$  (10)

where $R_{inner}$ is the coding rate of the inner decoder, $R_{outer}$ is the coding rate of the outer decoder, $N_t$ is the number of transmit antennas, $m$ is the order of the m-ary modulation applied, and $A_{inner}$ is the area beneath the inner EXIT function.

The channel capacity determined by Shannon is known as the CCMC (Continuous-Input Continuous-Output Memoryless Channel) capacity [13]. The CCMC capacity depends only on the channel [14]. In contrast, in the design of channel coded modulation, the DCMC (Discrete-Input Continuous-Output Memoryless Channel) is introduced [13]. The DCMC capacity depends on both the channel and the modulation type when implemented in the system [14].
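The area condition of equations (9)-(10) can be checked numerically. In the sketch below, the EXIT curve shapes (`inner`, `outer_inv`) are purely illustrative assumptions, not the paper's measured functions; only the area comparison itself follows the text.

```python
# Hypothetical numerical check of the open-tunnel condition A_outer <= A_inner.

def area(f, n=10000):
    """Trapezoidal area beneath f on the mutual-information interval [0, 1]."""
    h = 1.0 / n
    return sum((f(i * h) + f((i + 1) * h)) / 2 * h for i in range(n))

inner = lambda ia: 0.4 + 0.6 * ia   # illustrative inner EXIT curve
outer_inv = lambda ie: ie ** 2      # illustrative inverted outer EXIT curve

A_inner = area(inner)       # area beneath the inner EXIT function
A_outer = area(outer_inv)   # area beneath the inverted outer EXIT function
print("open tunnel:", A_outer <= A_inner)
```

Multiplying both sides by the positive constant $R_{inner} \cdot N_t \cdot \log_2(m)$, as in equation (10), clearly preserves the inequality.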
IV. SIMULATION RESULT ANALYSIS AND DISCUSSION

A. Introduction of Parameters
The parameters of the Convolutional codes used in this work are shown in Table 1 below, and the four classified types of Convolutional codes are shown in the figures below as well.
Table 1. Parameters of Convolutional codes used in the simulation

No.  Type of Convolutional code
1    CC1 (rate 1) Non-systematic & Recursive
2    CC2 (rate 1/2) Systematic & Recursive
3    CC3 (rate 1/2) Non-systematic & Non-recursive
4    CC4 (rate 1/2) Systematic & Non-recursive

Figure 9. Convolutional code rate 1 Non-systematic & Recursive

Figure 10. Convolutional code rate 1/2 Systematic & Recursive

Figure 11. Convolutional code rate 1/2 Non-systematic & Non-recursive

Figure 12. Convolutional code rate 1/2 Systematic & Non-recursive

B. EXIT function of Outer decoder

1) Convolutional Codes Outer Decoding
For the outer Convolutional codes, the a priori mutual information $I(b_a; b)$ is plotted on the y axis and the extrinsic mutual information $I(b_e; b)$ on the x axis. A striking feature can be seen in figure 13: the inverted EXIT function of the outer CC1 code is a straight line, while CC2, CC3 and CC4 behave differently. Consequently, the areas beneath the EXIT functions also differ: outer CC1 has a coding rate of 1, which equals the area beneath its EXIT curve, while CC2, CC3 and CC4 each have quite similar areas of approximately 0.5 beneath their EXIT curves.
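The four classes in Table 1 differ in whether the output contains the input bits (systematic) and whether the shift register has feedback (recursive). As a minimal sketch, the following implements two of the four classes with assumed memory-2 generators $(7, 5)$ in octal; the paper does not state its generator polynomials, so these are illustrative choices only.

```python
# Illustrative rate-1/2, memory-2 encoders: a non-systematic non-recursive
# code (CC3-like) and a systematic recursive code (CC2-like).
# Generators (7, 5) octal are an assumption, not the paper's parameters.

def encode_nsc(bits):
    """Non-systematic, non-recursive: outputs (1+D+D^2, 1+D^2) per input bit."""
    s1 = s2 = 0
    out = []
    for u in bits:
        out.append((u ^ s1 ^ s2, u ^ s2))  # two coded bits, neither equals u in general
        s1, s2 = u, s1
    return out

def encode_rsc(bits):
    """Systematic, recursive: feedback 1+D+D^2, parity tap 1+D^2."""
    s1 = s2 = 0
    out = []
    for u in bits:
        a = u ^ s1 ^ s2               # feedback bit entering the register
        out.append((u, a ^ s2))       # (systematic bit, parity bit)
        s1, s2 = a, s1
    return out

print(encode_nsc([1, 0, 1, 1]))
print(encode_rsc([1, 0, 1, 1]))
```

The recursive structure is what allows CC1 and CC2 to reach the $(1, 1)$ point of the EXIT chart, as discussed in the inner-decoding results below.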

C. EXIT Function of Inner Decoding

1) Inner Decoding of Convolutional codes
Figure 15 illustrates the EXIT functions for the inner decoders of the four types of Convolutional codes (based on the parameters in Table 1). Only two of the codes, CC1 and CC2, can achieve the (1, 1) point of the EXIT chart, while CC3 and CC4 cannot reach that point. This is because CC1 and CC2 are recursive codes [15], [16], which have a low error probability and can be applied for iterative decoding convergence.

Figure 13. The inverted EXIT function for three types of Convolutional code

2) BER of Convolutional codes
As observed in figure 14, the BER curves of the four types of CC begin from the same point on the y axis but finish at completely different points. CC2 and CC4 express quite similar BER performance: they start and finish at almost the same points on both axes. CC1 displays a straight line, and CC3 also follows a straight line until 0.8 on the x axis; from that point onward, the curve decreases sharply and does not reach 1. This is because CC3 is a non-systematic, non-recursive Convolutional code. After this discussion of the BER performance of the different Convolutional codes, it is reasonable to say that CC2 and CC4 provide the better BER plots for Convolutional codes.

Figure 15. The EXIT function for the four types of Convolutional codes

D. Channel capacity of Convolutional codes
As determined by equation (9), the threshold SNR value is obtained using the simulation given above, where the throughput is:

$\eta = R_{outer} \cdot R_{inner} \cdot N_t \cdot \log_2(m)$  (11)

The Convolutional codes are measured in quite a similar way. However, as mentioned in the inner decoding part, the EXIT function of the inner Convolutional code gives a very good result for an inner EXIT function. This is because CC1, the pink line (with * marker), follows the same line as the BPSK modulation, i.e. the DCMC capacity, which is expressed by the black line (with square marker). Furthermore, the inner code CC1 outstrips the other Convolutional codes CC2, CC3 and CC4, respectively. The gap between CC1 and the other CCs is 0.5 bits per channel use.

Figure 14. BER vs Ia for four types of Convolutional codes
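To make equation (11) and the BPSK DCMC capacity comparison concrete, the sketch below computes the throughput and a Monte Carlo estimate of the BPSK DCMC capacity over AWGN. All parameter values here are illustrative assumptions, not the paper's simulation settings.

```python
# Throughput of equation (11) and a Monte Carlo BPSK DCMC capacity estimate.
import math
import random

def throughput(R_outer, R_inner, Nt, m):
    """Equation (11): eta = R_outer * R_inner * Nt * log2(m)."""
    return R_outer * R_inner * Nt * math.log2(m)

def bpsk_dcmc_capacity(snr_db, samples=100000, seed=1):
    """Estimate the BPSK DCMC capacity (bits per channel use) over AWGN:
       C = 1 - E[log2(1 + exp(-2*y/sigma^2))] conditioned on x = +1."""
    rng = random.Random(seed)
    snr = 10 ** (snr_db / 10)            # Es/N0 (linear)
    sigma = math.sqrt(1.0 / (2 * snr))   # noise std per real dimension
    acc = 0.0
    for _ in range(samples):
        y = 1.0 + rng.gauss(0.0, sigma)  # received sample given x = +1
        # clamp the exponent to avoid float overflow for deep noise excursions
        acc += math.log2(1.0 + math.exp(min(700.0, -2.0 * y / sigma ** 2)))
    return 1.0 - acc / samples

print("throughput:", throughput(0.5, 0.5, 1, 2))
print("C_BPSK @ 10 dB:", bpsk_dcmc_capacity(10.0))
```

At high SNR the estimate approaches 1 bit per channel use, the BPSK ceiling that the inner CC1 EXIT curve tracks in Figure 16.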

Figure 16. The channel capacity measurement of inner Convolutional codes using BPSK modulation over an AWGN channel

V. CONCLUSION

In this paper, four types of Convolutional codes, known as CC1, CC2, CC3 and CC4, are applied as inner and outer decoding codes. After simulating the decoding process, it is noticeable that the best inner decoding performance in the EXIT chart is expressed by CC1 (rate 1), which is a recursive, non-systematic code. This is because the EXIT function of its inner decoding has almost the same performance as the BPSK modulation scheme in the channel capacity measurement. However, outer decoding of this code does not perform well, since the area beneath its inverted EXIT function is larger than that of the inner code. In addition, in terms of BER performance, CC2 and CC4, which express quite similar BER behaviour, provide the better BER plots for Convolutional codes.

REFERENCES
[1] M. Alard and R. Lassalle, "Principles of modulation and channel coding for digital broadcasting for mobile receivers," EBU Review, 1987.
[2] C. E. Shannon, "A mathematical theory of communication," Bell Syst. Tech. J., vol. 27, pp. 379-423 and 623-656, 1948.
[3] A. Hocquenghem, "Codes correcteurs d'erreurs," Chiffres, vol. 2, pp. 147-156, 1959; R. C. Bose and D. K. Ray-Chaudhuri, "On a class of error correcting binary group codes," Inform. Contr., vol. 3, pp. 68-79, Mar. 1960.
[4] I. S. Reed and G. Solomon, "Polynomial codes over certain finite fields," J. SIAM, vol. 8, pp. 300-304, June 1960.
[5] P. Elias, "Coding for noisy channels," IRE Conv. Rec., pt. 4, pp. 37-46, Mar. 1955.
[6] L. R. Bahl, J. Cocke, F. Jelinek, and J. Raviv, "Optimal decoding of linear codes for minimizing symbol error rate," IEEE Trans. Inform. Theory, vol. IT-20, pp. 284-287, Mar. 1974.
[7] S. ten Brink, "Convergence of iterative decoding," Electron. Lett., vol. 35, no. 10, pp. 806-808, May 1999.
[8] A. Ashikhmin, G. Kramer, and S. ten Brink, "Code rate and the area under extrinsic information transfer curves," ISIT 2002, Lausanne, Switzerland, June 30-July 5, 2002.
[9] L. Hanzo, T. H. Liew, and B. L. Yeap, Turbo Coding, Turbo Equalisation and Space-Time Coding, John Wiley & Sons, p. 14, 2002.
[10] A. J. Viterbi, "Error bounds for convolutional codes and an asymptotically optimum decoding algorithm," IEEE Trans. Inform. Theory, vol. IT-13, no. 4, pp. 260-269, Apr. 1967.
[11] C. Berrou, A. Glavieux, and P. Thitimajshima, "Near Shannon limit error-correcting coding and decoding: Turbo codes," in Proc. 1993 Int. Conf. Commun., Geneva, Switzerland, pp. 1064-1070, May 1993.
[12] G. Battail, "Pondération des symboles décodés par l'algorithme de Viterbi," Annales des Télécommunications, vol. 42, pp. 31-38, Jan.-Feb. 1987.
[13] J. G. Proakis, Digital Communications, 3rd ed., McGraw-Hill International Editions, 1995.
[14] R. G. Maunder, "Irregular variable length coding," PhD thesis, University of Southampton, 2007.
[15] J. Kliewer, A. Huebner, and D. J. Costello, "On the achievable extrinsic information of inner decoders in serial concatenation," in Proceedings of the IEEE International Symposium on Information Theory, Seattle, WA, USA, pp. 2680-2684, July 2006.
[16] R. Thobaben, "EXIT functions for randomly punctured systematic codes," in Proceedings of the IEEE Information Theory Workshop, Lake Tahoe, CA, USA, pp. 24-29, September 2007.