Shannon-Fano Coding: Examples


Example 1:

The source of information A generates the symbols {A0, A1, A2, A3, A4} with the
corresponding probabilities {0.4, 0.3, 0.15, 0.1, 0.05}. Encoding the source symbols with a
fixed-length binary encoder and with a Shannon-Fano encoder gives:

Source Symbol    Pi                    Binary Code    Shannon-Fano Code
A0               0.40                  000            0
A1               0.30                  001            10
A2               0.15                  010            110
A3               0.10                  011            1110
A4               0.05                  100            1111
Lavg             (entropy H = 2.0087)  3              2.05

The entropy of the source is

H = Σ pi log2(1/pi)
  = 0.4 log2(1/0.4) + 0.3 log2(1/0.3) + 0.15 log2(1/0.15) + 0.1 log2(1/0.1) + 0.05 log2(1/0.05)
  ≈ 2.0087 bits/symbol

Since we have 5 symbols (5 ≤ 8 = 2^3), we need at least 3 bits to represent each symbol with a
fixed-length binary code. Hence the average length of the binary code is

Lavg = 3 bits/symbol

Thus the efficiency of the binary code is

η = H / Lavg = 2.0087 / 3 ≈ 67%
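
As a quick numerical check of these figures, here is a minimal Python sketch (not part of the original text) that recomputes the entropy and the fixed-length code efficiency from the probabilities above:

```python
import math

# Symbol probabilities from Example 1
probs = [0.4, 0.3, 0.15, 0.1, 0.05]

# Entropy: H = sum(p * log2(1/p)) over all symbols
H = sum(p * math.log2(1 / p) for p in probs)

# A fixed-length binary code needs ceil(log2(number of symbols)) bits per symbol
L_fixed = math.ceil(math.log2(len(probs)))

print(f"H          = {H:.4f} bits/symbol")    # ~2.0087
print(f"L_fixed    = {L_fixed} bits/symbol")  # 3
print(f"efficiency = {H / L_fixed:.2%}")      # ~66.96%
```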

Shannon-Fano coding is a top-down approach: the symbols are listed in order of decreasing
probability, split into two groups whose total probabilities are as nearly equal as possible,
one group is assigned the bit 0 and the other the bit 1, and the procedure is repeated within
each group until every group contains a single symbol. Constructing the code tree for this
source gives the following splits:

{A0} vs {A1, A2, A3, A4}   (0.4 vs 0.6)    →  A0 = 0
{A1} vs {A2, A3, A4}       (0.3 vs 0.3)    →  A1 = 10
{A2} vs {A3, A4}           (0.15 vs 0.15)  →  A2 = 110
{A3} vs {A4}               (0.1 vs 0.05)   →  A3 = 1110, A4 = 1111

The average length of the Shannon-Fano code is

Lavg = Σ pi·li = 0.4(1) + 0.3(2) + 0.15(3) + 0.1(4) + 0.05(4) = 2.05 bits/symbol

Thus the efficiency of the Shannon-Fano code is

η = H / Lavg = 2.0087 / 2.05 ≈ 98%

This example shows that the Shannon-Fano encoder is considerably more efficient than the
fixed-length binary encoder (roughly 98% versus 67%).
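
To make the top-down construction concrete, the following Python sketch implements the usual Shannon-Fano splitting rule; it is an illustrative implementation rather than code from the original text, and the function name and split criterion are assumptions. For the probabilities of Example 1 it reproduces the codewords and the average length shown in the table above:

```python
def shannon_fano(symbols):
    """symbols: list of (name, probability) pairs sorted by decreasing probability.
    Returns a dict mapping each symbol name to its Shannon-Fano codeword."""
    codes = {name: "" for name, _ in symbols}

    def split(group):
        # Stop when the group contains a single symbol
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Choose the split point that makes the two halves as close as possible
        running, best_i, best_diff = 0.0, 1, float("inf")
        for i in range(1, len(group)):
            running += group[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_diff, best_i = diff, i
        top, bottom = group[:best_i], group[best_i:]
        # Append 0 for the top group, 1 for the bottom group, then recurse
        for name, _ in top:
            codes[name] += "0"
        for name, _ in bottom:
            codes[name] += "1"
        split(top)
        split(bottom)

    split(symbols)
    return codes

source = [("A0", 0.4), ("A1", 0.3), ("A2", 0.15), ("A3", 0.1), ("A4", 0.05)]
codes = shannon_fano(source)
print(codes)  # {'A0': '0', 'A1': '10', 'A2': '110', 'A3': '1110', 'A4': '1111'}

L_avg = sum(p * len(codes[name]) for name, p in source)
print(f"{L_avg:.2f}")  # 2.05
```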

Example 2:
The source of information A generates the symbols {A0, A1, A2, A3} with the
corresponding probabilities {0.4, 0.2, 0.2, 0.2}. Encoding the source symbols with a
fixed-length binary encoder and with a Shannon-Fano encoder gives:

Source Symbol    Pi                   Binary Code    Sh-F Code I    Sh-F Code II
A0               0.4                  00             0              00
A1               0.2                  01             10             01
A2               0.2                  10             110            10
A3               0.2                  11             111            11
Lavg             (entropy H = 1.922)  2              2              2

The entropy of the source is

H = Σ pi log2(1/pi) = 0.4 log2(1/0.4) + 3 × 0.2 log2(1/0.2) ≈ 1.922 bits/symbol

Since we have 4 symbols (4 = 2^2), we need at least 2 bits to represent each symbol with a
fixed-length binary code. Hence the average length of the binary code is

Lavg = 2 bits/symbol

Thus the efficiency of the binary code is

η = H / Lavg = 1.922 / 2 ≈ 96.1%

We can construct the Shannon-Fano code tree in several ways, depending on how the symbols are
split at each step. This ambiguity arises when the symbols cannot be divided into two groups of
exactly equal probability, so more than one split is equally good. Two possible codes are shown
in the table above: Code I splits {A0} from {A1, A2, A3} at the first step, while Code II splits
{A0, A1} from {A2, A3}. Both choices lead to the same average length.
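
As a small illustrative check (again, not taken from the original text), the snippet below computes the average length of both candidate codes listed in the table and confirms that the two split choices perform equally well for this source:

```python
probs = {"A0": 0.4, "A1": 0.2, "A2": 0.2, "A3": 0.2}

# Codes I and II as listed in the table above
code_I  = {"A0": "0",  "A1": "10", "A2": "110", "A3": "111"}
code_II = {"A0": "00", "A1": "01", "A2": "10",  "A3": "11"}

for name, code in (("Sh-F Code I", code_I), ("Sh-F Code II", code_II)):
    L_avg = sum(p * len(code[s]) for s, p in probs.items())
    print(f"{name}: Lavg = {L_avg:.2f} bits/symbol")  # both give 2.00
```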
