Example 1:
The source of information A generates the symbols {A0, A1, A2, A3 and A4} with the
corresponding probabilities {0.4, 0.3, 0.15, 0.1 and 0.05}. Encoding the source symbols using
a binary encoder and a Shannon-Fano encoder gives:
Since we have 5 symbols (5 < 8 = 2^3), we need at least 3 bits to represent each symbol in binary
(fixed-length code). Hence the average length of the binary code is 3 bits/symbol.
This example demonstrates that the efficiency of the Shannon-Fano encoder is much higher
than that of the binary encoder.
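To make the construction concrete, the sketch below applies a standard recursive Shannon-Fano split (repeatedly dividing the probability-ordered symbols into two groups of as nearly equal total probability as possible) to the probabilities of Example 1. The function, its tie-breaking rule, and the printed comparison are illustrative assumptions rather than the exact table from the notes.

    # Minimal Shannon-Fano sketch for Example 1 (function name and tie-breaking
    # rule are assumptions for illustration, not taken from the notes).
    from math import log2

    def shannon_fano(symbols):
        """symbols: list of (name, probability) pairs sorted by decreasing probability.
        Returns a dict mapping each name to its binary codeword."""
        if len(symbols) == 1:
            return {symbols[0][0]: ""}
        total = sum(p for _, p in symbols)
        # Choose the split point that makes the two groups' probabilities as equal as possible.
        best_i, best_diff, running = 1, float("inf"), 0.0
        for i in range(1, len(symbols)):
            running += symbols[i - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_i, best_diff = i, diff
        # Prefix the upper group with 0 and the lower group with 1, then recurse.
        code = {name: "0" + word for name, word in shannon_fano(symbols[:best_i]).items()}
        code.update({name: "1" + word for name, word in shannon_fano(symbols[best_i:]).items()})
        return code

    probs = {"A0": 0.4, "A1": 0.3, "A2": 0.15, "A3": 0.1, "A4": 0.05}   # Example 1
    code = shannon_fano(sorted(probs.items(), key=lambda kv: -kv[1]))
    H = -sum(p * log2(p) for p in probs.values())                        # source entropy
    L_sf = sum(probs[s] * len(code[s]) for s in probs)                   # average Shannon-Fano length
    print(code)
    print(f"Shannon-Fano: L = {L_sf:.2f} bits/symbol, efficiency = {H / L_sf:.1%}")
    print(f"Fixed-length: L = 3 bits/symbol, efficiency = {H / 3:.1%}")

With these probabilities the split yields codeword lengths of 1, 2, 3, 4 and 4 bits, an average of 2.05 bits/symbol against an entropy of about 2.01 bits/symbol (roughly 98% efficiency), whereas the 3-bit fixed-length code achieves only about 67% efficiency.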
Example 2:
The source of information A generates the symbols {A0, A1, A2 and A3} with the
corresponding probabilities {0.4, 0.2, 0.2 and 0.2}. Encoding the source symbols using a binary
encoder and a Shannon-Fano encoder gives:
Since we have 4 symbols (4 = 2^2), we need at least 2 bits to represent each symbol in binary
(fixed-length code). Hence the average length of the binary code is 2 bits/symbol.
We can construct the code tree for the Shannon-Fano code in several ways, depending on how the
symbols are split into groups. This ambiguity arises when the symbols cannot be divided into two
groups of exactly equal probability, so more than one split can come equally close.
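For the probabilities of Example 2, the short sketch below illustrates this tie; the symbol ordering and the printed values are computed here as an illustration, not taken from the notes' tables.

    # Sketch of the split ambiguity for Example 2 (symbols assumed to be named A0..A3).
    from math import log2

    probs = {"A0": 0.4, "A1": 0.2, "A2": 0.2, "A3": 0.2}
    names = list(probs)                                   # already ordered by decreasing probability

    # Probability imbalance of every possible first split of the ordered symbol list.
    for i in range(1, len(names)):
        left = sum(probs[n] for n in names[:i])
        print(f"{names[:i]} | {names[i:]}  imbalance = {abs(2 * left - 1):.1f}")
    # {A0} | {A1, A2, A3} and {A0, A1} | {A2, A3} both give imbalance 0.2,
    # so either one is a valid first split and each leads to a different code tree.

    H = -sum(p * log2(p) for p in probs.values())
    print(f"source entropy H = {H:.3f} bits/symbol")      # about 1.922 bits/symbol

Either of the two tied splits can be taken as the first step, and each choice grows into a different code tree; comparing the average length of each resulting code with the entropy printed above gives its efficiency.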
Two possible codes are discussed. One possible code is