Lec.1n - COMM 552 Information Theory and Coding
Lecture – 1
Introduction
Source definition
Analog: emits a continuous-amplitude, continuous-time electrical waveform.
Discrete: emits a sequence of letters or symbols.
Measuring information:
The information content of a message m_k with probability of occurrence p_k is

$I(m_k) = \log \frac{1}{p_k} = -\log p_k$    (1)

When two independent messages are received, the total information content is the sum of the information conveyed by each of the messages. Thus, we have

$I(m_1 \text{ and } m_2) = I(m_1) + I(m_2)$    (2)
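As a quick numerical check of (1) and (2), the minimal sketch below (illustrative, not part of the original lecture) computes self-information in bits and verifies that the information of two independent messages adds:

```python
import math

def self_information(p, base=2):
    """Self-information I = log(1/p) = -log(p); bits for base 2."""
    return -math.log(p, base)

p1, p2 = 0.5, 0.25          # probabilities of two independent messages
i1 = self_information(p1)   # 1 bit
i2 = self_information(p2)   # 2 bits
# Independent messages: the joint probability is the product p1*p2,
# so the joint information equals the sum i1 + i2.
i_joint = self_information(p1 * p2)
print(i1, i2, i_joint)      # 1.0 2.0 3.0
```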
Example 2:
A source puts out one of five possible messages m1, m2, m3, m4, m5 during each message interval. The probabilities of these messages are
p1 = 1/2; p2 = 1/4; p3 = 1/8; p4 = 1/16; p5 = 1/16.
Calculate the information content of these messages:
a. In bit units
b. In Hartley units
Solution 2:

$I(m_k) = \log \frac{1}{p_k} = -\log p_k$

a. In bits (base-2 logarithm):
$I(m_1) = \log_2 2 = 1$ bit; $I(m_2) = \log_2 4 = 2$ bits; $I(m_3) = \log_2 8 = 3$ bits; $I(m_4) = I(m_5) = \log_2 16 = 4$ bits.

b. In Hartleys (base-10 logarithm):
$I(m_1) \approx 0.301$; $I(m_2) \approx 0.602$; $I(m_3) \approx 0.903$; $I(m_4) = I(m_5) \approx 1.204$ Hartleys.
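A quick numerical check of Solution 2 (illustrative sketch; probabilities taken from Example 2):

```python
import math

probs = [1/2, 1/4, 1/8, 1/16, 1/16]   # p1 ... p5 from Example 2

for k, p in enumerate(probs, start=1):
    bits = -math.log2(p)          # information content in bits
    hartleys = -math.log10(p)     # information content in Hartleys
    print(f"I(m{k}) = {bits:.0f} bits = {hartleys:.3f} Hartleys")
```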
Consider next the average information content of a source that emits one of M symbols s1, s2, ..., sM with probabilities p1, p2, ..., pM. In a long sequence of N emitted symbols:
s1 will occur Np1 times
s2 will occur Np2 times
...
si will occur Npi times
The information content of the ith symbol is:

$I(s_i) = \log_2 \frac{1}{p_i}$ bits

so the Npi occurrences of si contribute $N p_i \log_2 \frac{1}{p_i}$ bits in total. Summing over all M symbols and dividing by N gives the average information per symbol, the source entropy:

$H = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i}$ bits per symbol
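The entropy formula translates directly into code; the helper below is a minimal sketch (the function name entropy is mine), checked against the five-message source of Example 2:

```python
import math

def entropy(probs):
    """Source entropy H = sum_i p_i * log2(1/p_i), in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

# The five-message source of Example 2:
print(entropy([1/2, 1/4, 1/8, 1/16, 1/16]))  # 1.875 bits per symbol
```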
Solution 3:
Source entropy:

$H = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i}$ bits per symbol
For a binary source with symbol probabilities p and 1 − p:
At p = 0, H = 0 and at p = 1, H = 0.
The entropy reaches its maximum for equiprobable symbols, i.e. at p = 1/2:

$H = \frac{1}{2} \log_2 2 + \frac{1}{2} \log_2 2 = 1$ bit per symbol

Check of the result against the maximum-entropy value $H_{max} = \log_2 M$ for M equiprobable symbols:

$H_{max} = \log_2 2 = 1$ bit per symbol
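A minimal sketch (not from the lecture) of the binary entropy function, confirming H = 0 at p = 0 and p = 1 and the maximum of 1 bit at p = 1/2:

```python
import math

def binary_entropy(p):
    """H(p) = p*log2(1/p) + (1-p)*log2(1/(1-p)); H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 1.0):
    print(f"H({p}) = {binary_entropy(p):.3f} bits")
# The maximum, 1 bit, occurs at p = 0.5 (equiprobable symbols).
```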
Information rate:
If the source is emitting symbols at a fixed rate of $r_s$ symbols/sec, the average source information rate R is defined as:

$R = r_s \cdot H$ bits/sec
Example 4:
Consider a discrete memoryless source with source alphabet A = {s0, s1, s2} and respective probabilities p0 = 1/4, p1 = 1/4, p2 = 1/2. Find:
a. The entropy of the source.
b. The information rate if the source emits symbols at a fixed rate of 1 symbol/sec.
Solution 4:
a. The source entropy:

$H = \sum_{i=1}^{M} p_i \log_2 \frac{1}{p_i} = \frac{1}{4}\log_2 4 + \frac{1}{4}\log_2 4 + \frac{1}{2}\log_2 2 = 0.5 + 0.5 + 0.5 = 1.5$ bits per symbol

b. The information rate:

$R = r_s \cdot H = 1 \times 1.5 = 1.5$ bits/sec
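Reusing the entropy helper from the sketch above, a quick check of Solution 4 (illustrative):

```python
import math

def entropy(probs):
    """H = sum_i p_i * log2(1/p_i), in bits per symbol."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

H = entropy([1/4, 1/4, 1/2])   # probabilities from Example 4
rs = 1                          # symbol rate, symbols/sec
print(H, rs * H)                # 1.5 bits/symbol, 1.5 bits/sec
```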
Solution 5.a:

$R = r_s \cdot H = 2B \times H = 2 \times 5 \times 2 = 20$ bits/sec

where $r_s = 2B$ is the Nyquist sampling rate.
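The statement of Example 5 is not shown in this section; reading B = 5 and H = 2 bits per symbol off the solution line, with the source sampled at the Nyquist rate $r_s = 2B$, a minimal check is:

```python
B = 5        # bandwidth, read off the solution line (units as in the example)
rs = 2 * B   # Nyquist sampling rate: 2B samples/sec
H = 2        # entropy per emitted symbol, bits (from the solution line)
R = rs * H
print(R)     # 20 bits/sec
```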