ECT305: Analog and Digital Communication
Module 2, Part 3
Dr. Susan Dominic
Assistant Professor
Dept. of ECE
RSET
• An information source generates a finite sequence of symbols or letters called a message, denoted by s_1, s_2, …, s_m, where m is the total number of symbols in the message
• The set of all possible symbols is called the source alphabet and is denoted by S = {s_1, s_2, …, s_m}
• It is assumed that each symbol s_i of the source alphabet S has a certain probability p_i of occurring, or being sent: P(s_i) = p_i, where 0 ≤ p_i ≤ 1 for i = 1, 2, …, m and Σ_i p_i = 1
• These probabilities are assumed to remain constant over time, and hence we say the source is stationary
• In addition, we assume that each symbol is sent independently of all previous symbols, so we also say the source is memoryless (a small sampling sketch follows below)
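A minimal sketch of what "stationary and memoryless" means operationally, in Python, with an assumed three-symbol alphabet and made-up probabilities (not from the notes): every symbol of the message is drawn from the same fixed distribution, independently of all previous draws.

```python
import random

# Assumed example alphabet and probabilities (illustration only).
alphabet = ["s1", "s2", "s3"]
probabilities = [0.5, 0.3, 0.2]                # fixed over time -> stationary

assert abs(sum(probabilities) - 1.0) < 1e-12   # probabilities must sum to 1

# Each symbol is drawn independently of all previous symbols -> memoryless.
message = random.choices(alphabet, weights=probabilities, k=10)
print(message)
```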
If an event A is certain to occur, observing it conveys no information:
lim_{P(A) → 1} I(A) = 0
The amount of information gained by observing the symbol s_k is
I(s_k) = log₂( 1 / P(s_k) ) bits
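As a quick numerical illustration of this formula, the sketch below (Python, with assumed probabilities) evaluates I(s_k) = log₂(1/P(s_k)), showing that rarer symbols carry more information and that I(s_k) approaches 0 as P(s_k) approaches 1.

```python
import math

def self_information(p: float) -> float:
    """I(s_k) = log2(1/p) in bits, for a symbol of probability p."""
    return math.log2(1.0 / p)

# Assumed example probabilities, chosen only for illustration.
for p in (0.5, 0.25, 0.125, 0.999):
    print(f"P(s_k) = {p:<6}  I(s_k) = {self_information(p):.4f} bits")
```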
Consider a discrete memoryless source emitting one of K symbols from a finite source alphabet S = {s_1, s_2, …, s_K}
The entropy of the source, H(S) = Σ_{k=1}^{K} p_k log₂(1/p_k) bits/symbol, has the following properties:
1. H(S) = 0 if and only if the probability p_k = 1 for some k and the remaining probabilities in the set are all zero (no uncertainty).
For a binary memoryless source with symbol probabilities p_0 and p_1 = 1 − p_0:
2. When p_0 = 1, the entropy H(S) = 0.
3. When p_0 = p_1 = 1/2, the entropy H(S) attains its maximum value, H_max = 1 bit/symbol (see the numerical check below).
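These properties can be checked numerically; the sketch below uses a small hypothetical helper entropy() that implements H(S) = Σ_k p_k log₂(1/p_k).

```python
import math

def entropy(probs):
    """H(S) = sum_k p_k * log2(1/p_k) in bits/symbol (terms with p_k = 0 contribute 0)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Property 1: a certain symbol (p_k = 1, rest 0) gives zero entropy.
print(entropy([1.0, 0.0, 0.0]))              # 0.0

# Binary source: H(S) peaks at 1 bit/symbol when p_0 = p_1 = 1/2.
for p0 in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(p0, entropy([p0, 1.0 - p0]))
```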
Entropy of Extended Source
If the source symbols are independent, the probability of a symbol in the extended alphabet S^n is equal to the product of the probabilities of the n corresponding source symbols in S.
Thus, the entropy of the extended source is n times the entropy of the original source: H(S^n) = n H(S).
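The sketch below illustrates this relation for an assumed three-symbol source (the probabilities are chosen only for illustration): its second-order extension has 3² = 9 symbols, and the computed entropies satisfy H(S²) = 2 H(S).

```python
import math
from itertools import product

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Assumed three-symbol source.
p = [0.25, 0.25, 0.5]

# Second-order extension S^2: nine symbol pairs, each with probability
# p_i * p_j because the source is memoryless.
p2 = [pi * pj for pi, pj in product(p, p)]

print(len(p2))                                  # 9 extended symbols
print(entropy(p), entropy(p2))                  # H(S) and H(S^2)
print(abs(entropy(p2) - 2 * entropy(p)) < 1e-9) # H(S^2) = 2 * H(S)
```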
Q 2.7) [Worked example; the original equations were not recovered.] The entropy of the given source, H(S), is first computed in bits/symbol. Considering the second-order extension of the source, the source alphabet of the extended source S² has nine symbols. The entropy of the extended source, computed in bits/symbol, works out to twice that of the original source; thus H(S²) = 2 H(S).
Q 2.8) [Worked example; the original equations were not recovered. The result is an entropy expressed in bits/symbol.]
Differential Entropy
➢ We know that for discrete messages, H(S) = Σ_{k=1}^{q} p_k log₂(1/p_k) bits/symbol
➢ Consider a continuous random variable X with probability density function f_X(x). By analogy with the discrete case, the differential entropy of X is defined as h(X) = ∫_{−∞}^{∞} f_X(x) log₂(1/f_X(x)) dx bits/sample
Consider a random variable X uniformly distributed over the interval (0, a). The probability density function of X is f_X(x) = 1/a for 0 < x < a, and 0 otherwise. The differential entropy is therefore h(X) = ∫_0^a (1/a) log₂(a) dx = log₂ a bits/sample
➢ log₂ a < 0 for a < 1. Thus, this example shows that, unlike a discrete random variable, the differential entropy of a continuous random variable can assume a negative value.
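A small numeric illustration of this sign behaviour, using the closed-form result h(X) = log₂ a for the uniform example above (the values of a are assumptions for illustration):

```python
import math

def h_uniform(a: float) -> float:
    """Differential entropy of X ~ Uniform(0, a): the integral of
    (1/a) * log2(a) over (0, a) collapses to log2(a) bits/sample."""
    return math.log2(a)

for a in (4.0, 2.0, 1.0, 0.5, 0.25):
    print(f"a = {a:<5}  h(X) = {h_uniform(a):+.3f} bits/sample")
# For a < 1, h(X) < 0: a continuous source can have negative differential
# entropy, which never happens for the entropy of a discrete source.
```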
Example: Differential entropy of a Gaussian source
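For a Gaussian source with variance σ², the standard closed-form result is h(X) = ½ log₂(2πeσ²) bits/sample; the derivation is not reproduced here. The sketch below only cross-checks that formula against a Monte Carlo estimate of E[log₂(1/f_X(X))], for a few assumed values of σ.

```python
import math
import random

def h_gaussian(sigma: float) -> float:
    """Closed form: h(X) = 0.5 * log2(2 * pi * e * sigma^2) bits/sample."""
    return 0.5 * math.log2(2.0 * math.pi * math.e * sigma ** 2)

def h_gaussian_mc(sigma: float, n: int = 200_000) -> float:
    """Estimate h(X) = E[log2(1/f_X(X))] by sampling X ~ N(0, sigma^2)."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(0.0, sigma)
        f = math.exp(-x * x / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))
        total += math.log2(1.0 / f)
    return total / n

for sigma in (0.5, 1.0, 2.0):
    print(sigma, h_gaussian(sigma), h_gaussian_mc(sigma))
```

As in the uniform case, the differential entropy of a Gaussian source can also be negative: the closed form drops below zero once 2πeσ² < 1.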