

Information Theory and Coding (EC 632)

ASSIGNMENT 1

Faculty: Shashi Kumar D Date: 20/11/2017

Instructions:
Students should not copy from peers
Students must submit assignment before the deadline
Submission is only through LMS

1. Define information. Explain its properties and the units used to measure it.
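The defining properties asked about in Q1 can be checked numerically. A minimal sketch (not part of the required written answer): self-information I(p) = -log2(p) in bits, which is zero for a certain event, grows as the event becomes less probable, and is additive over independent events.

```python
import math

# Self-information in bits: I(p) = -log2(p)
I = lambda p: -math.log2(p)

# Property 1: a certain event carries no information.
# Property 2: rarer events carry more information.
# Property 3: information from independent events adds,
#             since P(A and B) = P(A) * P(B) for independent A, B.
print(I(1.0), I(0.5), I(0.25), I(0.25 * 0.5))
```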
2. Consider a binary system emitting two symbols {0,1} with probabilities 0.6 and 0.4, respectively. Find the
information conveyed by symbol ‘0’ and symbol ‘1’.
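A quick numeric check of Q2, using I(s) = -log2 P(s) in bits:

```python
import math

# Binary source with P(0) = 0.6, P(1) = 0.4
p = {"0": 0.6, "1": 0.4}
info = {s: -math.log2(ps) for s, ps in p.items()}  # self-information in bits
# The less probable symbol '1' conveys more information than '0'.
print(info)
```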
3. Consider a source emitting two symbols with probabilities 0.7 and 0.3, respectively.
Find the self-information conveyed by each symbol in (a) bits, (b) decits and (c) nats.
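The three units in Q3 differ only in the logarithm base (base 2 for bits, base 10 for decits, base e for nats), so the values are fixed multiples of one another. A sketch of the conversion:

```python
import math

# Self-information of symbols with P = 0.7 and 0.3 in three units.
# 1 bit = ln(2) nats = log10(2) decits.
probs = [0.7, 0.3]
bits   = [-math.log2(p) for p in probs]   # base-2 logarithm
nats   = [-math.log(p) for p in probs]    # natural logarithm
decits = [-math.log10(p) for p in probs]  # base-10 logarithm
print(bits, nats, decits)
```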
4. Derive an expression for average information content (Entropy) of a zero memory source with suitable
assumptions.
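For reference, the expression the derivation in Q4 should arrive at (the standard zero-memory-source result, stated here without the intermediate steps):

```latex
H(S) = \sum_{i=1}^{q} p_i \log_2 \frac{1}{p_i} \quad \text{bits/symbol},
\qquad p_i = P(s_i), \quad \sum_{i=1}^{q} p_i = 1
```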
5. A pair of dice is rolled simultaneously. The outcome of the first die is denoted 'X' and that of the
second 'Y'. Three events are defined as
P = {(X,Y) such that (2X+Y) is exactly divisible by 3}
Q = {(X,Y) such that (X²+Y²) is an even number}
R = {(X,Y) such that X = Y}
Which event conveys the most information? Justify your answer with a mathematical calculation.
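For Q5, each of the 36 (X, Y) pairs is equally likely, so each event's probability can be found by counting, and the rarest event conveys the most information. A brute-force check:

```python
from itertools import product
import math

# All 36 equally likely outcomes of rolling two dice
outcomes = list(product(range(1, 7), repeat=2))

P = [(x, y) for x, y in outcomes if (2 * x + y) % 3 == 0]      # 12 outcomes
Q = [(x, y) for x, y in outcomes if (x * x + y * y) % 2 == 0]  # 18 outcomes
R = [(x, y) for x, y in outcomes if x == y]                    # 6 outcomes

info = {name: -math.log2(len(ev) / 36)
        for name, ev in [("P", P), ("Q", Q), ("R", R)]}
# R is the rarest event, so it conveys the most information.
print(info)
```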
6. A discrete memoryless source emits five symbols every 2 ms. The symbol probabilities are {0.5, 0.25,
0.125, 0.0625, 0.0625}. Find the average information rate of the source.
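For Q6, the information rate is the entropy in bits/symbol multiplied by the symbol rate. A numeric check:

```python
import math

probs = [0.5, 0.25, 0.125, 0.0625, 0.0625]
H = sum(p * math.log2(1 / p) for p in probs)  # entropy, bits/symbol
symbol_rate = 5 / 2e-3                        # 5 symbols every 2 ms -> sym/s
R = H * symbol_rate                           # information rate, bits/s
print(H, symbol_rate, R)
```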
7. A discrete source emits one of five symbols every 1 s. The symbol probabilities are (0.3, 0.2,
0.1, 0.3, 0.1). Find the average information content of the source in nats/sym and decits/sym.
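Q7 only asks for the entropy of Q6's formula in different units; converting from bits uses 1 bit = ln(2) nats = log10(2) decits. A sketch:

```python
import math

probs = [0.3, 0.2, 0.1, 0.3, 0.1]
H_bits = sum(p * math.log2(1 / p) for p in probs)  # bits/symbol
H_nats = H_bits * math.log(2)      # 1 bit = ln(2) nats
H_decits = H_bits * math.log10(2)  # 1 bit = log10(2) decits
print(H_bits, H_nats, H_decits)
```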
8. The output of an information source consists of 160 symbols, 128 of which occur with a probability of
1/256 each and the remaining 32 with a probability of 1/64 each. Find the average information rate of the
source if the source emits 10000 sym/s.
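The probability for the first group of 128 symbols is fixed by normalization: 32 × (1/64) = 1/2, so the remaining 1/2 spread over 128 symbols gives 1/256 each (assumed here, since the sheet's printed value is illegible). With that, a numeric check of Q8:

```python
import math

# 128 symbols at 1/256 each (assumed, so probabilities sum to 1)
# and 32 symbols at 1/64 each
probs = [1 / 256] * 128 + [1 / 64] * 32
assert abs(sum(probs) - 1.0) < 1e-12

H = sum(p * math.log2(1 / p) for p in probs)  # entropy, bits/symbol
R = H * 10_000                                # information rate, bits/s
print(H, R)
```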
9. Explain the properties of entropy and deduce the equation that gives the upper bound of entropy.
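The bound Q9 asks for is H ≤ log2(M) for an M-symbol source, with equality only for equiprobable symbols. A numeric illustration (a check, not the requested proof):

```python
import math

def entropy(probs):
    """Entropy in bits/symbol; terms with p = 0 contribute nothing."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

M = 4
uniform = [1 / M] * M          # equiprobable: entropy hits the bound log2(M)
skewed = [0.7, 0.1, 0.1, 0.1]  # any other distribution falls below it
print(entropy(uniform), entropy(skewed), math.log2(M))
```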
10. The international Morse code uses sequences of dots and dashes to transmit the letters of the English
alphabet. A dash is represented by a current pulse of duration 2 ms and a dot by one of duration 1 ms. The
probability of a dash is half that of a dot. A gap of 1 ms is left between symbols.
Calculate
(a) Self-information of a dot and a dash.
(b) Average-information content of a dot and a dash.
(c) Average rate of information.
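For Q10, P(dash) = P(dot)/2 together with P(dot) + P(dash) = 1 gives P(dot) = 2/3 and P(dash) = 1/3; the rate then divides the entropy by the average symbol duration, including the 1 ms gap. A numeric check of all three parts:

```python
import math

p_dot, p_dash = 2 / 3, 1 / 3           # P(dash) = P(dot)/2, probabilities sum to 1

I_dot = -math.log2(p_dot)              # (a) self-information of a dot, bits
I_dash = -math.log2(p_dash)            # (a) self-information of a dash, bits

H = p_dot * I_dot + p_dash * I_dash    # (b) average information, bits/symbol

# (c) each symbol (1 ms dot or 2 ms dash) is followed by a 1 ms gap
T_avg = p_dot * (1e-3 + 1e-3) + p_dash * (2e-3 + 1e-3)  # average symbol time, s
R = H / T_avg                          # average information rate, bits/s
print(I_dot, I_dash, H, R)
```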
