Sample Question Paper 1


EE 532 / Mid-sem (35% Weightage) / September 17, 2021 / Marks: 70, Time: 2 Hrs

☞ Some questions comprise sub-questions; please answer them together.

☞ Please index your answer script and fill in the table provided on its front page.

1. Let $X_i^j = (X_i, \ldots, X_j)$.

   (a) i. Show that $I(X_1^n; Y_1^n) \ge \sum_{i=1}^{n} I(X_i; Y_i)$ when $X_1, X_2, \ldots, X_n$ are mutually independent. 2
       ii. Show that $I(X_1^n; Y_1^n) \le \sum_{i=1}^{n} I(X_i; Y_i)$ when, given $Y_i$, $X_i$ becomes conditionally independent of the remaining $X_1, \ldots, X_n$. 3

   (b) Show that $\sum_{i=1}^{n} I\big(X_i; Y_{i+1}^n \mid X_1^{i-1}\big) = \sum_{i=1}^{n} I\big(Y_i; X_1^{i-1} \mid Y_{i+1}^n\big)$. 5
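The identity in (b) is the Csiszár sum identity. As a sanity check (not a proof), it can be verified numerically for an arbitrary joint distribution; the sketch below assumes $n = 3$ binary $(X_i, Y_i)$ pairs and a random joint pmf:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
# Random joint pmf over (X1, X2, X3, Y1, Y2, Y3), all binary:
# axes 0..2 hold the X's, axes 3..5 the Y's.
p = rng.random((2,) * (2 * n))
p /= p.sum()

def H(axes):
    """Joint entropy (bits) of the variables on the given set of axes."""
    if not axes:
        return 0.0
    m = p.sum(axis=tuple(i for i in range(2 * n) if i not in axes))
    m = m[m > 0]
    return float(-(m * np.log2(m)).sum())

def cmi(a, b, c):
    """I(A; B | C) via the identity H(A,C) + H(B,C) - H(A,B,C) - H(C)."""
    if not a or not b:
        return 0.0
    return H(a | c) + H(b | c) - H(a | b | c) - H(c)

X = lambda i: i - 1        # array axis holding X_i
Y = lambda i: n + i - 1    # array axis holding Y_i

lhs = sum(cmi({X(i)}, {Y(j) for j in range(i + 1, n + 1)},
              {X(j) for j in range(1, i)}) for i in range(1, n + 1))
rhs = sum(cmi({Y(i)}, {X(j) for j in range(1, i)},
              {Y(j) for j in range(i + 1, n + 1)}) for i in range(1, n + 1))
print(abs(lhs - rhs) < 1e-10)   # True: the two sums agree
```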

2. Albert and Bernard just became friends with Cheryl, and they want to know when her
birthday is. Cheryl gives them a list of 10 possible dates for her birthday: May 15, 16,
19; June 17, 18; July 14, 16; and August 14, 15, 17. Cheryl then tells only Albert the
month of her birthday, and tells only Bernard the day of her birthday. (And Albert
and Bernard are aware that she did so.) Albert and Bernard now have the following
conversation:

C1 → Albert: "I don’t know when Cheryl’s birthday is, but I know that you don’t know
either."
C2 → Bernard: "At first I didn’t know when Cheryl’s birthday is, but now I know."
C3 → Albert: "Now I also know when Cheryl’s birthday is."
(a) What is the entropy at the outset of the conversation? 3
(b) What is the entropy after statement C1? 3
(c) What is the entropy after statement C2? 3
(d) What is the entropy after statement C3? 3
(e) Show that Cheryl’s birthday is on July 16th. 3
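Assuming a uniform prior over the 10 listed dates, the elimination implied by each statement (and the resulting entropies) can be checked with a short script; this is a sanity check, not the required derivation:

```python
from math import log2
from collections import Counter

# The 10 candidate (month, day) pairs from the question.
dates = [("May", 15), ("May", 16), ("May", 19),
         ("June", 17), ("June", 18),
         ("July", 14), ("July", 16),
         ("August", 14), ("August", 15), ("August", 17)]

def entropy(cands):
    """Entropy of a uniform distribution over the remaining candidates."""
    return log2(len(cands)) if cands else 0.0

print(f"(a) outset: H = {entropy(dates):.3f} bits")        # ≈ log2(10)

# C1: Albert knows Bernard doesn't know, so his month contains no day
# that is unique in the full list (this rules out May and June).
day_count = Counter(d for _, d in dates)
bad_months = {m for m, d in dates if day_count[d] == 1}
after_c1 = [(m, d) for m, d in dates if m not in bad_months]
print(f"(b) after C1: H = {entropy(after_c1):.3f} bits")   # ≈ log2(5)

# C2: Bernard now knows, so his day is unique among the survivors.
day_count = Counter(d for _, d in after_c1)
after_c2 = [(m, d) for m, d in after_c1 if day_count[d] == 1]
print(f"(c) after C2: H = {entropy(after_c2):.3f} bits")   # ≈ log2(3)

# C3: Albert now knows, so his month has exactly one surviving date.
month_count = Counter(m for m, _ in after_c2)
after_c3 = [(m, d) for m, d in after_c2 if month_count[m] == 1]
print(f"(d) after C3: H = {entropy(after_c3):.3f} bits")   # 0
print(f"(e) birthday: {after_c3}")                         # [('July', 16)]
```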
 
3. Consider a channel with transition matrix
   $$P_{Y|X} = \begin{pmatrix} 1-\lambda_1 & \lambda_1 & 0 \\ 0 & 1-\lambda_2 & \lambda_2 \\ \lambda_3 & 0 & 1-\lambda_3 \end{pmatrix}$$
(a) Find the channel capacity when λ1 = λ2 = λ3 = 0.1. 3
(b) Find the channel capacity when λ1 = 0.1, λ2 = 0.2, λ3 = 0.3. 3
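Both parts can be checked numerically with the Blahut–Arimoto algorithm, which iterates toward the capacity-achieving input distribution of a DMC. The sketch below sets the λ values for part (a); for part (b), change the assignments as indicated:

```python
import numpy as np

def blahut_arimoto(P, iters=2000):
    """Capacity (bits) of a DMC with row-stochastic P[x, y] = P(Y=y | X=x)."""
    p = np.full(P.shape[0], 1.0 / P.shape[0])      # start from a uniform input
    for _ in range(iters):
        q = p @ P                                  # induced output distribution
        # d[x] = D( P(.|x) || q ), the per-input relative entropy
        d = (P * np.log2(P / q, where=P > 0, out=np.zeros_like(P))).sum(axis=1)
        p = p * np.exp2(d)                         # multiplicative update
        p /= p.sum()
    q = p @ P
    d = (P * np.log2(P / q, where=P > 0, out=np.zeros_like(P))).sum(axis=1)
    return float(p @ d)

l1, l2, l3 = 0.1, 0.1, 0.1                         # part (a); use 0.1, 0.2, 0.3 for (b)
P = np.array([[1 - l1, l1, 0.0],
              [0.0, 1 - l2, l2],
              [l3, 0.0, 1 - l3]])
cap = blahut_arimoto(P)
print(cap)   # part (a) is symmetric, so cap = log2(3) - H_b(0.1) ≈ 1.116 bits
```

For part (a) the channel is symmetric (each row is a permutation of the others), so the uniform input is optimal and the iteration converges immediately.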

4. Let X, Y1 and Y2 be binary RVs. It is given that I(X; Y1) = 0 and I(X; Y2) = 0. Verify 5
whether I(X; Y1, Y2) = 0.
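One candidate joint distribution worth examining (an assumed example, not the required argument) is Y1, Y2 i.i.d. uniform bits with X = Y1 ⊕ Y2; the script below computes the three mutual informations from the joint pmf:

```python
import numpy as np

def mutual_info(pxy):
    """I(X; Y) in bits from a joint pmf array pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])))

# Assumed example: Y1, Y2 i.i.d. uniform bits and X = Y1 XOR Y2.
p = np.zeros((2, 2, 2))            # p[x, y1, y2]
for y1 in (0, 1):
    for y2 in (0, 1):
        p[y1 ^ y2, y1, y2] = 0.25

p_x_y1 = p.sum(axis=2)             # joint pmf of (X, Y1)
p_x_y2 = p.sum(axis=1)             # joint pmf of (X, Y2)
p_x_y12 = p.reshape(2, 4)          # joint pmf of (X, (Y1, Y2))
print(mutual_info(p_x_y1), mutual_info(p_x_y2), mutual_info(p_x_y12))
# → 0.0 0.0 1.0 : pairwise independence does not imply joint independence
```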
5. Let $X_1, X_2, \ldots, X_{100}$ be i.i.d. binary RVs passed through a BSC with crossover probability 5
$\lambda = 0.1$ to get $Y_1, Y_2, \ldots, Y_{100}$. Let $A_\epsilon^{(n)}$ be the joint typical set of $(X^n, Y^n)$. Suppose for two
independent sequences $\tilde{X}^n$ and $\tilde{Y}^n$, $P\big((\tilde{X}^n, \tilde{Y}^n) \in A_\epsilon^{(n)}\big) \le 0.001$. Find the lower bound on
$|A_\epsilon^{(n)}|$.
6. A source S has 4 symbols with probability mass function (PMF) {1/3, 1/3, 2/9, 1/9}.
(a) Find a binary Huffman code C 1 for S. 3
(b) Is it possible to find another Huffman code C2 different from C1 for S? If it is 3
possible, propose C2; otherwise state why it is not possible.
(c) Does there exist any non-Huffman code C 3 with the same average length as that of 2
C 1 for S?
(d) Find a ternary Huffman code C 4 for S. 3
(e) Find a Shannon-Fano-Elias code C 5 for S. 3
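A standard heap-based Huffman construction gives one valid C1 for part (a) along with its average length; this is a sketch, and the symbol names s1..s4 are chosen here for illustration:

```python
import heapq
from fractions import Fraction

def huffman(pmf):
    """Binary Huffman code for pmf = {symbol: probability}; returns {symbol: codeword}."""
    # Heap entries carry a tie-breaking counter so probabilities-only are compared.
    heap = [(p, i, [s]) for i, (s, p) in enumerate(pmf.items())]
    heapq.heapify(heap)
    code = {s: "" for s in pmf}
    count = len(heap)
    while len(heap) > 1:
        p0, _, group0 = heapq.heappop(heap)   # two least-probable nodes
        p1, _, group1 = heapq.heappop(heap)
        for s in group0:
            code[s] = "0" + code[s]           # prepend bit as we climb the tree
        for s in group1:
            code[s] = "1" + code[s]
        count += 1
        heapq.heappush(heap, (p0 + p1, count, group0 + group1))
    return code

pmf = {"s1": Fraction(1, 3), "s2": Fraction(1, 3),
       "s3": Fraction(2, 9), "s4": Fraction(1, 9)}
c1 = huffman(pmf)
avg = sum(p * len(c1[s]) for s, p in pmf.items())
print(c1, "average length =", avg)   # average length = 2 bits
```

Because the merge at the second step involves a three-way tie among 1/3-probability nodes, breaking the tie differently yields a structurally different Huffman tree with the same average length, which is the point of parts (b) and (c).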

7. A uniformly distributed source with 100 symbols has to be represented with binary 5
alphabets. Let $l_i$ be the length of the binary string for the $i$-th symbol. Propose an
instantaneous coding technique so that the value of $\sum_{i=1}^{100} 2^{-l_i}$ is maximised.
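By Kraft's inequality every instantaneous code satisfies $\sum_i 2^{-l_i} \le 1$, with equality exactly for a complete code. The sketch below assumes one such split for 100 symbols, 28 codewords of length 6 plus 72 of length 7 (since 28/2^6 + 72/2^7 = 1), builds a canonical code, and checks it is prefix-free:

```python
from fractions import Fraction

# Assumed complete-code split: 28 words of length 6 and 72 of length 7.
lengths = [6] * 28 + [7] * 72
kraft = sum(Fraction(1, 2 ** l) for l in lengths)
print(kraft)   # 1, the maximum allowed by Kraft's inequality

# Canonical construction: assign codeword values in order of length.
code, value, prev_len = [], 0, lengths[0]
for l in lengths:
    value <<= (l - prev_len)              # extend to the new length
    code.append(format(value, f"0{l}b"))  # fixed-width binary string
    value += 1
    prev_len = l

# Prefix-free check: no codeword is a prefix of another.
ok = all(not b.startswith(a) for a in code for b in code if a != b)
print(len(code), ok)   # 100 True
```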
8. Let X be an RV taking values from the set {a, b, c} with PMF {1/4, 1/2, 1/4}. Y is a 5
uniformly distributed RV taking values from {0, 1, 2}. Represent X using Y.

9. Consider a discrete memoryless channel (DMC) Y = XZ, where X and Z are independent 5
binary RVs taking the values 1 and −1, with P(Z = 1) = α. Find the capacity of this
channel by maximising the value of I(X; Y).
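Conditioned on X, multiplying by Z flips the sign of Y with probability 1 − α, so the channel behaves as a BSC and one expects C = 1 − H_b(α). A grid search over input distributions (α = 0.9 below is an arbitrary illustrative value) agrees with that expectation:

```python
import numpy as np

def hb(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

alpha = 0.9          # assumed illustrative value of P(Z = 1)
eps = 1 - alpha      # effective BSC crossover probability

best = 0.0
for p in np.linspace(0.001, 0.999, 999):    # p = P(X = 1)
    py1 = p * (1 - eps) + (1 - p) * eps     # induced P(Y = 1)
    mi = hb(py1) - hb(eps)                  # I(X; Y) = H(Y) - H(Y|X)
    best = max(best, mi)

print(best, 1 - hb(eps))   # maximum at p = 1/2: C = 1 - H_b(alpha) ≈ 0.531 bits
```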

