Information Theory and Coding Sample Question 2021
𝑠2(𝑡) = 𝑝(𝑡 − 𝑇/2), 0 ≤ 𝑡 ≤ 𝑇    and    𝑠4(𝑡) = − sin 𝜔0𝑡, 0 ≤ 𝑡 ≤ 𝑇
“The distance property of orthogonal signals is good compared to that of antipodal signals” - justify the statement considering the above signals.
4. Define the cross-correlation coefficient of waveform coding. Write the cross-correlation coefficient values of the orthogonal, biorthogonal and transorthogonal codes.
5. Compute the orthogonal, biorthogonal and transorthogonal codeword sets for 2-, 3- and 4-bit data sets.
6. Mention the advantage of the biorthogonal code over the orthogonal code.
7. Construct a Hadamard matrix for a 3-bit data set.
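A short Python sketch (an illustrative helper, not part of the question set) of the Sylvester construction behind this exercise: three doublings of H1 give the 8 x 8 matrix for a 3-bit data set, and re-mapping +1 to 0 and -1 to 1 yields the orthogonal codeword set.

import numpy as np

def hadamard(n_bits):
    """Sylvester recursion H_2n = [[H_n, H_n], [H_n, -H_n]], from H_1 = [1]."""
    H = np.array([[1]])
    for _ in range(n_bits):
        H = np.block([[H, H], [H, -H]])
    return H

H8 = hadamard(3)          # 8 x 8; every pair of rows is orthogonal
print(H8)
print((1 - H8) // 2)      # +1 -> 0, -1 -> 1: the orthogonal codeword set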
Topic-3: Parity-Check
1. Define even parity, odd parity, parity code and rectangular code.
6. Draw a decoder circuit for the (6, 3) block code with the following parity array:
𝑃 = [1 1 0; 0 1 1; 1 0 1]
7. Define the standard array and the syndrome look-up table of a linear block code.
8. Consider a (7, 4) code whose parity array is
𝑃 = [1 1 1; 1 0 1; 0 1 1; 1 1 0]
i. Find all the codewords of the code.
ii. Find the parity-check matrix of the code.
iii. Compute the syndrome for the received vector 1 1 0 1 1 0 1. Is this a valid code vector?
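A minimal numeric sketch for this question, assuming one common systematic convention (used e.g. in Sklar): G = [P | I_k] and H = [I_(n-k) | P^T]; the helper names are mine, not the textbook's.

import numpy as np
from itertools import product

P = np.array([[1, 1, 1],
              [1, 0, 1],
              [0, 1, 1],
              [1, 1, 0]])
k, m = P.shape                              # k = 4 message bits, m = n - k = 3
G = np.hstack([P, np.eye(k, dtype=int)])    # generator matrix G = [P | I_k]
H = np.hstack([np.eye(m, dtype=int), P.T])  # parity-check matrix H = [I_m | P^T]

for msg in product([0, 1], repeat=k):       # (i) all 16 codewords
    print(msg, (np.array(msg) @ G) % 2)

r = np.array([1, 1, 0, 1, 1, 0, 1])         # (iii) received vector
s = (H @ r) % 2                             # syndrome s = r H^T
print("syndrome:", s, "-> valid codeword" if not s.any() else "-> not a codeword")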
9. Compute the syndrome for the received vector 0 0 1 1 1 0. If this is not a valid codeword, then calculate the correct vector using the following syndrome look-up table. Assume that the parity array of the encoder is P.
6. Using a feedback shift register, encode the message vector 𝑚 = 1 0 1 1 into a (7, 4) codeword using the generator polynomial 𝑔(𝑋) = 1 + 𝑋² + 𝑋³.
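A behavioral sketch of what the feedback shift register computes, assuming the leftmost bit of 𝑚 is the highest-degree coefficient and a message-then-parity codeword layout (both are conventions; other texts order the bits the other way).

def gf2_divmod(dividend, divisor):
    """Long division over GF(2); bit lists are written high-order first,
    so [1, 1, 0, 1] is X^3 + X^2 + 1. Returns (quotient, remainder)."""
    r = list(dividend)
    q = [0] * (len(dividend) - len(divisor) + 1)
    for i in range(len(q)):
        if r[i]:                      # leading coefficient set: subtract g
            q[i] = 1
            for j, d in enumerate(divisor):
                r[i + j] ^= d
    return q, r[-(len(divisor) - 1):]

g = [1, 1, 0, 1]                      # g(X) = 1 + X^2 + X^3, high-order first
m = [1, 0, 1, 1]                      # message vector m = 1011
_, parity = gf2_divmod(m + [0, 0, 0], g)  # remainder of X^(n-k) m(X) / g(X)
print(m + parity)                     # systematic (7, 4) codeword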
7. For the received vector 1 0 0 1 0 1 1, calculate the syndrome with an (n − k)-stage shift register using the generator polynomial 𝑔(𝑋) = 1 + 𝑋 + 𝑋³. Is there an error in the received vector?
8. What is a syndrome? Design a syndrome calculation circuit for a (7, 4) cyclic code using the generator polynomial 𝑔(𝑋) = 𝑋³ + 𝑋 + 1. If the received codeword is 1 0 1 0 0 1 1, determine whether the received codeword is corrupted or not.
9. Draw an (𝑛, 𝑘) cyclic encoder. Describe the encoding procedure of the cyclic encoder.
10. Design a dividing circuit to divide 𝑉(𝑋) = 𝑋³ + 𝑋⁵ + 𝑋⁶ by 𝑔(𝑋) = 1 + 𝑋 + 𝑋³. Find the remainder and quotient terms. Compare the remainder and quotient of the dividing circuit with those obtained by ordinary polynomial long division.
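A behavioral sketch of the three-stage dividing register (my own simulation, not a gate-level drawing): shifting the dividend in high-order bit first, the feedback line delivers the quotient bits once the register has filled, and the stages end up holding the remainder.

g = [1, 1, 0]                  # g0, g1, g2 of g(X) = 1 + X + X^3 (g3 = 1 implied)
V = [1, 1, 0, 1, 0, 0, 0]      # V(X) = X^6 + X^5 + X^3, high-order bit first

s = [0, 0, 0]                  # register stages s0, s1, s2
quotient = []
for step, b in enumerate(V):
    out = s[2]                 # feedback tap = highest stage
    s = [b ^ (g[0] & out), s[0] ^ (g[1] & out), s[1] ^ (g[2] & out)]
    if step >= 3:              # the first n - k = 3 outputs only fill the register
        quotient.append(out)

print("quotient :", quotient)  # [1, 1, 1, 1] -> X^3 + X^2 + X + 1
print("remainder:", s)         # [1, 0, 0]    -> 1 (read as s0 + s1 X + s2 X^2)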
Topic-6: Convolutional Code
1. Draw the state diagram, tree diagram and trellis diagram for the K = 3, rate 1/3 code
generated by
𝑔1(𝑋) = 𝑋 + 𝑋²
𝑔2(𝑋) = 1 + 𝑋
𝑔3(𝑋) = 1 + 𝑋 + 𝑋²
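A short sketch (an illustrative helper, not a drawing) that tabulates the state-transition/output table from which the state, tree and trellis diagrams follow; the state is the pair of previous input bits, and the taps are read off 𝑔1, 𝑔2, 𝑔3 above.

G = [(0, 1, 1),   # g1(X) = X + X^2     : taps on (current u, m1, m2)
     (1, 1, 0),   # g2(X) = 1 + X
     (1, 1, 1)]   # g3(X) = 1 + X + X^2

def step(state, u):
    m1, m2 = state
    out = tuple((g[0] & u) ^ (g[1] & m1) ^ (g[2] & m2) for g in G)
    return (u, m1), out              # the input bit shifts into the memory

for state in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    for u in (0, 1):
        nxt, out = step(state, u)
        print(f"state {state} --u={u}/out={out}--> {nxt}")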
2. What is the flushing bit of a convolutional code? Why is it required?
3. Mention the methods used for representing a convolutional encoder.
4. Draw the state diagram, tree diagram and trellis diagram for the convolutional encoder
Topic-7
4. Define bits, bit rate, symbol, uncertainty, information, message, word, baud, alphabet
and data.
5. Define a discrete memoryless source and a source with memory.
6. Determine the amount of information of a symbol.
7. Discuss the following conditions in terms of probability.
i. 𝐼(𝑠𝑘) = 0
ii. 𝐼(𝑠𝑘) ≥ 0
iii. 𝐼(𝑠𝑘) > 𝐼(𝑠𝑖)
iv. 𝐼(𝑠𝑘𝑠𝑖) = 𝐼(𝑠𝑘) + 𝐼(𝑠𝑖)
v. 𝐻(𝜉) = log₂ 𝐾
8. Define entropy. State and prove the properties of entropy.
9. Derive the entropy function of a binary memoryless source.
10. Define a block of a discrete memoryless source. Write the relation between the extended source and the original source in terms of entropy.
11. Define second-order extended source and third-order extended source.
12. Consider a discrete memoryless source with source alphabet 𝜉 = {𝑠0, 𝑠1, 𝑠2} and respective probabilities 𝑝0 = 0.25, 𝑝1 = 0.25 and 𝑝2 = 0.5. Prove the relationship between the entropy of the ordinary source and the entropy of the extended source, assuming a second-order extension.
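A quick numeric check of the claim 𝐻(𝜉²) = 2𝐻(𝜉) for this source (a sketch, assuming the pair probabilities of the second-order extension are products because the source is memoryless).

from math import log2
from itertools import product

p = [0.25, 0.25, 0.5]
H = -sum(pi * log2(pi) for pi in p)              # H(xi) = 1.5 bits
H2 = -sum(pa * pb * log2(pa * pb)                # sum over the 9 symbol pairs
          for pa, pb in product(p, repeat=2))
print(H, H2)                                     # 1.5  3.0, i.e. H2 = 2 H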
14. Define conditional entropy and redundancy.
15. Find the entropy, redundancy and information rate of a five-symbol source (𝐴, 𝐵, 𝐶, 𝐷, 𝐸)
with a baud rate of 1024 symbols/second and symbol selection probabilities of 0.5, 0.2,
0.15, 0.1, 0.05 under the following conditions:
i. The source is memoryless.
ii. The source has a one symbol memory such that no two consecutively selected
symbols can be the same.
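A minimal sketch for part i, the memoryless case (part ii needs the one-symbol-memory conditional probabilities and is left out); redundancy is taken here as 1 − 𝐻/𝐻max, one of the common definitions.

from math import log2

p = [0.5, 0.2, 0.15, 0.1, 0.05]
baud = 1024                               # symbols per second

H = -sum(pi * log2(pi) for pi in p)       # entropy, bits/symbol
H_max = log2(len(p))                      # maximum (equiprobable) entropy
redundancy = 1 - H / H_max
rate = baud * H                           # information rate, bits/second

print(f"H = {H:.3f} bit/symbol, redundancy = {redundancy:.1%}, R = {rate:.0f} bit/s")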
16. Define noiseless and noisy channel.
17. Define unconditional entropy, conditional entropy, effective entropy, equivocation and transition matrix.
18. “An alternative interpretation of equivocation is as negative information added by noise” - justify the statement by defining each part of the effective entropy.
19. Consider a three-symbol source (𝐴, 𝐵, 𝐶) with the following transition matrix:
        𝐴TX     𝐵TX     𝐶TX
𝐴RX     0.6     0.5     0
𝐵RX     0.2     0.5     0.333
𝐶RX     0.2     0       0.667
For the a priori probabilities 𝑃(𝐴) = 0.5, 𝑃(𝐵) = 0.2, 𝑃(𝐶) = 0.3, find (i) the unconditional symbol reception probabilities 𝑃(𝑖RX) from the conditional and a priori probabilities, (ii) the equivocation, (iii) the source entropy and (iv) the effective entropy.
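A numeric sketch of parts i to iv, assuming the matrix is read as 𝑃(𝑖RX | 𝑗TX) with transmit symbols in the columns, as laid out above.

import numpy as np

T = np.array([[0.6, 0.5, 0.0],      # rows: A_RX, B_RX, C_RX
              [0.2, 0.5, 0.333],    # columns: A_TX, B_TX, C_TX
              [0.2, 0.0, 0.667]])
p_tx = np.array([0.5, 0.2, 0.3])    # a priori P(A), P(B), P(C)

joint = T * p_tx                    # joint[i, j] = P(i_RX, j_TX)
p_rx = joint.sum(axis=1)            # (i)   unconditional P(i_RX)
H_src = -(p_tx @ np.log2(p_tx))     # (iii) source entropy H(X)

post = joint / p_rx[:, None]        # backward probabilities P(j_TX | i_RX)
nz = joint > 0                      # keep the logs away from zero entries
H_equiv = -(joint[nz] * np.log2(post[nz])).sum()   # (ii) equivocation H(X|Y)
H_eff = H_src - H_equiv             # (iv)  effective entropy

print("P(i_RX):", p_rx.round(4))
print("equivocation:", round(H_equiv, 4), "  source:", round(H_src, 4),
      "  effective:", round(H_eff, 4), "bits/symbol")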
Topic-8: Source Coding
1. Define source coding, maximum entropy, code efficiency and source efficiency.
2. Write the functional requirements of an efficient source encoder.
3. Mention the drawbacks of the Huffman code. Where is the Lempel-Ziv code suitable for use?
4. Write the advantages of the Lempel-Ziv code over the prefix code and the Huffman code.
i. Identify which one is the prefix code and construct its decision tree.
ii. “Meeting the Kraft-McMillan inequality does not guarantee a prefix code” - justify the statement considering all three codes.
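The table with the three codes did not survive extraction, so the sketch below uses hypothetical stand-in codes purely to show the two checks involved: the Kraft-McMillan sum and an explicit prefix test.

def kraft_sum(code):
    return sum(2 ** -len(w) for w in code)

def is_prefix_free(code):
    return not any(a != b and b.startswith(a) for a in code for b in code)

codes = {"code I":  ["0", "10", "110", "111"],   # hypothetical: prefix-free
         "code II": ["0", "01", "011", "0111"]}  # hypothetical: meets Kraft, not prefix-free
for name, c in codes.items():
    print(name, "Kraft sum =", kraft_sum(c), " prefix-free:", is_prefix_free(c))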
6. Show that the minimum-variance Huffman code is obtained by moving the probability of a combined symbol as high as possible, for a source having five symbols whose probabilities of occurrence are given below.
Symbol Probability
𝑆0 0.4
𝑆1 0.2
𝑆2 0.2
𝑆3 0.1
𝑆4 0.1
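A simulation sketch of the two placement rules (the helper is mine): when the combined symbol ties in probability, "high" keeps it above the tied originals and "low" below them; both trees reach the same average length, and only the variance of the codeword lengths differs.

def huffman_lengths(probs, combined_high):
    # node = [probability, birth order, source symbols under this node]
    nodes = [[p, i, [i]] for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    birth = len(probs)
    while len(nodes) > 1:
        # descending probability; among ties, later-born (combined) nodes
        # sort first for "high" placement, last for "low"
        nodes.sort(key=lambda n: (-n[0], -n[1] if combined_high else n[1]))
        a, b = nodes.pop(), nodes.pop()      # the two lowest-placed nodes
        for sym in a[2] + b[2]:
            lengths[sym] += 1                # every leaf moves one level down
        nodes.append([a[0] + b[0], birth, a[2] + b[2]])
        birth += 1
    return lengths

p = [0.4, 0.2, 0.2, 0.1, 0.1]
for high in (True, False):
    L = huffman_lengths(p, high)
    mean = sum(pi * li for pi, li in zip(p, L))
    var = sum(pi * (li - mean) ** 2 for pi, li in zip(p, L))
    print("high" if high else "low ", L, f"mean = {mean:.2f}, variance = {var:.3f}")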
7. Determine the codewords using the Lempel-Ziv coding algorithm for the following binary input sequence:
00010111 0010100101
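A minimal LZ78-style parsing sketch; the space in the printed sequence is assumed to be a line wrap, so the bits are concatenated before parsing.

bits = "00010111" + "0010100101"

phrases = {}                     # phrase -> dictionary position (1-based)
cur, codewords = "", []
for b in bits:
    cur += b
    if cur not in phrases:       # shortest subsequence not yet seen
        phrases[cur] = len(phrases) + 1
        codewords.append((phrases.get(cur[:-1], 0), cur[-1]))
        cur = ""

print(list(phrases))             # the parsed phrases
print(codewords)                 # (position of prefix phrase, innovation bit)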
8. Determine the prefix code and Huffman code for the following source.
Symbol Probability
𝑆0 0.4
𝑆1 0.2
𝑆2 0.2
𝑆3 0.1
𝑆4 0.06
𝑆5 0.04
Topic-9: Discrete Memoryless Channel
1. Define discrete memoryless channel.