DC Mid Imp
1. Derive the probability-of-error expressions for the following keying techniques:
a. Amplitude Shift Keying
b. Frequency Shift Keying
c. Phase Shift Keying
d. Quadrature Phase Shift Keying
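Once derived, the standard coherent-detection results can be checked numerically. The sketch below assumes the usual textbook expressions (coherent ASK/FSK: Q(√(Eb/N0)); coherent BPSK: Q(√(2Eb/N0)); QPSK with Gray mapping has the same bit-error probability as BPSK) and uses only the standard library:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def pe_ask(eb_n0):
    """Coherent binary ASK (on-off keying): Pe = Q(sqrt(Eb/N0))."""
    return q_func(math.sqrt(eb_n0))

def pe_fsk(eb_n0):
    """Coherent binary FSK: Pe = Q(sqrt(Eb/N0))."""
    return q_func(math.sqrt(eb_n0))

def pe_psk(eb_n0):
    """Coherent BPSK: Pe = Q(sqrt(2 * Eb/N0))."""
    return q_func(math.sqrt(2 * eb_n0))

# QPSK (Gray-coded) has the same per-bit error probability as BPSK
pe_qpsk = pe_psk
```

Evaluating these at the same Eb/N0 shows the expected ordering: PSK outperforms ASK/FSK by 3 dB.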
3. A DMS transmits four messages M1, M2, M3, M4 with probabilities 1/2, 1/4, 1/8, 1/8 respectively.
Calculate the entropy H. If r = 1 message/sec, calculate the information rate R.
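A quick numerical check for this problem (Python sketch using H = Σ p·log2(1/p) and R = r·H):

```python
import math

probs = [1/2, 1/4, 1/8, 1/8]                     # P(M1)..P(M4)
H = sum(p * math.log2(1/p) for p in probs)       # entropy, bits/message
r = 1                                            # message rate, messages/sec
R = r * H                                        # information rate, bits/sec
print(H, R)                                      # 1.75 bits/message, 1.75 bits/sec
```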
4. i) Show that the entropy of a discrete memoryless source is maximum, equal to log2 M, when the
output symbols are equally probable. ii) Show that H(X, Y) = H(X) + H(Y|X) = H(Y) + H(X|Y). iii)
Show that I(X; Y) = H(X) + H(Y) - H(X, Y) = H(X) - H(X|Y) = H(Y) - H(Y|X).
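The identities in parts (ii) and (iii) can be verified numerically on a small joint distribution. The 2x2 pmf below is a hypothetical example chosen for the check, not part of the problem:

```python
import math

# hypothetical joint distribution p(x, y) over a 2x2 alphabet
P = [[1/4, 1/4],
     [1/2, 0.0]]

def h(probs):
    """Entropy in bits of a probability list (0 terms contribute nothing)."""
    return sum(p * math.log2(1/p) for p in probs if p > 0)

px = [sum(row) for row in P]                           # marginal of X
py = [sum(P[i][j] for i in range(2)) for j in range(2)]  # marginal of Y
Hxy = h([p for row in P for p in row])                 # joint entropy H(X,Y)
Hx, Hy = h(px), h(py)

# H(Y|X) computed directly from the conditional distributions
H_y_given_x = sum(px[i] * h([P[i][j] / px[i] for j in range(2)])
                  for i in range(2) if px[i] > 0)
H_x_given_y = Hxy - Hy

# identity (ii): H(X,Y) = H(X) + H(Y|X)
assert abs(Hxy - (Hx + H_y_given_x)) < 1e-12

# identity (iii): the three expressions for I(X;Y) agree
I = Hx + Hy - Hxy
assert abs(I - (Hx - H_x_given_y)) < 1e-12
assert abs(I - (Hy - H_y_given_x)) < 1e-12
```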
6. Consider a telegraph source having two symbols, dot and dash. The dot duration is 0.2 s and the
dash duration is 3 times the dot duration. The probability of the dot occurring is twice that of
the dash, and the time between symbols is 0.2 s. Calculate the information rate of the telegraph
source.
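The steps above can be sketched numerically: with P(dot) = 2/3 and P(dash) = 1/3, compute the per-symbol entropy, the average symbol duration (including the inter-symbol gap), and divide:

```python
import math

p_dot, p_dash = 2/3, 1/3                 # dot twice as likely as dash
t_dot, t_dash, t_space = 0.2, 0.6, 0.2   # durations in seconds

# entropy per symbol, bits
H = p_dot * math.log2(1/p_dot) + p_dash * math.log2(1/p_dash)

# average time per symbol, including the 0.2 s gap after each symbol
T = p_dot * t_dot + p_dash * t_dash + t_space

R = H / T                                # information rate, bits/sec
print(H, T, R)                           # ~0.918 bits, ~0.533 s, ~1.72 bits/s
```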
7. An analog signal is band-limited to B Hz and sampled at the Nyquist rate. The samples are
quantized into 4 levels, each level representing one message, so there are 4 messages. The
probabilities of occurrence of these 4 levels (messages) are p1 = p4 = 1/8 and p2 = p3 = 3/8.
Find the information rate of the source.
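A numerical sketch of the solution: the sample rate at Nyquist is r = 2B samples/sec, and R = r·H. The problem leaves B symbolic, so a hypothetical B = 1000 Hz is used here for illustration:

```python
import math

B = 1000                                         # hypothetical bandwidth, Hz
probs = [1/8, 3/8, 3/8, 1/8]                     # p1..p4
H = sum(p * math.log2(1/p) for p in probs)       # entropy, bits/sample (~1.811)
r = 2 * B                                        # Nyquist rate, samples/sec
R = r * H                                        # information rate, bits/sec
print(H, R)
```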
8. Apply Shannon-Fano coding to a source emitting 8 messages with probabilities
1/2, 3/20, 3/20, 2/25, 2/25, 1/50, 1/100 and 1/100 respectively, and find the coding efficiency.
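A sketch of the Shannon-Fano procedure for this source: sort by probability, split each group where the two halves are closest to equal probability, and append 0/1. Efficiency is H/L with L the average codeword length:

```python
from math import log2

def shannon_fano(probs):
    """Return a Shannon-Fano codeword string for each symbol."""
    items = sorted(range(len(probs)), key=lambda i: -probs[i])
    codes = [''] * len(probs)

    def split(group):
        if len(group) <= 1:
            return
        # find split point where the two halves' probabilities are most balanced
        total = sum(probs[i] for i in group)
        running, best_k, best_diff = 0.0, 1, float('inf')
        for k in range(1, len(group)):
            running += probs[group[k - 1]]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_diff, best_k = diff, k
        for i in group[:best_k]:
            codes[i] += '0'
        for i in group[best_k:]:
            codes[i] += '1'
        split(group[:best_k])
        split(group[best_k:])

    split(items)
    return codes

probs = [1/2, 3/20, 3/20, 2/25, 2/25, 1/50, 1/100, 1/100]
codes = shannon_fano(probs)
H = sum(p * log2(1/p) for p in probs)               # ~2.150 bits/message
L = sum(p * len(c) for p, c in zip(probs, codes))   # average length = 2.18
efficiency = H / L                                  # ~0.986
```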
9. For the following messages with their probabilities, calculate the code words using both
Shannon-Fano coding and Huffman encoding, and also calculate the coding efficiency.

Message:      M1    M2    M3    M4    M5    M6    M7    M8    M9
Probability:  1/2   1/8   1/8   1/8   1/16  1/16  1/16  1/32  1/32
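A heap-based Huffman encoder sketch, demonstrated here on the four-message source of problem 3 (1/2, 1/4, 1/8, 1/8), where the probabilities are dyadic and the average length equals the entropy exactly:

```python
import heapq

def huffman(probs):
    """Return a Huffman codeword string for each symbol."""
    # heap entries: (probability, tiebreak counter, symbol indices in subtree)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    codes = [''] * len(probs)
    count = len(probs)
    while len(heap) > 1:
        p1, _, m1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, m2 = heapq.heappop(heap)
        for i in m1:
            codes[i] = '0' + codes[i]     # prepend branch bits
        for i in m2:
            codes[i] = '1' + codes[i]
        heapq.heappush(heap, (p1 + p2, count, m1 + m2))
        count += 1
    return codes

probs = [1/2, 1/4, 1/8, 1/8]              # four-message source from problem 3
codes = huffman(probs)
L = sum(p * len(c) for p, c in zip(probs, codes))   # = 1.75 = H, efficiency 100%
```

The same function applied to the nine-message table above gives the codewords for the Huffman half of the question.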
11. For a (7, 4) binary cyclic code with generator polynomial g(x) = 1 + x + x^3, find all the code
vectors using the systematic procedure.
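The systematic procedure, c(x) = x^(n-k) m(x) + [x^(n-k) m(x) mod g(x)], can be sketched with GF(2) polynomial division (coefficient lists in ascending order of powers):

```python
def poly_mod2_div(dividend, divisor):
    """Remainder of GF(2) polynomial division; coeff lists, ascending powers."""
    rem = list(dividend)
    dlen = len(divisor)
    for i in range(len(rem) - dlen, -1, -1):
        if rem[i + dlen - 1]:                 # leading term present: subtract
            for j in range(dlen):
                rem[i + j] ^= divisor[j]      # XOR = GF(2) subtraction
    return rem[:dlen - 1]                     # remainder has degree < deg g

def systematic_encode(msg, g, n):
    """Systematic cyclic encoding: parity bits followed by the message bits."""
    k = len(msg)
    shifted = [0] * (n - k) + list(msg)       # x^(n-k) * m(x)
    parity = poly_mod2_div(shifted, g)        # x^(n-k) m(x) mod g(x)
    return parity + list(msg)

g = [1, 1, 0, 1]                              # g(x) = 1 + x + x^3
n, k = 7, 4
codewords = [systematic_encode([(m >> b) & 1 for b in range(k)], g, n)
             for m in range(2 ** k)]          # all 16 code vectors
```

Every resulting vector is divisible by g(x), and the nonzero codewords have minimum weight 3, as expected for this (7, 4) code.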
12. Draw the state diagram, tree diagram and trellis diagram for the k = 3, rate-1/3 convolutional
code generated by g1(x) = 1 + x^2, g2(x) = 1 + x and g3(x) = 1 + x + x^2.
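As a check on the diagrams, the encoder itself is a two-element shift register; the sketch below (generator taps written as coefficients of x^0, x^1, x^2) reproduces the three output streams, and its impulse response recovers the generator polynomials:

```python
def conv_encode(bits, gens=((1, 0, 1), (1, 1, 0), (1, 1, 1))):
    """Rate-1/3, k=3 convolutional encoder for g1=1+x^2, g2=1+x, g3=1+x+x^2."""
    state = [0, 0]                        # two memory elements
    out = []
    for b in bits:
        window = [b] + state              # [current bit, x^1, x^2]
        for g in gens:
            out.append(sum(t & w for t, w in zip(g, window)) % 2)
        state = [b, state[0]]             # shift-register update
    return out

# impulse response: the three interleaved streams are 101, 110, 111,
# i.e. the coefficients of g1, g2, g3
print(conv_encode([1, 0, 0]))             # [1,1,1, 0,1,1, 1,0,1]
```

Each row of the trellis corresponds to one of the four states of `state`, and the branch labels are the three output bits computed above.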