VTU Electronics and Communication Engineering (Semester 5)
Information Theory & Coding
December 2013
Total marks: --
Total time: --
INSTRUCTIONS
(1) Assume appropriate data wherever necessary and state your assumptions
(2) Marks are given to the right of every question
(3) Draw neat diagrams wherever necessary


1 (a) Define: i) Unit of information, ii) Entropy, iii) Information rate
6 M
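For reference, the formulas these definitions lead to, in the usual notation (a sketch; p_i is the probability of symbol s_i and r_s the symbol rate in symbols/sec):
\[ I(s_i) = \log_2\frac{1}{p_i}\ \text{bits}, \qquad H(S) = \sum_{i} p_i \log_2\frac{1}{p_i}\ \text{bits/symbol}, \qquad R = r_s\,H(S)\ \text{bits/sec} \]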
1 (b) The output of an information source consists of 128 symbols, 16 of which occur with a probability of 1/32 and the remaining 112 of which occur with a probability of 1/224. The source emits 1000 symbols/sec. Assuming that the symbols are chosen independently, find the average information rate of the source.
4 M
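A worked sketch of the expected computation (the remaining 128 - 16 = 112 symbols carry probability 1/224, so the probabilities sum to 16/32 + 112/224 = 1):
\[ H(S) = 16 \cdot \tfrac{1}{32}\log_2 32 + 112 \cdot \tfrac{1}{224}\log_2 224 \approx 2.5 + 3.904 = 6.404\ \text{bits/symbol} \]
\[ R = r_s\,H(S) = 1000 \times 6.404 \approx 6404\ \text{bits/sec} \]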
1 (c) Find G1 and G2 and verify that G1 > G2 > H(S).

10 M

2 (a) Show that H(X,Y)=H(X/Y)+H(Y).
4 M
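The standard proof factors the joint probability as p(x,y) = p(x/y)p(y) and splits the logarithm:
\[ H(X,Y) = -\sum_{x,y} p(x,y)\log_2 p(x,y) = -\sum_{x,y} p(x,y)\log_2 p(x/y) - \sum_{x,y} p(x,y)\log_2 p(y) = H(X/Y) + H(Y) \]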
2 (b) Apply the Shannon encoding algorithm to the following message:
Symbol:       S1     S2     S3
Probability:  0.5    0.3    0.2

i) Find the code efficiency and redundancy.
ii) If the same technique is applied to the second-order extension of the source, by how much will the redundancy improve?
10 M
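A minimal Python sketch of the Shannon encoding algorithm as commonly taught (codeword i is the first ceil(log2(1/p_i)) bits of the binary expansion of the cumulative probability F_i; all names are illustrative):

import math

def shannon_code(probs):
    # Sort probabilities in decreasing order; the codeword for symbol i is
    # the first l_i = ceil(log2(1/p_i)) bits of the binary expansion of the
    # cumulative probability F_i.
    probs = sorted(probs, reverse=True)
    codes, F = [], 0.0
    for p in probs:
        l = math.ceil(math.log2(1 / p))        # codeword length
        frac, bits = F, ""
        for _ in range(l):                     # binary expansion of F_i
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        codes.append(bits)
        F += p
    return probs, codes

probs, codes = shannon_code([0.5, 0.3, 0.2])
L = sum(p * len(c) for p, c in zip(probs, codes))   # average code length
H = -sum(p * math.log2(p) for p in probs)           # source entropy
print(codes)                        # ['0', '10', '110']
print("efficiency =", H / L)        # about 0.874, i.e. redundancy about 0.126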
2 (c) A technique used in a source encoder is to arrange the messages in order of decreasing probability and divide them into two almost equiprobable groups. The messages in the first group are assigned the bit 0 and those in the second group the bit 1. The procedure is repeated until no further division is possible. Using this technique, find the code words for 6 messages.
6 M
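A Python sketch of exactly the splitting procedure this question describes (the Shannon-Fano method). The six probabilities below are hypothetical placeholders, since the paper's values are not reproduced here:

def shannon_fano(symbols):
    # Sort by decreasing probability, split into two nearly equiprobable
    # groups, append 0 to the first group and 1 to the second, and recurse
    # until every group holds a single message.
    symbols = sorted(symbols, key=lambda sp: -sp[1])
    codes = {s: "" for s, _ in symbols}

    def split(group):
        if len(group) <= 1:
            return
        total, run, cut = sum(p for _, p in group), 0.0, 1
        best = float("inf")
        for i in range(1, len(group)):         # find the most equal split
            run += group[i - 1][1]
            if abs(total - 2 * run) < best:
                best, cut = abs(total - 2 * run), i
        for s, _ in group[:cut]:
            codes[s] += "0"
        for s, _ in group[cut:]:
            codes[s] += "1"
        split(group[:cut])
        split(group[cut:])

    split(symbols)
    return codes

print(shannon_fano([("m1", 0.30), ("m2", 0.25), ("m3", 0.15),
                    ("m4", 0.12), ("m5", 0.10), ("m6", 0.08)]))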

3 (a) State the Shannon-Hartley law and its implications.
5 M
3 (b) Apply the Huffman coding procedure to the following set of messages and determine the efficiency of the binary code so formed: symbols X1, X2, X3 with probabilities 0.7, 0.15, 0.15. If the same technique is applied to the second-order extension of this source, by how much will the efficiency improve?
10 M
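A Python sketch of the Huffman procedure for the first part of this question (heapq-based; names are illustrative):

import heapq, itertools

def huffman(probs):
    # Repeatedly merge the two least probable nodes; codeword bits are
    # assigned by walking the finished tree (0 = left, 1 = right).
    counter = itertools.count()      # tie-breaker so heapq never compares symbols
    heap = [(p, next(counter), sym) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, a = heapq.heappop(heap)
        p2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (p1 + p2, next(counter), (a, b)))
    codes = {}
    def walk(node, prefix=""):
        if isinstance(node, tuple):  # internal node: (left, right)
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:                        # leaf: a symbol name
            codes[node] = prefix or "0"
    walk(heap[0][2])
    return codes

print(huffman({"X1": 0.7, "X2": 0.15, "X3": 0.15}))
# {'X2': '00', 'X3': '01', 'X1': '1'}: average length 1.3 bits against an
# entropy of about 1.181 bits, i.e. efficiency around 91%.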
3 (c) An AWGN channel has a bandwidth of 4 kHz and a noise power spectral density N0/2 = 10^-12 W/Hz. The signal power required at the receiver is 0.1 mW. Calculate the capacity of the channel.
5 M
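A worked sketch using the Shannon-Hartley law from Q3(a):
\[ N = N_0 B = 2 \times 10^{-12} \times 4 \times 10^{3} = 8 \times 10^{-9}\ \text{W}, \qquad \frac{S}{N} = \frac{10^{-4}}{8 \times 10^{-9}} = 1.25 \times 10^{4} \]
\[ C = B \log_2\!\left(1 + \frac{S}{N}\right) = 4000 \log_2(12501) \approx 54.44 \times 10^{3}\ \text{bits/sec} \]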

4 (a) State the properties of mutual information.
4 M
4 (b) For the joint probability matrix (JPM) given below, compute H(X), H(Y), H(X,Y), H(Y/X), H(X/Y) and I(X,Y), and verify the relationships among these entropies. \[ P(X,Y) = \begin{bmatrix} 0.05 & 0 & 0.20 & 0.05 \\ 0 & 0.10 & 0.10 & 0 \\ 0 & 0 & 0.20 & 0.10 \\ 0.05 & 0.05 & 0 & 0.10 \end{bmatrix} \]
10 M
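A short numerical cross-check for this part (a NumPy sketch; the conditional entropies come from the chain rule H(Y/X) = H(X,Y) - H(X) rather than being computed directly):

import numpy as np

# Joint probability matrix from the question (rows: X, columns: Y)
P = np.array([[0.05, 0.00, 0.20, 0.05],
              [0.00, 0.10, 0.10, 0.00],
              [0.00, 0.00, 0.20, 0.10],
              [0.05, 0.05, 0.00, 0.10]])

def H(p):
    # Entropy in bits of a probability array, ignoring zero entries.
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

px, py = P.sum(axis=1), P.sum(axis=0)    # marginals of X and Y
HX, HY, HXY = H(px), H(py), H(P.flatten())
print("H(X)   =", HX)
print("H(Y)   =", HY)
print("H(X,Y) =", HXY)
print("H(Y/X) =", HXY - HX)              # chain rule
print("H(X/Y) =", HXY - HY)
print("I(X,Y) =", HX + HY - HXY)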
4 (c) For the channel with the noise characteristic shown in Fig. Q4(c), find the channel capacity.

6 M

5 (a) What are the different methods of controlling errors? Explain.
6 M
5 (b) For a systematic (7,4) linear block code, the parity matrix is given by \[P=\begin{bmatrix} 1 & 1 & 1 \\ 1 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 1 \end{bmatrix} \] i) Find all possible valid code vectors.
ii) Draw the corresponding encoding circuit.
iii) A single error has occurred in each of the following received vectors. Detect and correct the errors:
YA = [0111110], YB = [1011100], YC = [1010000]
iv) Draw the syndrome calculation circuit.
14 M
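A compact NumPy sketch of parts (i) and (iii), assuming the usual systematic convention G = [I | P] and H = [P^T | I]:

import numpy as np
from itertools import product

P = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])
G = np.hstack([np.eye(4, dtype=int), P])    # generator matrix G = [I | P]
Ht = np.vstack([P, np.eye(3, dtype=int)])   # H transposed: its rows are the columns of H

# i) all 16 valid code vectors
for m in product([0, 1], repeat=4):
    print(np.array(m) @ G % 2)

# iii) single-error correction: the syndrome of y = c + e_i equals
# row i of H^T, which locates the erroneous bit
def correct(y):
    s = y @ Ht % 2                          # syndrome s = y H^T
    if s.any():
        for i in range(7):
            if (Ht[i] == s).all():
                y = y.copy()
                y[i] ^= 1                   # flip the located bit
                break
    return y

for y in ([0,1,1,1,1,1,0], [1,0,1,1,1,0,0], [1,0,1,0,0,0,0]):
    print(correct(np.array(y)))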

6 (a) What is a binary cyclic code? Describe the features of the encoder and decoder used for cyclic codes, implemented using an (n-k)-bit shift register.
10 M
6 (b) Consider the (15, 11) cyclic code generated by g(x) = 1 + x + x^4.
i) Devise a feedback shift register encoder for this code.
ii) Illustrate the encoding procedure for the message vector 11001101011 by listing the states of the register.
10 M
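A Python sketch of the encoder of part (i), simulating the 4-stage feedback shift register for g(x) = 1 + x + x^4 and printing the register states asked for in part (ii). It assumes the leftmost bit of the message vector is the highest-order coefficient (the usual convention):

def cyclic_encode(msg, g=(1, 1, 0, 0, 1)):
    # Systematic encoding with the (n-k)-stage feedback shift register.
    # g[i] is the coefficient of x^i in g(x); the leading coefficient g[r]
    # acts through the feedback line rather than a register tap.
    r = len(g) - 1                           # n - k = deg g(x) = 4 stages
    reg = [0] * r
    for bit in msg:                          # message enters high order first
        fb = bit ^ reg[-1]                   # feedback = input + last stage
        reg = [fb] + [reg[i - 1] ^ (fb & g[i]) for i in range(1, r)]
        print(reg)                           # register state after each shift
    # transmitted codeword: message bits, then parity shifted out last stage first
    return tuple(msg) + tuple(reversed(reg))

print(cyclic_encode((1, 1, 0, 0, 1, 1, 0, 1, 0, 1, 1)))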

Write short notes on:
7 (a) RS codes.
5 M
7 (b) Golay Codes
5 M
7 (c) Shortened cyclic codes.
5 M
7 (d) Burst-error correcting codes.
5 M

8 Consider the (3,1,2) convolutional code with impulse responses g^(1) = (110), g^(2) = (101) and g^(3) = (111).
i) Draw the encoder block diagram.
ii) Find the generator matrix.
iii) Find the code vector corresponding to the message sequence 11101 using both the time-domain and transform-domain approaches.
20 M
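A Python sketch of the time-domain approach in part (iii): each output stream is the mod-2 convolution of the message with one generator sequence, and the three streams are interleaved. The value g^(1) = (110) is assumed as reconstructed above:

def conv_encode(msg, gens):
    # Each output stream is the GF(2) convolution of the message with one
    # generator sequence; the encoder is rate 1/3 with memory 2.
    L = len(msg) + len(gens[0]) - 1          # length of each output stream
    streams = []
    for g in gens:
        v = [0] * L
        for i, m in enumerate(msg):
            for j, gb in enumerate(g):
                v[i + j] ^= m & gb           # GF(2) multiply-accumulate
        streams.append(v)
    # interleave: v1(0) v2(0) v3(0) v1(1) v2(1) v3(1) ...
    return [s[t] for t in range(L) for s in streams]

print(conv_encode([1, 1, 1, 0, 1], [(1, 1, 0), (1, 0, 1), (1, 1, 1)]))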


