VTU Electronics and Communication Engineering (Semester 5)
Information Theory & Coding
June 2015
Total marks: --
Total time: --
INSTRUCTIONS
(1) Assume appropriate data and state your reasons
(2) Marks are given to the right of every question
(3) Draw neat diagrams wherever necessary


1 (a) Derive an expression for average information content (entropy) of long independent messages.
5 M
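For reference, a sketch of where the derivation lands, in the usual zero-memory source notation (q symbols, symbol s_i emitted with probability p_i): a long message of N symbols contains about N·p_i occurrences of s_i, so its probability is P ≈ ∏ p_i^(N·p_i), and the average information per symbol is
\[H(S)=\frac{1}{N}\log_2\frac{1}{P}=\sum_{i=1}^{q}p_i\log_2\frac{1}{p_i}\ \text{bits/symbol.}\]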
1 (b) Define information [I], average information rate, symbol rate and mutual information.
5 M
1 (c) For the Markov source model shown below, compute the initial probabilities, the state entropies and the source entropy, and show that G1 > G2 > H(S).

10 M
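The source diagram is not reproduced above, so the following is only a minimal sketch of the required computation, assuming a hypothetical two-state model with transition matrix P. The stationary (initial) probabilities solve πP = π, and H(S) is the π-weighted average of the per-state entropies; G1 and G2 (the per-symbol entropies of one- and two-symbol blocks) then decrease toward H(S).

    import numpy as np

    # Hypothetical two-state Markov source (stand-in for the missing figure);
    # row i of P holds the transition probabilities out of state i.
    P = np.array([[0.75, 0.25],
                  [0.50, 0.50]])

    # Stationary probabilities: solve pi P = pi together with sum(pi) = 1.
    A = np.vstack([P.T - np.eye(2), np.ones(2)])
    pi = np.linalg.lstsq(A, np.array([0.0, 0.0, 1.0]), rcond=None)[0]

    # Entropy of each state, then the source entropy H(S).
    H_state = [-sum(p * np.log2(p) for p in row if p > 0) for row in P]
    H_source = sum(pi[i] * H_state[i] for i in range(2))
    print(pi, H_state, H_source)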

2 (a) Explain Shannon's noiseless encoding algorithm.
4 M
2 (b) Using Shannon's binary encoding algorithm, find all the code words for the symbols given below; also find the efficiency and redundancy of the code. Given:
Symbol       S0    S1    S2    S3    S4
Probability  0.55  0.15  0.15  0.10  0.05
8 M
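A minimal sketch of Shannon's binary encoding for these probabilities: with the symbols already in descending order, symbol i gets length l_i = ⌈log2(1/p_i)⌉ and its code word is the first l_i bits of the binary expansion of the cumulative probability α_i.

    from math import ceil, log2

    probs = [0.55, 0.15, 0.15, 0.10, 0.05]   # already sorted descending
    alpha, codes = 0.0, []
    for p in probs:
        length = ceil(-log2(p))              # l_i = ceil(log2(1/p_i))
        frac, bits = alpha, ""
        for _ in range(length):              # binary expansion of alpha_i
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        codes.append(bits)
        alpha += p

    L = sum(p * len(c) for p, c in zip(probs, codes))   # average length
    H = -sum(p * log2(p) for p in probs)                # source entropy
    print(codes, "efficiency =", H / L, "redundancy =", 1 - H / L)

This yields the prefix-free set 0, 100, 101, 1101, 11110 with average length L = 2.1 bits/symbol.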
2 (c) State all the properties of entropy and prove the extremal property.
8 M
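The extremal property to be proved is the upper bound, for an alphabet of q symbols,
\[H(S)\le\log_2 q,\]
with equality iff all symbols are equiprobable; the standard proof applies ln x ≤ x - 1 to the difference H(S) - log2 q.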

3 (a) For the channel whose matrix is given below, with P(x1)=1/2, P(x2)=P(x3)=1/4 and rs=10,000 symbols/sec, find H(x), H(y), H(x,y), H(x/y), H(y/x) and I(x,y). Also find the information rate at the transmitter (Rin), the information rate at the receiver (Rout), the channel capacity, efficiency and redundancy.
\[P(y/x)=\begin{bmatrix} 0.8 &0.2 &0 \\0.1 &0.8 &0.1 \\0 &0.2 &0.8 \end{bmatrix}\]
10 M
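A sketch of the bookkeeping this question requires, using H(x/y) = H(x,y) - H(y) and I(x,y) = H(x) - H(x/y); one common convention (assumed here) takes Rin = rs·H(x) at the transmitter and Rout = rs·I(x,y) at the receiver.

    import numpy as np

    Pyx = np.array([[0.8, 0.2, 0.0],    # channel matrix P(y/x)
                    [0.1, 0.8, 0.1],
                    [0.0, 0.2, 0.8]])
    px = np.array([0.5, 0.25, 0.25])    # input probabilities
    rs = 10_000                         # symbols/sec

    Pxy = px[:, None] * Pyx             # joint distribution P(x,y)
    py = Pxy.sum(axis=0)                # output distribution P(y)

    def H(p):                           # entropy of any distribution
        p = p[p > 0]
        return -(p * np.log2(p)).sum()

    Hx, Hy, Hxy = H(px), H(py), H(Pxy)
    Hx_y, Hy_x = Hxy - Hy, Hxy - Hx     # equivocations H(x/y), H(y/x)
    Ixy = Hx - Hx_y                     # mutual information
    print(Hx, Hy, Hxy, Hx_y, Hy_x, Ixy)
    print("Rin =", rs * Hx, "bits/sec; Rout =", rs * Ixy, "bits/sec")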
3 (b) A source produces eight symbols with probabilities {0.36, 0.24, 0.12, 0.08, 0.08, 0.07, 0.03, 0.02}.
i) Construct Huffman binary code and determine its efficiency (η) and redundancy (R).
ii) Construct Huffman ternary code and find its efficiency (η) and redundancy (R).
10 M
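For part (i), a minimal binary Huffman sketch: repeatedly merge the two least probable groups, prefixing 0/1 to every code word in each group. Part (ii) follows the same pattern with three-way merges, after padding the source with one dummy zero-probability symbol so that the symbol count is congruent to 1 mod 2.

    import heapq
    from math import log2

    probs = [0.36, 0.24, 0.12, 0.08, 0.08, 0.07, 0.03, 0.02]
    # Heap entries: (probability, unique tiebreak id, symbol indices).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    codes, uid = [""] * len(probs), len(probs)
    while len(heap) > 1:
        p0, _, s0 = heapq.heappop(heap)    # two least probable groups
        p1, _, s1 = heapq.heappop(heap)
        for i in s0: codes[i] = "0" + codes[i]
        for i in s1: codes[i] = "1" + codes[i]
        heapq.heappush(heap, (p0 + p1, uid, s0 + s1))
        uid += 1

    L = sum(p * len(c) for p, c in zip(probs, codes))
    H = -sum(p * log2(p) for p in probs)
    print(codes, "efficiency =", H / L, "redundancy =", 1 - H / L)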

4 (a) State and explain the Shannon-Hartley law. Derive an expression for the upper limit of the channel capacity.
7 M
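For reference, the law and the limit the derivation should reach (B is the bandwidth, S the signal power, and N = N0·B the noise power):
\[C=B\log_2\!\left(1+\frac{S}{N}\right)\ \text{bits/sec},\qquad \lim_{B\to\infty}C=\frac{S}{N_0}\log_2 e\approx 1.44\,\frac{S}{N_0}.\]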
4 (b) Define mutual information and explain all the properties of mutual information.
6 M
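The key identities to cover:
\[I(x,y)=H(x)-H(x/y)=H(y)-H(y/x)=H(x)+H(y)-H(x,y),\qquad I(x,y)=I(y,x)\ge 0,\]
with I(x,y) = 0 exactly when the channel input and output are statistically independent.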
4 (c) Two noisy channels are cascaded; their channel matrices are given by
\[P(y/x)=\begin{bmatrix} 1/5 &1/5 &3/5 \\1/2 &1/3 &1/6 \end{bmatrix},\qquad P(z/y)=\begin{bmatrix} 0 &3/5 &2/5 \\1/3 &2/3 &0 \\1/2 &0 &1/2 \end{bmatrix},\]
with P(x1)=P(x2)=1/2. Find the overall mutual information I(x,z) and I(x,y).
7 M
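A sketch of the computation: since x → y → z forms a Markov chain, the overall channel matrix is the product P(z/x) = P(y/x)·P(z/y), and each mutual information follows from the corresponding joint distribution.

    import numpy as np

    Pyx = np.array([[1/5, 1/5, 3/5],
                    [1/2, 1/3, 1/6]])
    Pzy = np.array([[0, 3/5, 2/5],
                    [1/3, 2/3, 0],
                    [1/2, 0, 1/2]])
    px = np.array([0.5, 0.5])

    def I(px, Pba):                     # I(A;B) from P(a) and P(b/a)
        joint = px[:, None] * Pba
        pb = joint.sum(axis=0)
        prod = px[:, None] * pb[None, :]
        m = joint > 0                   # skip zero-probability cells
        return (joint[m] * np.log2(joint[m] / prod[m])).sum()

    print("I(x,y) =", I(px, Pyx), " I(x,z) =", I(px, Pyx @ Pzy))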

5 (a) Draw the block diagram of a digital communication system and explain the function of each block.
6 M
5 (b) The parity check bits of a (7,4) Hamming code are generated by
c5 = d1 + d3 + d4
c6 = d1 + d2 + d3
c7 = d2 + d3 + d4
where d1, d2, d3 and d4 are the message bits.
i) Find generator matrix (G) and parity check matrix [H] for this code.
ii) Prove that GH^T = 0.
iii) Find the minimum weight of this code.
iv) Find error detecting and correcting capacity.
v) Draw encoder circuit and syndrome circuit for the same.
12 M
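A sketch of parts (i)-(iv), assuming the systematic convention G = [I4 | P] with the columns of P read directly off the three parity equations (all arithmetic mod 2):

    import numpy as np
    from itertools import product

    P = np.array([[1, 1, 0],    # d1 appears in c5, c6
                  [0, 1, 1],    # d2 appears in c6, c7
                  [1, 1, 1],    # d3 appears in c5, c6, c7
                  [1, 0, 1]])   # d4 appears in c5, c7
    G = np.hstack([np.eye(4, dtype=int), P])
    H = np.hstack([P.T, np.eye(3, dtype=int)])
    print((G @ H.T) % 2)        # all zeros, so GH^T = 0

    # Minimum weight over the nonzero codewords (= d_min for a linear code).
    wmin = min(int(((np.array(m) @ G) % 2).sum())
               for m in product([0, 1], repeat=4) if any(m))
    print("d_min =", wmin)      # 3: detects 2 errors, corrects 1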
5 (c) Compare fixed length code and variable length code.
2 M

6 (a) A (15,5) linear cyclic code has the generator polynomial g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10.
i) Draw the cyclic encoder and find the code word for the message polynomial
D(x) = 1 + x^2 + x^4 in systematic form by listing the states of the shift register.
ii) Draw the syndrome calculator circuit for given g(x).
12 M
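A sketch of the systematic encoding in part (i): the parity bits are the remainder of x^(n-k)·D(x) on division by g(x), and the code word is the parity bits followed by the message bits; the shift-register encoder performs this same division step by step.

    # Polynomials over GF(2) as bit lists, index = power of x.
    def gf2_remainder(num, den):
        num = num[:]
        for i in range(len(num) - 1, len(den) - 2, -1):
            if num[i]:                       # cancel the leading term
                for j, d in enumerate(den):
                    num[i - len(den) + 1 + j] ^= d
        return num[:len(den) - 1]

    n, k = 15, 5
    g = [1, 1, 1, 0, 1, 1, 0, 0, 1, 0, 1]    # 1+x+x^2+x^4+x^5+x^8+x^10
    D = [1, 0, 1, 0, 1]                      # 1+x^2+x^4
    parity = gf2_remainder([0] * (n - k) + D, g)   # rem. of x^(n-k) D(x)
    codeword = parity + D                    # systematic: [parity | message]
    print(codeword)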
6 (b) For the given generator polynomial g(x) = 1 + x + x^2 + x^4, find the generator matrix and parity check matrix of the (7,3) Hamming code, and find its code words and their Hamming weights.
8 M
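A sketch of the generator matrix construction: in non-systematic form, row i of G is the coefficient vector of x^i·g(x); the parity check matrix follows in the same way from h(x) = (x^7 + 1)/g(x) = 1 + x + x^3.

    import numpy as np
    from itertools import product

    n, k = 7, 3
    g = [1, 1, 1, 0, 1]                      # 1+x+x^2+x^4
    G = np.array([[0]*i + g + [0]*(n - len(g) - i) for i in range(k)])
    print(G)

    # All codewords and their Hamming weights.
    for m in product([0, 1], repeat=k):
        c = (np.array(m) @ G) % 2
        print(m, c, int(c.sum()))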

7 Write short notes on:
7 (a) BCH codes.
4 M
7 (b) Shortened cyclic code.
4 M
7 (c) RS code.
4 M
7 (d) Golay code.
4 M
7 (e) Burst error correcting code.
4 M

8 (a) Consider the convolutional encoder shown below
i) Draw the state diagram
ii) Draw code tree
iii) Find the code word for the message sequence 10111

10 M
8 (b) For a (2,1,2) convolutional encoder with generator sequences g(1) = 111 and g(2) = 101:
i) Draw convolutional encoder circuit.
ii) Find the code word for the message sequence 10111 using the time domain approach and the transform domain approach.
10 M
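A time-domain sketch for part (ii): each output stream is the mod-2 convolution of the message with one generator sequence, and the code word interleaves the two streams; the transform-domain approach multiplies the corresponding polynomials and gives the same result.

    # Mod-2 (GF(2)) convolution of a message with a generator sequence.
    def conv_mod2(m, g):
        out = [0] * (len(m) + len(g) - 1)
        for i, mi in enumerate(m):
            for j, gj in enumerate(g):
                out[i + j] ^= mi & gj
        return out

    m = [1, 0, 1, 1, 1]
    v1 = conv_mod2(m, [1, 1, 1])    # stream from g(1) = 111
    v2 = conv_mod2(m, [1, 0, 1])    # stream from g(2) = 101
    codeword = "".join(f"{a}{b}" for a, b in zip(v1, v2))
    print(codeword)                 # 11 10 00 01 10 01 11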


