VTU Electronics and Communication Engineering (Semester 5)
Information Theory & Coding
December 2014
Total marks: --
Total time: --
INSTRUCTIONS
(1) Assume appropriate data and state your reasons
(2) Marks are given to the right of every question
(3) Draw neat diagrams wherever necessary


1(a) For the first-order Markov source shown in Fig. Q1(a), find: i) the state probabilities ii) the entropy of each state iii) the entropy of the source iv) G1 and G2.

10 M
1(b) A black and white TV picture consists of 526 lines of picture information. Assume that each line consists of 526 picture elements (pixels) and that each pixel can take one of 255 brightness levels. The picture is repeated at the rate of 30 frames/sec. Calculate the average rate of information conveyed by a TV set to a viewer.
4 M
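As a quick cross-check, the average information rate in 1(b) can be computed directly, assuming all 255 brightness levels are equally likely and pixels are independent (variable names below are illustrative):

```python
import math

lines = 526            # lines per frame, as given
pixels_per_line = 526  # pixels per line, as given
levels = 255           # equiprobable brightness levels
frames_per_sec = 30

bits_per_pixel = math.log2(levels)                         # H = log2(255) ≈ 7.99 bits
pixels_per_sec = lines * pixels_per_line * frames_per_sec
rate = pixels_per_sec * bits_per_pixel                     # average information rate, bits/s
print(f"R ≈ {rate / 1e6:.2f} Mbits/s")
```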
1(c) Define: i) Self-information ii) Entropy
iii) Rate of information iv) Mutual information.
6 M

2(a) A BSC has the following noise matrix, with source probabilities \(P(x_{1})=\frac{2}{3}\) and \(P(x_{2})=\frac{1}{3}\):
\[P\left ( \frac{y}{x} \right )=\begin{bmatrix} \frac{3}{4}& \frac{1}{4}\\ \frac{1}{4} &\frac{3}{4} \end{bmatrix}\]
Determine:
i) H(X), H(Y), H(X,Y), H(Y/X), H(X/Y) and I(X,Y).
ii) Channel capacity C.
iii) Channel efficiency and redundancy.
10 M
2(b) Show that \[H(X,Y)= H\left ( \frac{X}{Y} \right )+H(Y)\].
4 M
2(c) For the channel matrix given below, calculate H(X), H(Y) and the channel capacity, given P(x1) = 0.6, P(x2) = 0.3 and P(x3) = 0.1.
\[P\left [ \frac{y}{x} \right ]=\begin{bmatrix} \frac{1}{2} &\frac{1}{2} &0 \\\frac{1}{2} & 0 &\frac{1}{2} \\ 0 &\frac{1}{2} &\frac{1}{2} \end{bmatrix}\]
6 M
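A sketch of the entropy computations in 2(c), assuming the rows of the matrix are P(y_j | x_i) (illustrative Python):

```python
import math

def entropy(p):
    """Entropy in bits of a probability distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

Px = [0.6, 0.3, 0.1]
Pyx = [[0.5, 0.5, 0.0],
       [0.5, 0.0, 0.5],
       [0.0, 0.5, 0.5]]

# Output distribution: P(y_j) = sum_i P(x_i) * P(y_j | x_i)
Py = [sum(Px[i] * Pyx[i][j] for i in range(3)) for j in range(3)]
print("H(X) =", round(entropy(Px), 4))
print("H(Y) =", round(entropy(Py), 4))
```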

3(a) Explain the properties of mutual information and prove that mutual information is non-negative.
6 M
3(b) For an AWGN channel with 4 kHz bandwidth and noise spectral density \(\frac{N_{0}}{2}=10^{-12}\) W/Hz, the signal power required at the receiver is 0.1 mW. Calculate the capacity of the channel.
4 M
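The capacity in 3(b) follows from the Shannon-Hartley formula C = B log2(1 + S/N). A minimal sketch, assuming the noise power is N = N0·B with the two-sided PSD N0/2 given above:

```python
import math

B = 4000            # bandwidth, Hz
N0_half = 1e-12     # two-sided noise PSD, W/Hz
S = 0.1e-3          # received signal power, W

N = 2 * N0_half * B             # noise power N = N0 * B = 8e-9 W
C = B * math.log2(1 + S / N)    # S/N = 12500
print(f"C ≈ {C / 1e3:.1f} kbits/s")
```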
3(c) Given the source S = {S1, S2, S3, S4, S5, S6, S7} with probabilities
P = {0.1, 0.2, 0.1, 0.4, 0.1, 0.05, 0.05} respectively,
find:
i) H(S) and H(S3)
ii) a compact Huffman binary code obtained by placing the composite symbol as low as possible.
iii) a compact Huffman binary code obtained by placing the composite symbol as high as possible.
iv) the average length, efficiency, redundancy and decision-tree diagram for both of the above codes.
10 M
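The merging procedure in 3(c) can be sketched with a heap-based Huffman routine. Note this simple version does not control whether a composite symbol is placed as high or as low as possible (the counter only breaks ties deterministically), so it produces one of the optimal codes; all of them share the same average length:

```python
import heapq
from itertools import count

def huffman(probs):
    """Binary Huffman code; returns {symbol: codeword}."""
    tick = count()  # tie-breaker so equal probabilities compare cleanly
    heap = [(p, next(tick), {s: ""}) for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(tick), merged))
    return heap[0][2]

p = {"S1": 0.1, "S2": 0.2, "S3": 0.1, "S4": 0.4,
     "S5": 0.1, "S6": 0.05, "S7": 0.05}
code = huffman(p)
L = sum(p[s] * len(code[s]) for s in p)  # average length, bits/symbol
print(code, "L =", L)
```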

4(a) Explain the Shannon-Hartley law on channel capacity without proof.
5 M
4(b) Find the mutual information and the channel capacity of the channel shown in Fig. Q4(b), given
P(x1) = 0.6 and P(x2) = 0.4.

10 M
4(c) A Gaussian channel has a 10 MHz bandwidth. If the \(\left ( \frac{S}{N} \right )\) ratio is 100, calculate the channel capacity and the maximum information rate.
5 M

5(a) For a systematic (6,3) linear block code the parity matrix,
\[P=\begin{bmatrix} 1 &0 &1 \\0 &1 &1 \\1 &1 &1 \end{bmatrix}\]
i) find all possible code vectors.
ii) find the minimum weight of the code.
iii) find the parity check matrix.
iv) Detect and correct the error, if any, in the received vector R = 110010.
10 M
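A sketch for 5(a): with the systematic generator matrix G = [I | P], the eight code vectors and the minimum weight can be enumerated directly (illustrative Python):

```python
from itertools import product

# Parity matrix P as given; codeword = [message | parity]
P = [[1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

def encode(m):
    parity = [sum(m[i] * P[i][j] for i in range(3)) % 2 for j in range(3)]
    return list(m) + parity

codewords = [encode(m) for m in product([0, 1], repeat=3)]
d_min = min(sum(c) for c in codewords if any(c))  # minimum nonzero weight
print(codewords, "d_min =", d_min)
```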
5(b) Define: i) Burst error ii) Systematic linear block code
iii) Galois field iv) Hamming weight
4 M
5(c) What are different methods of controlling errors? Explain.
6 M

6(a) For a (7,4) single-error-correcting cyclic code, D(X) = d0 + d1X + d2X^2 + d3X^3 and X^n + 1 = X^7 + 1 = (1 + X + X^3)(1 + X + X^2 + X^4). Using the generator polynomial g(X) = 1 + X + X^3, find all 16 code vectors of the cyclic code in both non-systematic and systematic form.
10 M
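Both encodings in 6(a) reduce to GF(2) polynomial arithmetic: non-systematic V(X) = D(X)g(X), and systematic parity = X^(n-k)D(X) mod g(X). A minimal sketch, with polynomials as coefficient lists, lowest degree first:

```python
from itertools import product

def poly_mul(a, b):
    """GF(2) polynomial product; coefficients listed lowest degree first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

def poly_mod(a, g):
    """Remainder of a(X) divided by g(X) over GF(2)."""
    a = a[:]
    for i in range(len(a) - 1, len(g) - 2, -1):
        if a[i]:
            for j, gj in enumerate(g):
                a[i - len(g) + 1 + j] ^= gj
    return a[:len(g) - 1]

g = [1, 1, 0, 1]                     # g(X) = 1 + X + X^3
for d in product([0, 1], repeat=4):  # all 16 messages (d0, d1, d2, d3)
    v_nonsys = poly_mul(list(d), g)                      # D(X) * g(X)
    v_sys = poly_mod([0, 0, 0] + list(d), g) + list(d)   # parity bits, then message
    print(d, v_nonsys, v_sys)
```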
6(b) What is a binary cyclic code? Describe the features of the encoder used for cyclic codes using an (n-k)-stage shift register.
10 M

7(a) Determine the parameters of the q-ary RS code over GF(256) for dmin = 33.
5 M
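The RS parameters in 7(a) follow directly from n = 2^m - 1 and d_min = n - k + 1 (a quick sketch):

```python
m = 8                    # symbols drawn from GF(2^m) = GF(256)
n = 2**m - 1             # block length: 255 symbols
d_min = 33
t = (d_min - 1) // 2     # error-correcting capability: t symbol errors
k = n - (d_min - 1)      # message length in symbols
print(f"({n}, {k}) RS code, t = {t}")
```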
7(b) Consider a (15,9) cyclic code generated by g(X) = 1 + X^3 + X^4 + X^5 + X^6. This code has burst-error-correcting ability b = 3. Find the burst-error-correcting efficiency of this code.
5 M
7(c) Write short note on :
i) Golay codes.
ii) RS codes.
10 M

8(a) For the convolutional encoder with g(1) = 111 and g(2) = 101:
i) Draw the encoder block diagram.
ii) Find generator matrix.
iii) Find the code word corresponding to the information sequence 10011, using both the time-domain and transform-domain approaches.
10 M
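The time-domain encoding in 8(a) can be sketched as a shift-register simulation (rate 1/2, constraint length 3; flushing the register with zeros at the end is an assumption, made to return the encoder to the all-zero state):

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder, constraint length 3."""
    state = [0, 0]                   # shift-register contents
    out = []
    for b in list(bits) + [0, 0]:    # append two zeros to flush the register
        window = [b] + state
        out.append(sum(x & y for x, y in zip(window, g1)) % 2)
        out.append(sum(x & y for x, y in zip(window, g2)) % 2)
        state = [b, state[0]]
    return out

print(conv_encode([1, 0, 0, 1, 1]))
```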
8(c) Write a note on the trellis diagram.
4 M


