VTU Electronics and Communication Engineering (Semester 5)
Information Theory & Coding
December 2015
Total marks: --
Total time: --
INSTRUCTIONS
(1) Assume appropriate data and state your reasons
(2) Marks are given to the right of every question
(3) Draw neat diagrams wherever necessary


1 (a) A binary source produces symbols 0 and 1 with probabilities P and 1-P. Determine the entropy of this source and sketch the variation of the entropy with P.
5 M
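The entropy asked for in 1 (a) is the binary entropy function H(P) = -P log2 P - (1-P) log2(1-P). A minimal Python sketch (an illustrative aid, not part of the paper) that evaluates the curve at a few points:

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The curve is symmetric about P = 0.5, where it peaks at 1 bit/symbol.
for p in (0.1, 0.25, 0.5, 0.75, 0.9):
    print(f"P = {p}: H = {binary_entropy(p):.4f} bits/symbol")
```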
1 (b) Prove that the information content of N independent messages is additive.
5 M
1 (c) For the Markov source shown, find the source entropy and G1, G2 and G3.

10 M

2 (a) For the state diagram shown find
i) State probabilities
ii) Entropy of each state
iii) Entropy of the source.

10 M
2 (b) The joint probability matrix of a channel is given. Compute H(X), H(Y), H(X,Y), H(X/Y) and H(Y/X). \[ p(xy) = \begin{pmatrix} 0.05 &0 &0.2 &0.05 \\0 &0.1 &0.1 &0 \\0 &0 &0.2 &0.1 \\0.05 &0.05 &0 &0.1 \end{pmatrix} \]
10 M
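All five entropies in 2 (b) follow from the joint matrix, its marginals, and the chain rule. A Python sketch of the computation (an illustrative aid, not part of the paper):

```python
import math

# Joint probability matrix p(x, y) from the question.
P = [[0.05, 0.0,  0.2, 0.05],
     [0.0,  0.1,  0.1, 0.0],
     [0.0,  0.0,  0.2, 0.1],
     [0.05, 0.05, 0.0, 0.1]]

def H(probs):
    """Entropy (bits) of a probability list, skipping zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

px = [sum(row) for row in P]            # marginal p(x): row sums
py = [sum(col) for col in zip(*P)]      # marginal p(y): column sums
Hx, Hy = H(px), H(py)
Hxy = H([p for row in P for p in row])  # joint entropy H(X,Y)
Hx_given_y = Hxy - Hy                   # H(X/Y) = H(X,Y) - H(Y)
Hy_given_x = Hxy - Hx                   # H(Y/X) = H(X,Y) - H(X)
print(Hx, Hy, Hxy, Hx_given_y, Hy_given_x)
```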

3 (a) Prove the identities:
i) H(X, Y) = H(X) + H(Y), when X and Y are independent
ii) H(X, Y) = H(X) + H(Y/X).
8 M
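Identity ii) follows from the chain rule for joint probabilities; a worked derivation in the notation of the question:

\[
H(X,Y) = -\sum_{x,y} p(x,y)\log_2 p(x,y)
       = -\sum_{x,y} p(x,y)\log_2\big[p(x)\,p(y/x)\big]
       = H(X) + H(Y/X).
\]

When X and Y are independent, p(y/x) = p(y), so H(Y/X) = H(Y) and identity i) follows as a special case.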
3 (b) A source emits symbols with probabilities 0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04. Construct a binary Huffman code and a Shannon-Fano code. Calculate the efficiency in both cases.
12 M
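The Huffman half of 3 (b) repeatedly merges the two least-probable nodes; each merge pushes every symbol in the merged subtree one level deeper. A Python sketch (illustrative; `huffman_lengths` is a helper written for this aid, not from the paper) that recovers the codeword lengths and the efficiency H/L:

```python
import heapq, math
from itertools import count

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for `probs`."""
    tie = count()
    # heap entries: (subtree probability, tiebreak, symbols in the subtree)
    heap = [(p, next(tie), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable subtrees
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # each symbol in the merged subtree
            lengths[i] += 1               # gains one more code bit
        heapq.heappush(heap, (p1 + p2, next(tie), s1 + s2))
    return lengths

probs = [0.4, 0.2, 0.12, 0.08, 0.08, 0.08, 0.04]
L = huffman_lengths(probs)
Lavg = sum(p * l for p, l in zip(probs, L))         # average code length
Hs = -sum(p * math.log2(p) for p in probs)          # source entropy
print(f"lengths={L}  L={Lavg:.2f}  H={Hs:.4f}  efficiency={Hs / Lavg:.2%}")
```

For these probabilities the average length works out to 2.48 bits/symbol, so the efficiency is roughly 97.6%.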

4 (a) Derive the expression for channel capacity for the binary channel shown

8 M
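The channel diagram for 4 (a) is not reproduced here. Assuming the common textbook case of a binary symmetric channel with crossover probability p (an assumption, not stated in the paper), the derivation ends at C = 1 - H(p) bits per channel use; a Python sketch:

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel (assumed channel model):
    C = 1 - H(p), where H is the binary entropy of crossover probability p."""
    if p in (0.0, 1.0):
        return 1.0           # a deterministic channel carries 1 bit/use
    return 1.0 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.4f} bits/use")
```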
4 (b) Define mutual information and explain its properties.
4 M
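For reference, the definition asked for in 4 (b) can be stated compactly:

\[
I(X;Y) = H(X) - H(X/Y) = H(Y) - H(Y/X) = H(X) + H(Y) - H(X,Y).
\]

Key properties: I(X;Y) is symmetric, I(X;Y) = I(Y;X); it is non-negative, I(X;Y) \(\ge\) 0; and it equals zero if and only if X and Y are statistically independent.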
4 (c) An analog signal has a bandwidth of 4 kHz. The signal is sampled at 2.5 times the Nyquist rate, and each sample is quantized into 256 equally likely levels. Assume that successive samples are statistically independent. i) Find the information rate of this source.
ii) Can the output of this source be transmitted without error over a channel of bandwidth 50 kHz and S/N = 20 dB?
iii) If the output of this source is to be transmitted without error over an analog channel having S/N = 10, compute the bandwidth required.
8 M
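The three parts of 4 (c) chain together: R = (sampling rate) x (bits/sample), then the Shannon-Hartley capacity C = B log2(1 + S/N) decides feasibility. A Python sketch of the arithmetic (an illustrative aid):

```python
import math

# i) information rate: Nyquist rate of a 4 kHz signal is 8 kHz
fs = 2.5 * 2 * 4000          # sampling rate = 20 kHz
bits = math.log2(256)        # 8 bits/sample (256 equally likely levels)
R = fs * bits                # information rate in bit/s

# ii) channel: B = 50 kHz, S/N = 20 dB -> power ratio 100
C = 50e3 * math.log2(1 + 10 ** (20 / 10))
print(f"R = {R:.0f} bit/s, C = {C:.0f} bit/s, feasible: {C >= R}")

# iii) bandwidth needed so that B*log2(1 + 10) = R (here S/N = 10 is a ratio)
B = R / math.log2(1 + 10)
print(f"required bandwidth = {B:.0f} Hz")
```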

5 (a) Define Hamming weight, Hamming distance and minimum distance of a linear block code.
6 M
5 (b) For a linear block code, the syndrome is given by
S1=r1+r2+r3+r5
S2=r1+r2+r4+r6
S3=r1+r3+r4+r7
i) Find the generator matrix
ii) Draw the encoder and decoder circuit
iii) How many errors can it detect and correct?
14 M
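The three syndrome equations in 5 (b) are the rows of the parity-check matrix H (with r5, r6, r7 as the parity positions), from which G = [I | A^T] follows; the resulting columns of H are all seven nonzero 3-bit vectors, i.e. a (7,4) Hamming code. A Python sketch (an illustrative aid) that recovers G and the minimum distance:

```python
from itertools import product

# Parity-check rows read off S1, S2, S3 (columns are r1..r7):
H = [[1, 1, 1, 0, 1, 0, 0],
     [1, 1, 0, 1, 0, 1, 0],
     [1, 0, 1, 1, 0, 0, 1]]
# H = [A | I3], so the systematic generator matrix is G = [I4 | A^T].
A = [row[:4] for row in H]
G = [[int(i == j) for j in range(4)] + [A[r][i] for r in range(3)]
     for i in range(4)]

def encode(m):
    """Codeword for a 4-bit message m under G (mod-2 arithmetic)."""
    return tuple(sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7))

# Minimum distance = minimum weight over all nonzero codewords.
dmin = min(sum(encode(m)) for m in product([0, 1], repeat=4) if any(m))
print("d_min =", dmin)   # detects d_min - 1 errors, corrects (d_min - 1)//2
```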

6 (a) A (7,4) binary cyclic code has the generator polynomial g(x) = 1 + x + x^3.
i) Draw the syndrome circuit
ii) Verify the circuit for the message polynomial d(x) = 1 + x^3, showing the contents of the register at each step.
8 M
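Systematic cyclic encoding, which underlies the circuit in 6 (a), divides x^3 d(x) by g(x) and appends the remainder as parity. A Python sketch (an illustrative aid) using integers as GF(2) polynomials, with bit i standing for the coefficient of x^i:

```python
def poly_divmod(dividend, divisor):
    """GF(2) polynomial division; polynomials encoded as integer bit masks."""
    q = 0
    while dividend.bit_length() >= divisor.bit_length():
        shift = dividend.bit_length() - divisor.bit_length()
        dividend ^= divisor << shift   # subtract (XOR) a shifted divisor
        q |= 1 << shift
    return q, dividend                 # (quotient, remainder)

g = 0b1011                         # g(x) = 1 + x + x^3
d = 0b1001                         # d(x) = 1 + x^3
_, rem = poly_divmod(d << 3, g)    # remainder of x^3 d(x) / g(x)
v = (d << 3) ^ rem                 # systematic codeword v(x)
print(f"v(x) bits = {v:07b}")      # parity bits sit in the low-order positions
```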
6 (b) A (15, 5) binary cyclic code has the generator polynomial g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10
i) Draw the encoder block diagram
ii) Find the code polynomial for the message polynomial d(x) = 1 + x^2 + x^4 in systematic form
iii) Is V(x) = 1 + x^4 + x^6 + x^8 + x^14 a code polynomial? If not, find the syndrome of V(x).
12 M
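For part iii) of 6 (b), V(x) is a code polynomial exactly when g(x) divides it, and the remainder of the division is the syndrome. A Python sketch (an illustrative aid; the division below yields a nonzero remainder, indicating V(x) is not a codeword):

```python
def gf2_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division (polynomials as bit masks)."""
    while dividend.bit_length() >= divisor.bit_length():
        dividend ^= divisor << (dividend.bit_length() - divisor.bit_length())
    return dividend

g = 0b10100110111                                     # 1+x+x^2+x^4+x^5+x^8+x^10
V = (1 << 14) | (1 << 8) | (1 << 6) | (1 << 4) | 1    # 1+x^4+x^6+x^8+x^14
s = gf2_mod(V, g)                                     # syndrome = V(x) mod g(x)
print(f"s(x) bits = {s:b}")   # nonzero, so V(x) is not a code polynomial
```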

Explain:
7 (a) BCH code
7 M
7 (b) Golay code
7 M
7 (c) Reed-Solomon codes
7 M

8 Consider the (3, 1, 2) convolutional code with g^(1) = 110, g^(2) = 101 and g^(3) = 111
i) Draw the encoder block diagram
ii) Find the generator matrix
iii) Find the codeword corresponding to the message sequence (11101) using both the time-domain and transform-domain approaches.
20 M
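In the time-domain approach, each of the three output streams is the mod-2 convolution of the message with one generator, and the streams are then interleaved. A Python sketch (an illustrative aid, not part of the paper):

```python
def convolve_mod2(u, g):
    """Discrete convolution of two binary sequences over GF(2)."""
    out = [0] * (len(u) + len(g) - 1)
    for i, ui in enumerate(u):
        for j, gj in enumerate(g):
            out[i + j] ^= ui & gj
    return out

u = [1, 1, 1, 0, 1]                        # message sequence (11101)
gens = [[1, 1, 0], [1, 0, 1], [1, 1, 1]]   # g(1), g(2), g(3)
streams = [convolve_mod2(u, g) for g in gens]

# Interleave the three streams: (v1_0, v2_0, v3_0, v1_1, v2_1, v3_1, ...)
codeword = [b for trio in zip(*streams) for b in trio]
print(codeword)
```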


