VTU Electronics and Communication Engineering (Semester 5)
Information Theory & Coding
May 2016
Total marks: --
Total time: --
INSTRUCTIONS
(1) Assume appropriate data and state your reasons
(2) Marks are given to the right of every question
(3) Draw neat diagrams wherever necessary


1(a) Define self information, entropy of long independent messages, information rate, symbol rate and mutual information.
5 M
1(a) Describe the parameter characteristics of the discrete information source.
5 M
1(b) The output of an information source consists of 128 symbols, 16 of which occur with a probability of \( \dfrac{1}{32} \) and the remaining occur with a probability of \( \dfrac{1}{224} \). The source emits 1000 symbols per second. Assuming that the symbols are chosen independently, find the average information rate of this source.
5 M
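As a worked check for 1(b) (a sketch only, not part of the paper): for a zero-memory source the entropy is \( H=\sum_i p_i \log_2(1/p_i)\) and the information rate is \( R = r_s H\).

```python
import math

# Probabilities from the question: 16 symbols at 1/32, 112 at 1/224
probs = [1/32] * 16 + [1/224] * 112
assert abs(sum(probs) - 1.0) < 1e-12  # they sum to 1, as required

# Source entropy in bits/symbol
H = sum(p * math.log2(1 / p) for p in probs)

# Average information rate at r_s = 1000 symbols/sec
r_s = 1000
R = r_s * H
print(f"H = {H:.4f} bits/symbol, R = {R:.1f} bits/sec")
```

This gives H ≈ 6.40 bits/symbol and R ≈ 6404 bits/sec.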
1(b) For a source emitting symbols in independent sequences, show that the source entropy is maximum when the symbols occur with equal probability.
5 M
1(c) For the Markov source model shown in Fig. Q1(c):
i) Compute the state probabilities.
ii) Compute the entropy of each state.
iii) Compute the entropy of the source.
10 M
1(c) The state diagram of a stationary Markov source is shown in Fig. Q1(c).
i) Find the entropy of each state.
ii) Find the entropy of the source H.
iii) Find G1 and G2 and verify G1 ≥ G2 ≥ H.
10 M

2(a) State the properties of entropy.
4 M
2(a) State and prove Kraft's inequality theorem.
5 M
2(b) A source emits one of the 5 symbols A, B, C, D & E with probabilities \( \dfrac{1}{4},\dfrac{1}{3},\dfrac{1}{8},\dfrac{1}{16}\) and \( \dfrac{5}{16}\) respectively in an independent sequence of symbols. Using Shannon's binary encoding algorithm, find the code word for each symbol. Also find the coding efficiency and redundancy.
8 M
2(b) Given the source alphabet S = {S1, S2} and \( P=\left \{ \dfrac{3}{4},\dfrac{1}{4} \right \}\), obtain Shannon-Fano codes for the basic source S and for its 2nd and 3rd extensions. Treating the above source as a zero-memory source, comment on the efficiency and redundancy in each case.
10 M
2(c) Construct a Shannon-Fano ternary code for the following ensemble and find code efficiency and redundancy. Also draw the corresponding code-tree.
S = {S1, S2, S3, S4, S5, S6, S7}; P = {0.3, 0.3, 0.12, 0.12, 0.06, 0.06, 0.04} with X = {0, 1, 2}
8 M
2(c) What is a discrete communication channel? Illustrate the model of a discrete communication channel.
5 M

3(a) Show that \( H(X,Y)=H(Y)+H\left ( \dfrac{X}{Y} \right ).\)
5 M
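A compact derivation for 3(a), using the chain rule \( p(x,y) = p(y)\,p\!\left(\tfrac{x}{y}\right)\):

\[
H(X,Y) = -\sum_{x}\sum_{y} p(x,y)\log_2 p(x,y)
       = -\sum_{x}\sum_{y} p(x,y)\left[\log_2 p(y) + \log_2 p\!\left(\tfrac{x}{y}\right)\right]
       = H(Y) + H\!\left(\tfrac{X}{Y}\right),
\]

since summing \( p(x,y)\log_2 p(y)\) over x gives \( p(y)\log_2 p(y)\), whose sum over y is \( -H(Y)\).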
3(a) A source alphabet has five source symbols with probabilities of occurrence {0.4, 0.2, 0.2, 0.1, 0.1}. Construct binary Huffman codes for the following cases and comment on the efficiency and the measure of variability in the code-word lengths of the resulting source codes.
i) Moving the probability of the combined symbol as high as possible.
ii) Moving the probability of the combined symbol as low as possible.
10 M
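A hedged Python sketch of both constructions (the `place_high` flag and the list-based merge are illustrative choices, not a standard library API): at every step the two least probable nodes are merged, and the combined node is reinserted either above or below nodes of equal probability.

```python
def huffman(probs, place_high=True):
    """Binary Huffman code; place_high decides whether a merged node
    goes above or below nodes of equal probability."""
    codes = {i: "" for i in range(len(probs))}
    # nodes sorted by decreasing probability: (prob, symbol indices)
    nodes = sorted(((p, [i]) for i, p in enumerate(probs)),
                   key=lambda x: -x[0])
    while len(nodes) > 1:
        p1, g1 = nodes.pop()          # least probable node
        p0, g0 = nodes.pop()          # second least probable node
        for i in g0:
            codes[i] = "0" + codes[i]
        for i in g1:
            codes[i] = "1" + codes[i]
        merged = (p0 + p1, g0 + g1)
        k = 0
        if place_high:                # skip only strictly larger probabilities
            while k < len(nodes) and nodes[k][0] > merged[0]:
                k += 1
        else:                         # skip larger-or-equal probabilities
            while k < len(nodes) and nodes[k][0] >= merged[0]:
                k += 1
        nodes.insert(k, merged)
    return codes

probs = [0.4, 0.2, 0.2, 0.1, 0.1]
for high in (True, False):
    codes = huffman(probs, place_high=high)
    L = sum(p * len(codes[i]) for i, p in enumerate(probs))
    var = sum(p * (len(codes[i]) - L) ** 2 for i, p in enumerate(probs))
    print("high" if high else "low ", codes, f"avg={L:.2f} var={var:.2f}")
```

Both placements give the same average code length (2.2 bits/symbol here), but different code-length variance, which is the point of the comparison.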
3(b) The noise characteristics of a non-symmetric binary channel is given in Fig. Q3 (b).
i) Find \( H(X), H(Y), H\left ( \dfrac{X}{Y} \right )\ \text{and}\ H\left ( \dfrac{Y}{X} \right ).\) Given \( P(x_1)=\dfrac{1}{4},P(x_2)=\dfrac{3}{4},\alpha=0.75,\beta=0.9\)
ii) Also find the capacity of the channel with \( r_s = 1000\) symbols/sec.
10 M
3(b) Obtain an equation for channel capacity of a symmetric or uniform channel.
6 M
3(c) A source has an alphabet consisting of seven symbols A, B, C, D, E, F & G with probabilities of \( \dfrac{1}{4},\dfrac{1}{4},\dfrac{1}{8},\dfrac{1}{8},\dfrac{1}{8},\dfrac{1}{16}\ \text{and}\ \dfrac{1}{16}\) respectively. Construct a quaternary Huffman code. Find the coding efficiency.
5 M
3(c) What are deterministic channels? Obtain an equation for the channel capacity of such a channel whose noise matrix has 3 rows and 4 columns.
4 M

4(a) State Shannon-Hartley theorem and explain its implications.
8 M
4(a) State and explain Shannon-Hartley law.
5 M
4(b) A Gaussian channel has a bandwidth of 4 kHz and a two-sided power spectral density \( \dfrac{\eta }{2}\ \text{of}\ 10^{-14}\ \text{watts/Hz}.\) The signal power at the receiver has to be maintained at a level less than or equal to \( \left ( \dfrac{1}{10} \right )^{th}\) of a milliwatt. Calculate the capacity of this channel.
6 M
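A numerical sketch for 4(b), assuming the noise power is the two-sided PSD integrated over the band, \( N = 2\cdot(\eta/2)\cdot B\), together with the Shannon-Hartley formula \( C = B\log_2(1 + S/N)\):

```python
import math

B = 4e3        # bandwidth, Hz
psd = 1e-14    # two-sided PSD eta/2, watts/Hz
S = 1e-4       # maximum signal power, watts (0.1 mW)

N = 2 * psd * B                  # noise power in the band
C = B * math.log2(1 + S / N)     # Shannon-Hartley capacity
print(f"N = {N:.1e} W, S/N = {S / N:.2e}, C = {C / 1e3:.1f} kbits/sec")
```

With these numbers the capacity comes out at roughly 81 kbits/sec.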
4(b) Explain properties of mutual information.
10 M
4(c) Explain the properties of mutual information.
6 M
4(c) For an additive white Gaussian noise channel with 4 kHz of bandwidth and noise spectral density \( \dfrac{N_0}{2} = 10^{-12}\ \text{watts/Hz},\) calculate the capacity of the channel. The signal power available at the receiver is 0.1 mW.
5 M

5(a) What are the types of errors and types of codes in error control coding?
4 M
5(a) Describe the important aspects of error control coding.
4 M
5(b) Consider a (6, 3) linear code whose generator matrix is, \( G=\begin{bmatrix} 1 & 0 & 0 & 1 & 0 & 1\\ 0 & 1 & 0 & 1 & 1 & 0\\ 0 & 0 & 1 & 0 & 1 & 1 \end{bmatrix}\)
i) Find all code vectors.
ii) Find all the Hamming weights.
iii) Find minimum weight parity check matrix.
iv) Draw the encoder circuit for the above codes.
10 M
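A sketch verifying parts (i)-(ii) of 5(b) over GF(2) (NumPy is used only for the matrix product; everything is reduced mod 2):

```python
import numpy as np
from itertools import product

# Generator matrix from the question
G = np.array([[1, 0, 0, 1, 0, 1],
              [0, 1, 0, 1, 1, 0],
              [0, 0, 1, 0, 1, 1]])

# All 2^3 = 8 message vectors -> code vectors
codewords = []
for m in product([0, 1], repeat=3):
    c = np.mod(np.array(m) @ G, 2)
    codewords.append(c)
    print(m, "->", c.tolist(), "weight =", int(c.sum()))

# For a linear code, d_min equals the minimum nonzero Hamming weight
d_min = min(int(c.sum()) for c in codewords if c.any())
print("minimum weight =", d_min)
```

The minimum nonzero weight here is 3, so the code detects up to 2 errors and corrects 1.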
5(b) With the help of a neat block diagram, explain the working of syndrome decoder circuit.
6 M
5(c) The parity check bits of a (7, 4) Hamming code are generated by
C5 = d1 + d3 + d4;  C6 = d1 + d2 + d3;  C7 = d2 + d3 + d4
where d1, d2, d3 and d4 are the message bits.
i) Find the generator matrix and the parity check matrix.
ii) Prove that \( GH^T = 0.\)
6 M
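For part (ii) of 5(c), the systematic construction \( G = [I_k\,|\,P]\), \( H = [P^T\,|\,I_{n-k}]\) makes \( GH^T = 0\) automatic; a quick numeric check (the rows of P follow the parity equations as read from the question):

```python
import numpy as np

# P[i][j]: does message bit d(i+1) feed parity bit C(5+j)?
P = np.array([[1, 1, 0],   # d1 -> C5, C6
              [0, 1, 1],   # d2 -> C6, C7
              [1, 1, 1],   # d3 -> C5, C6, C7
              [1, 0, 1]])  # d4 -> C5, C7
G = np.hstack([np.eye(4, dtype=int), P])       # G = [I_4 | P]
H = np.hstack([P.T, np.eye(3, dtype=int)])     # H = [P^T | I_3]
print(np.mod(G @ H.T, 2))                      # expect the 4x3 zero matrix
```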
5(c) For a (6, 3) linear block code, the three parity bits are given by
c4 = d1 ⊕ d2 ⊕ d3;  c5 = d1 ⊕ d2;  c6 = d1 ⊕ d3
i) Construct the generator matrix.
ii) Determine all code vectors and draw the encoder circuit.
iii) Determine the error detecting and correcting capability.
iv) Prove \( GH^T = 0.\)
10 M

6(a) Define binary cyclic codes. Explain the properties of cyclic codes.
8 M
6(a) What is a binary cyclic code? Explain the procedure for obtaining the generator matrix of a (7, 4) cyclic code from the generator polynomial \( g(x) = x^3 + x + 1\) in systematic form.
10 M
6(b) A (15, 5) linear cyclic code has a generator polynomial
\( g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^{10}\)
i) Draw the block diagram of an encoder for this code.
ii) Find the code vector for the message polynomial \( D(x) = 1 + x^2 + x^4\) in systematic form.
iii) Is \( V(x) = 1 + x^4 + x^6 + x^8 + x^{14}\) a code polynomial?
12 M
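Parts (ii) and (iii) of 6(b) reduce to GF(2) polynomial division. A sketch, representing each polynomial as an integer bit mask (bit i = coefficient of \( x^i\); this encoding is an implementation choice, not part of the question):

```python
def poly_mod(dividend, divisor):
    """Remainder of GF(2) polynomial division; a polynomial is an int
    whose bit i is the coefficient of x^i."""
    d = divisor.bit_length()
    while dividend.bit_length() >= d:
        dividend ^= divisor << (dividend.bit_length() - d)
    return dividend

# g(x) = 1 + x + x^2 + x^4 + x^5 + x^8 + x^10
g = sum(1 << i for i in (0, 1, 2, 4, 5, 8, 10))

# ii) systematic encoding: V(x) = x^(n-k) D(x) + [x^(n-k) D(x) mod g(x)]
D = sum(1 << i for i in (0, 2, 4))     # D(x) = 1 + x^2 + x^4
shifted = D << 10                      # multiply by x^(n-k) = x^10
codeword = shifted ^ poly_mod(shifted, g)
print("codeword powers:", [i for i in range(15) if codeword >> i & 1])

# iii) V(x) is a code polynomial iff it is divisible by g(x)
V = sum(1 << i for i in (0, 4, 6, 8, 14))
print("V(x) is a code polynomial:", poly_mod(V, g) == 0)
```

With this arithmetic, \( V(x)\) leaves a nonzero remainder on division by \( g(x)\), so it is not a code polynomial.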
6(b) With the help of a neat diagram, explain the syndrome calculation circuit of binary cyclic codes that employs (n − k) shift registers.
8 M
6(c) Explain linearity and cyclic property of cyclic codes.
2 M

7(a) A rate \( \dfrac{1}{3}\) convolutional encoder has generator vectors g1 = 100, g2 = 111, g3 = 101. Draw the encoder diagram. If the input is 10110, determine the output sequence using the transform domain approach.
10 M
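In the transform domain, each output stream is \( v_i(x) = m(x)\,g_i(x) \bmod 2\) and the streams are interleaved bit by bit. A sketch (reading the generators with the \( x^0\) tap first is an assumption; the expected answer depends on the convention used in class):

```python
def gf2_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists, x^0 first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

gens = [[1, 0, 0], [1, 1, 1], [1, 0, 1]]   # g1, g2, g3 from the question
m = [1, 0, 1, 1, 0]                        # input 10110

streams = [gf2_mul(m, g) for g in gens]    # v_i(x) = m(x) g_i(x)
width = max(len(s) for s in streams)
streams = [s + [0] * (width - len(s)) for s in streams]
# interleave: one bit from v1, v2, v3 per time step
output = "".join(str(s[t]) for t in range(width) for s in streams)
print(output)
```

Under this convention the encoder emits 3 bits per input bit, 21 output bits in all for the 5-bit input plus flushing.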
Write short notes on:
7(a)(i) BCH codes.
5 M
7(a)(ii) RS codes.
5 M
7(a)(iii) Golay codes.
5 M
7(a)(iv) Burst error correcting codes.
5 M
7(b) For the convolutional encoder shown in Fig. Q7(b),
i) Find the code rate and constraint length.
ii) Draw the tree diagram, trellis diagram and state diagram.
10 M

8(a) What are convolutional codes? Explain encoding of convolutional codes using transform domain approach.
8 M
8(a) What are cyclic redundancy check codes? Explain how CRC codes are capable of detecting errors.
10 M
8(b) Consider the (3, 1, 2) convolutional code with g(1) = (1 1 0), g(2) = (1 0 1) and g(3) = (1 1 1)
i) Draw the encoder block diagram.
ii) Find the generator matrix.
iii) Find the code corresponding to the information sequence (1 1 1 0 1) using time domain approach.
12 M
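For the time-domain approach in 8(b)(iii), each output stream is simply the GF(2) convolution of the information sequence with one generator sequence (the rows of the generator matrix are shifted copies of the g(i)). A NumPy sketch, again assuming the \( x^0\)-tap-first reading of the generators:

```python
import numpy as np

gens = [[1, 1, 0], [1, 0, 1], [1, 1, 1]]   # g(1), g(2), g(3) from the question
m = [1, 1, 1, 0, 1]                        # information sequence

# discrete convolution reduced mod 2 = time-domain convolutional encoding
streams = [np.convolve(m, g) % 2 for g in gens]
code = "".join(str(s[t]) for t in range(len(streams[0])) for s in streams)
print(code)
```

The interleaved output has n = 3 bits per time step over L + K − 1 = 7 steps, i.e. 21 bits.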
8(b) Write short notes on the parameters involved in the error correcting capability of Bose-Chaudhuri-Hocquenghem (BCH) codes and Reed-Solomon codes.
10 M


