B.Tech Degree Examination: Sixth Semester Branch: Information Technology

This document contains two model question papers for an Information Theory and Coding examination. Each paper has three parts - Part A contains 5 short answer questions worth 3 marks each, Part B contains 4 short answer questions worth 5 marks each, and Part C contains 2 long answer questions worth 12 marks each. The questions cover topics such as information entropy, channel capacity, error correcting codes, Huffman coding, and convolutional codes.

Uploaded by

Abhishek Prakash

IT010 603 (Pages: 3)
Reg. No. _______________
Name _________________
B.TECH DEGREE EXAMINATION
Sixth Semester
Branch: Information Technology
IT 010 603 - INFORMATION THEORY AND CODING
(Model Question Paper-1)
Time: Three Hours    Maximum: 100 Marks

Part A
Answer all questions.
Each question carries 3 marks
1. Relate the amount of information provided and the probability of occurrence of events.
2. What is the channel capacity of a binary symmetric channel with error probability 0.01?
3. Define the terms coding efficiency and redundancy.
4. What is source coding? Define code length and code efficiency, and give the relation between them.
5. What is meant by stop-and-wait ARQ? Explain.

(5 x 3 = 15 marks)
Part B
Answer all questions.
Each question carries 5 marks
6. A discrete source emits one of five symbols once every millisecond with probabilities 1/2, 1/4, 1/8, 1/16 and 1/16. Find the source entropy and the information rate.
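A numerical check for question 6 (not part of the original paper): the entropy follows from H = Σ p·log2(1/p), and the information rate from R = r·H, where the symbol rate r = 1000 symbols/s since one symbol is emitted every millisecond.

```python
import math

# Source probabilities from question 6
probs = [1/2, 1/4, 1/8, 1/16, 1/16]

# Source entropy in bits/symbol: H = sum of p * log2(1/p)
H = sum(p * math.log2(1 / p) for p in probs)

# One symbol every millisecond -> 1000 symbols per second
symbol_rate = 1000
R = symbol_rate * H  # information rate in bits/s

print(f"H = {H} bits/symbol")  # 1.875
print(f"R = {R} bits/s")       # 1875.0
```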
7. State and explain the Shannon-Hartley theorem.
8. Define the G and H matrices and show that G · H^T = 0.
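The identity in question 8 can be checked numerically. The sketch below uses the systematic (7, 4) Hamming code as an example (the particular parity matrix P is my own illustrative choice, not from the paper), with G = [I | P] and H = [P^T | I]:

```python
# GF(2) check that G * H^T = 0 for a systematic (7,4) Hamming code.
# G = [I4 | P], H = [P^T | I3]; P below is one common choice of parity matrix.
P = [[1, 1, 0],
     [1, 0, 1],
     [0, 1, 1],
     [1, 1, 1]]

I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
I3 = [[1 if i == j else 0 for j in range(3)] for i in range(3)]

G = [I4[i] + P[i] for i in range(4)]                         # 4 x 7 generator
H = [[P[j][i] for j in range(4)] + I3[i] for i in range(3)]  # 3 x 7 parity check

# G * H^T over GF(2): every entry must come out 0
GHt = [[sum(G[i][k] * H[j][k] for k in range(7)) % 2
        for j in range(3)] for i in range(4)]

print(GHt)  # a 4 x 3 matrix of zeros
```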
9. Name any one coding method used for burst error correction. Explain.
10. What is interleaving? Explain block interleaving.

(5 x 5 = 25 marks)
PART C
Answer any one full question from each module.
Each question carries 12 marks
11. Consider that two sources emit messages x1, x2, x3 and y1, y2, y3 with the joint probabilities p(X, Y) as shown in matrix form:

(i) Calculate the entropies of X and Y. (4 marks)
(ii) Calculate the joint and conditional entropies H(X, Y), H(X/Y) and H(Y/X) between X and Y. (6 marks)
(iii) Calculate the average mutual information I(X; Y). (2 marks)
Or
12. (a) Define (i) discrete entropy H(X) and joint entropy H(X, Y), and (ii) mutual information I(X; Y). (6 marks)
(b) Show that I(X; Y) = H(X) + H(Y) - H(X, Y). (6 marks)
13. Derive an expression for the capacity of a band-limited AWGN channel.
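The expression asked for in question 13 is C = B·log2(1 + S/N). A minimal numerical sketch (the example figures of 3 kHz bandwidth and 30 dB SNR are my own, chosen as a typical telephone-channel illustration):

```python
import math

def awgn_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon capacity of a band-limited AWGN channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 3 kHz channel at 30 dB SNR (linear SNR = 1000)
C = awgn_capacity(3000, 1000)
print(round(C), "bits/s")  # roughly 29902 bits/s
```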
Or
14. A BSC has the error probability p = 0.2 and the input to the channel consists of 4 equiprobable messages x1 = 000, x2 = 001, x3 = 011, x4 = 111. Calculate
(a) p(0) and p(1) at the input;
(b) the efficiency of the code;
(c) the channel capacity.
15. A memoryless source emits five messages with probabilities {0.4, 0.2, 0.2, 0.1, 0.1}. Find the Shannon-Fano code and determine its efficiency.
Or
16. Construct the Huffman code with minimum code variance for the following probabilities, and also determine the code variance and code efficiency:
{0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625}
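A sketch of the ordinary Huffman construction for question 16 (the tie-breaking needed for *minimum variance* is not enforced here, only the code lengths). Because the given probabilities are all negative powers of two, any Huffman code meets the entropy bound exactly, so the efficiency is 100%:

```python
import heapq
import math

probs = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]

# Huffman code lengths: repeatedly merge the two smallest-weight subtrees.
# Heap entries are (weight, tiebreak_id, [symbol indices in this subtree]).
heap = [(p, i, [i]) for i, p in enumerate(probs)]
heapq.heapify(heap)
lengths = [0] * len(probs)
uid = len(probs)
while len(heap) > 1:
    w1, _, s1 = heapq.heappop(heap)
    w2, _, s2 = heapq.heappop(heap)
    for s in s1 + s2:      # every symbol in the merged subtree
        lengths[s] += 1    # gains one more code bit
    heapq.heappush(heap, (w1 + w2, uid, s1 + s2))
    uid += 1

avg_len = sum(p, l_ := None) if False else sum(p * l for p, l in zip(probs, lengths))
H = sum(p * math.log2(1 / p) for p in probs)
print(avg_len, H, H / avg_len)  # 2.625 2.625 1.0 -> 100% efficient
```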
17. Consider a (6, 3) linear block code whose generator matrix is given by
(i) Find the parity check matrix. (2 marks)
(ii) Find the minimum distance of the code. (4 marks)
(iii) Draw the encoder and syndrome computation circuit. (6 marks)

Or
18. A (7, 4) cyclic code has a generator polynomial g(X) = X^3 + X + 1.
(i) Draw the block diagram of the encoder and syndrome calculator.
(ii) Find the generator and parity check matrices in systematic form.
19. (a) Explain the working of a (2, 1, 3) convolutional encoder using the transform domain approach. (6 marks)
(b) Explain sequential decoding. (6 marks)
Or
20. (a) Explain the maximum likelihood decoding and Viterbi decoding algorithms of a convolutional encoder. (8 marks)
(b) Describe block and convolutional interleaving. (4 marks)

(5 x 12 = 60 marks)
IT010 603 (Pages: 3)
Reg. No. _______________
Name _________________
B.TECH DEGREE EXAMINATION
Sixth Semester
Branch: Information Technology
IT 010 603 - INFORMATION THEORY AND CODING
(Model Question Paper-2)
Time: Three Hours    Maximum: 100 Marks
Part A
Answer all questions.
Each question carries 3 marks
1. Define (i) joint entropy; and (ii) conditional entropy.
2. Sketch the transition diagram of a binary erasure channel and explain.
3. What are instantaneous codes?
4. Find the generator and parity check matrices of a (7, 4) cyclic code with generator polynomial g(X) = 1 + X + X^3.
5. What is meant by the constraint length and free distance of a convolutional code?

(5 x 3 = 15 marks)
Part B
Answer all questions.
Each question carries 5 marks
6. Define mutual information and explain how it is related to entropy. For a lossless channel, prove that H(X/Y) = 0.
7. Give the relation between channel capacity C, bandwidth W and signal-to-noise ratio S/N of an AWGN channel. Explain the trade-off between them.
8. Define BCH codes and briefly describe Reed-Solomon codes.
9. Explain briefly the syndrome calculation circuit for an (n, k) cyclic code.
10. Briefly describe the steps of the Viterbi algorithm.

(5 x 5 = 25 marks)
PART C
Answer any one full question from each module.
Each question carries 12 marks
11. (a) Prove that the entropy of a discrete source is maximum when the output symbols are equally probable.
(b) Find the discrete entropy for a source with symbol probabilities {0.3, 0.25, 0.2, 0.15, 0.1}.
Or
12. (a) Define mutual information I(X; Y) and show that I(X; Y) ≥ 0.
(b) Derive the relationship between entropy and mutual information.
13. (a) From the channel capacity theorem, find the capacity of a channel with infinite bandwidth and explain. (6 marks)
(b) Two binary symmetric channels with error probability 0.1 are cascaded as shown below, with P(0) = 0.25. Calculate I(X, Y) and I(X, Z). (6 marks)

[Figure: X -> BSC-1 -> Y -> BSC-2 -> Z]
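A numerical sketch for part (b) (helper names are my own). Two cascaded BSCs with crossover probability p behave like a single BSC with crossover 2p(1 - p), and the data-processing inequality predicts I(X; Z) ≤ I(X; Y):

```python
import math

def h2(p: float) -> float:
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_mutual_info(p0: float, eps: float) -> float:
    """I(X;Y) for a BSC with crossover eps and input distribution P(X=0) = p0."""
    py0 = p0 * (1 - eps) + (1 - p0) * eps  # P(Y = 0)
    return h2(py0) - h2(eps)               # H(Y) - H(Y|X)

p0, eps = 0.25, 0.1
eps_cascade = 2 * eps * (1 - eps)          # two BSCs in series -> 0.18

I_XY = bsc_mutual_info(p0, eps)
I_XZ = bsc_mutual_info(p0, eps_cascade)
print(I_XY, I_XZ)  # I(X;Z) < I(X;Y), as data processing requires
```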
Or
14. (a) State the Shannon-Hartley theorem and from it derive Shannon's theoretical limit. (7 marks)
(b) What is meant by an optimum modulation system? Explain. (5 marks)
15. (a) State and prove Kraft's inequality. (7 marks)
(b) Explain arithmetic coding. (5 marks)
Or
16. (a) Given xi = {x1, x2, x3, x4, x5, x6} with probabilities p(xi) = {0.3, 0.25, 0.2, 0.12, 0.08, 0.05}, construct the Huffman code and find its efficiency. (8 marks)
(b) What is ZIP coding? Explain. (4 marks)
17. (a) Explain the encoding method of a (7, 4) linear block code. (6 marks)
(b) Define the generator and parity check matrices of a (7, 4) linear block code. Explain how to generate a linear block code using the G-matrix, with an example. (6 marks)
Or
18. Construct a systematic (7, 4) cyclic code using the generator polynomial g(x) = x^3 + x + 1. What are the error correcting capabilities of this code? For the received word 1101100, determine the transmitted codeword.
19. Consider a (3, 1, 2) convolutional code with g(1) = (110), g(2) = (101) and g(3) = (111):
(i) Draw the encoder block diagram.
(ii) Find the generator matrix.
(iii) Find the code word corresponding to the information sequence (11101) using the time domain approach.
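For part (iii), the time-domain encoding is just a modulo-2 convolution of the input with each generator sequence. A sketch (the output interleaving order v1 v2 v3 per time step, and flushing the register with two zeros, are my assumptions about the intended convention):

```python
def conv_encode(u, generators):
    """Time-domain encoding of a rate-1/n convolutional code.

    u: list of input bits; generators: tap lists, current input bit first.
    The register is flushed with (memory order) trailing zeros.
    """
    m = max(len(g) for g in generators) - 1  # memory order
    u = list(u) + [0] * m                    # flush the register
    out = []
    for i in range(len(u)):
        for g in generators:
            # v_j[i] = sum over k of g[k] * u[i - k]  (mod 2)
            out.append(sum(gk * u[i - k]
                           for k, gk in enumerate(g) if i - k >= 0) % 2)
    return out

# (3,1,2) code from question 19: g1 = (110), g2 = (101), g3 = (111)
code = conv_encode([1, 1, 1, 0, 1], [[1, 1, 0], [1, 0, 1], [1, 1, 1]])
print("".join(map(str, code)))  # 111010001110100101011
```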
Or
20. Explain ARQ in detail.

(5 x 12 = 60 marks)
