ECE (435/535a) Spring 2016 Midterm
March 8
Name: Signature:
Instructions: Answer all questions and show all work. Answers which are not justified with appropriate work will receive 0
points. Students who cheat will receive zero points on the exam and will be subject to the university’s disciplinary procedure
for academic dishonesty. Cheating includes, but is not limited to, collaborating or conferring in any way with anyone. Use of
the internet is strictly forbidden. Your signature above attests that you are in compliance with these rules.
1. Source Coding
(a) (20 points) Design a ternary Huffman code for a source with symbol probabilities p = {1/21, 2/21, 3/21, 4/21, 5/21, 6/21}.
Hint: log_a x = (log_b x) / (log_b a).
(b) (15 points) Is the average codeword length for such a source larger than the source entropy?
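As an illustrative aid (not part of the exam), the D-ary Huffman construction asked for in (a) can be sketched in Python. The probabilities come from the problem statement; the zero-probability padding step is the standard trick that makes every merge combine exactly D nodes.

```python
import heapq
import itertools
from math import log

def ternary_huffman(probs, D=3):
    """Return {symbol_index: codeword} for a D-ary Huffman code."""
    counter = itertools.count()              # tie-breaker so heap tuples compare
    nodes = [(p, next(counter), i) for i, p in enumerate(probs)]
    # Pad with zero-probability dummy symbols so every merge combines exactly
    # D nodes: a full D-ary tree requires (n - 1) % (D - 1) == 0.
    while (len(nodes) - 1) % (D - 1) != 0:
        nodes.append((0.0, next(counter), None))
    heapq.heapify(nodes)
    while len(nodes) > 1:
        children = [heapq.heappop(nodes) for _ in range(D)]
        total = sum(c[0] for c in children)
        heapq.heappush(nodes, (total, next(counter), [c[2] for c in children]))
    codes = {}
    def walk(node, prefix):
        if node is None:                     # dummy symbol: gets no codeword
            return
        if isinstance(node, list):           # internal node: recurse on children
            for digit, child in enumerate(node):
                walk(child, prefix + str(digit))
        else:
            codes[node] = prefix
    walk(nodes[0][2], "")
    return codes

probs = [i / 21 for i in range(1, 7)]        # p = {1/21, ..., 6/21}
codes = ternary_huffman(probs)
avg_len = sum(p * len(codes[i]) for i, p in enumerate(probs))
entropy3 = -sum(p * log(p, 3) for p in probs)  # source entropy in ternary digits
```

Running this gives an average length of 34/21 ≈ 1.62 ternary digits against an entropy of about 1.51 ternary digits, which is one way to check the answer to (b): the average codeword length is indeed larger than the entropy.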
Problem   Points   Student's Score
1         35
2         35
3         30
4         20
5         10
Total:    130
2. Consider an additive white Gaussian channel with output Y, where Y is the sum of the input X and the noise Z,
Y = X + Z. The input X and the noise Z are drawn from Gaussian distributions N(0, σX²) and N(0, σN²) respectively.
(a) (5 points) Find the distribution of the output Y .
(b) (10 points) Find the mutual information between X and Y .
(c) (20 points) [For 535a students only] Show that the input distribution N(0, σX²) maximizes I(X; Y) for inputs with
finite power (E[X²] ≤ P).
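As a numerical illustration (not part of the exam), parts (a) and (b) can be checked with the differential-entropy identities: Y = X + Z is Gaussian with variance σX² + σN², and I(X;Y) = h(Y) − h(Y|X) = h(Y) − h(Z). The example variances below are assumptions chosen for the check.

```python
from math import log, pi, e

def gaussian_diff_entropy(var):
    """Differential entropy of N(0, var) in nats: 0.5 * ln(2*pi*e*var)."""
    return 0.5 * log(2 * pi * e * var)

var_x, var_n = 4.0, 1.0                      # example variances (assumed)
# (a) X and Z are independent, so Y = X + Z is N(0, var_x + var_n).
h_y = gaussian_diff_entropy(var_x + var_n)
# (b) I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z), since Y|X=x is N(x, var_n).
mi = h_y - gaussian_diff_entropy(var_n)
closed_form = 0.5 * log(1 + var_x / var_n)   # the familiar 0.5*ln(1 + SNR)
```

The two quantities agree, confirming that the derivation via h(Y) − h(Z) reduces to ½ ln(1 + σX²/σN²).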
3. Let X be a binary memoryless source producing symbols with probabilities p(X = 0) = p and p(X = 1) = 1 − p. The source
output is transmitted over a Gaussian channel, Y = X + Z, where the noise Z is drawn from a Gaussian distribution
N(0, σN²).
(a) (15 points) Write an expression for I(X; Y ).
(b) (15 points) Let X be a memoryless source with letters x1 , x2 , . . . , xM and the corresponding probabilities p1 , p2 , . . . , pM .
Find the mutual information between X and Y for the given Gaussian channel.
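For intuition (not part of the exam), I(X;Y) for a discrete input over this Gaussian channel can be evaluated numerically as I(X;Y) = h(Y) − h(Z), where h(Y) comes from the mixture density f_Y(y) = Σᵢ pᵢ·N(xᵢ, σN²)(y). The input values, probabilities, and noise variance below are assumptions for illustration; the same function covers both the binary case of (a) and the M-ary case of (b).

```python
from math import e, exp, log, pi, sqrt

def phi(y, mean, var):
    """Gaussian density N(mean, var) evaluated at y."""
    return exp(-(y - mean) ** 2 / (2 * var)) / sqrt(2 * pi * var)

def mi_discrete_gaussian(xs, ps, var, lo=-20.0, hi=21.0, step=1e-3):
    """I(X;Y) in bits for Y = X + Z, Z ~ N(0, var), X discrete on xs.

    Uses I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(Z); h(Y) is computed by a
    Riemann sum over the mixture density f_Y(y) = sum_i p_i * phi(y; x_i).
    """
    h_y, y = 0.0, lo
    while y < hi:
        f = sum(p * phi(y, x, var) for x, p in zip(xs, ps))
        if f > 0:
            h_y -= f * log(f, 2) * step
        y += step
    h_z = 0.5 * log(2 * pi * e * var, 2)     # differential entropy of the noise
    return h_y - h_z

# Part (a) with p = 1/2 and an assumed noise variance of 0.25:
mi = mi_discrete_gaussian([0.0, 1.0], [0.5, 0.5], var=0.25)
```

As a sanity check, I(X;Y) is bounded by H(X) ≤ 1 bit here, and it approaches H(X) as the noise variance shrinks.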
4. (20 points) (Extra-Credit) A channel with m input and n output symbols is said to be symmetric if its channel matrix has
the property that each row p = (p1 , p2 , . . . , pn ) is a permutation of every other row, and each column q = (q1 , q2 , . . . , qm )
is a permutation of every other column. Derive an expression for the channel capacity of such a symmetric channel.
(Hint: prove first that the conditional entropy H(Y |X) is independent of the input probability distribution).
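The result the hint points toward can be checked numerically (this sketch is an illustration, not part of the exam): since every row has the same entropy, H(Y|X) = H(row) for any input distribution, and the uniform input makes Y uniform, giving C = log₂ n − H(row). The binary symmetric channel below is an assumed example.

```python
from math import log

def h(dist):
    """Entropy in bits of a probability vector (0*log 0 taken as 0)."""
    return -sum(p * log(p, 2) for p in dist if p > 0)

def symmetric_capacity(matrix):
    """Capacity of a symmetric channel: C = log2(n) - H(any row).

    Every row is a permutation of every other row, so H(Y|X) = H(row)
    independently of the input distribution; equal column sums mean the
    uniform input yields a uniform Y, so H(Y) attains log2(n).
    """
    n = len(matrix[0])
    return log(n, 2) - h(matrix[0])

# Sanity check on a binary symmetric channel with crossover 0.1 (assumed):
bsc = [[0.9, 0.1], [0.1, 0.9]]
cap = symmetric_capacity(bsc)                # 1 - H(0.1) ~ 0.531 bits
```

For the BSC this reproduces the familiar C = 1 − H(p) formula.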
5. (Extra-Credit) The channel capacity of the band-limited continuous channel with signal power Pb, noise
variance N0, and bandwidth W is C = W log(1 + Pb/(N0 W)). Answer the following questions with justification.
(a) (5 points) Plot C/W as a function of Pb/N0. (Hint: Pb = C·Eb, where Eb is the signal energy per bit.)
(b) (5 points) What is the minimum power P* that is required to send 1 bit reliably?
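As a non-exam illustration of the relationship behind both parts: substituting Pb = C·Eb into C = W log₂(1 + Pb/(N0·W)) gives r = log₂(1 + r·Eb/N0) for the spectral efficiency r = C/W, hence Eb/N0 = (2^r − 1)/r. Letting r → 0 (unbounded bandwidth) drives Eb/N0 to its minimum, ln 2 ≈ 0.693 (about −1.59 dB), which is the energy floor relevant to (b).

```python
from math import log

def ebn0_required(spectral_eff):
    """Eb/N0 needed to operate at spectral efficiency r = C/W (bits/s/Hz).

    From C = W*log2(1 + Pb/(N0*W)) with Pb = C*Eb:
        r = log2(1 + r * Eb/N0)  =>  Eb/N0 = (2**r - 1) / r.
    """
    return (2 ** spectral_eff - 1) / spectral_eff

# As r -> 0 the required Eb/N0 approaches ln 2 ~ 0.693 (-1.59 dB),
# i.e. the minimum energy per bit is N0 * ln 2.
limit = ebn0_required(1e-6)
```

Plotting r against ebn0_required(r) reproduces the curve asked for in (a), with the ln 2 asymptote at the left edge.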