Binary Channel Homework Solutions

This document is a homework assignment for an information theory and coding class. It contains three problems on binary symmetric channels, cascaded channels, and the differential entropy of various probability distributions. Problem 1 involves finding joint and marginal distributions, entropy calculations, conditions for typical sequences, and the number of jointly typical sequences. Problem 2 examines a cascaded channel between Alice and Bob that passes through Eve, and asks for the capacity in different scenarios. Problem 3 asks for the differential entropy of the exponential, Pareto, and Laplace distributions.


EE5581 - Information Theory and Coding UMN, Fall 2018, Nov. 15

Homework 5
Instructor: Soheil Mohajer Due on: Nov. 21, 2:30 PM

Problem 1

We consider a binary symmetric channel with crossover probability p = 0.1. The input distribution that achieves
capacity is the uniform distribution [i.e., p(x) = (1/2, 1/2)].

(a) Find the joint distribution p(x, y), and the marginal distribution p(y).

(b) Calculate H(X), H(Y ), H(X, Y ), and I(X; Y ).
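For parts (a) and (b), a minimal numerical sketch (an illustration, not the assigned derivation) that tabulates the joint and marginal distributions for p = 0.1 with the uniform input, and computes the four information quantities:

```python
from math import log2

p = 0.1            # crossover probability of the BSC
px = [0.5, 0.5]    # capacity-achieving uniform input distribution

# Joint distribution: p(x, y) = p(x) * p(y|x), with p(y|x) = 1-p if y == x, else p
pxy = {(x, y): px[x] * ((1 - p) if y == x else p) for x in (0, 1) for y in (0, 1)}
# Marginal of Y: sum the joint over x
py = [pxy[(0, y)] + pxy[(1, y)] for y in (0, 1)]

def H(dist):
    """Shannon entropy in bits of an iterable of probabilities."""
    return -sum(q * log2(q) for q in dist if q > 0)

HX, HY, HXY = H(px), H(py), H(pxy.values())
IXY = HX + HY - HXY  # mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)
print(HX, HY, HXY, IXY)
```

Since the input is uniform and the channel is symmetric, Y is also uniform, and I(X; Y) comes out as 1 − H(0.1) ≈ 0.531 bits, the capacity of this BSC.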

(c) Let X1, X2, . . . , Xn be drawn i.i.d. according to p(x). Of the 2^n possible input sequences of length n, which of them are typical [i.e., members of A_ε^(n)(X)] for ε = 0.2?

(d) Let x^n and y^n be two binary sequences of length n. Write the condition(s) for x^n and y^n to be jointly typical [i.e., (x^n, y^n) ∈ A_ε^(n)(X, Y)].

(e) Let k be the number of places in which the sequence x^n differs from y^n (k is a function of the two sequences). Write p(x^n, y^n) in terms of p and k, and from that rewrite the condition in (d) in terms of p and k.

(f) An alternative way of looking at this probability is to view the binary symmetric channel as an additive channel Y = X ⊕ Z, where Z is a binary random variable that equals 1 with probability p and is independent of X. Rephrase the joint typicality condition on (x^n, y^n) in terms of conditions on x^n and z^n.
(g) For a given typical sequence x^n ∈ A_ε^(n)(X), find (approximately) the number of sequences Y^n such that (x^n, Y^n) ∈ A_ε^(n)(X, Y).
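The count asked for in (g) can be sanity-checked numerically. The sketch below (an illustration with arbitrary choices of n and a small ε, not the assigned argument) uses the observation from (e): for a fixed x^n, each candidate y^n is determined by its flip pattern, and p(x^n, y^n) = 2^(−n) p^k (1−p)^(n−k) where k is the Hamming distance, so the entropy condition reduces to a constraint on k alone.

```python
from math import comb, log2

p, n, eps = 0.1, 500, 0.02
Hp = -(p * log2(p) + (1 - p) * log2(1 - p))  # binary entropy H(p) = H(Y|X)

# Count the y^n whose distance k from x^n satisfies the entropy condition:
# | (k/n) log2(1/p) + (1 - k/n) log2(1/(1-p)) - H(p) | <= eps
count = sum(
    comb(n, k)  # number of y^n at Hamming distance exactly k from x^n
    for k in range(n + 1)
    if abs((k / n) * log2(1 / p) + (1 - k / n) * log2(1 / (1 - p)) - Hp) <= eps
)
exponent = log2(count) / n
print(exponent, Hp)  # for small eps the exponent approaches H(Y|X) = H(p)
```

The growth exponent of the count is close to H(p) ≈ 0.469, consistent with the standard answer of approximately 2^(n·H(Y|X)) jointly typical continuations.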

Problem 2

Alice wants to communicate with Bob. However, there is no direct channel between Alice and Bob, so they communicate by asking Eve to help: Alice encodes w into a binary sequence x = (x1, x2, . . . , xn) and sends it to Eve over a binary symmetric channel CA→E with parameter p. There is also another binary symmetric channel CE→B between Eve and Bob, with parameter q.

(a) Assume Eve’s contribution is just to forward the binary symbols she receives at the output of CA→E to the input of the second channel. Find the joint probabilities between bits sent by Alice (XA) and those received by Bob (YB). Find the capacity of this cascaded channel.

(b) Compare the capacity you found in (a) with the capacities of channels CA→E and CE→B .
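A hedged sketch for (a) and (b), with illustrative parameter values p = 0.1, q = 0.05 that are not specified in the problem: when Eve simply forwards her bits, the two BSCs in series behave as a single BSC whose crossover probability is the probability of an odd number of flips, which also makes the comparison with the individual capacities easy to check.

```python
from math import log2

def h2(x):
    """Binary entropy function in bits."""
    return 0.0 if x in (0.0, 1.0) else -(x * log2(x) + (1 - x) * log2(1 - x))

def cascade_capacity(p, q):
    """Capacity of two BSCs in series when Eve just forwards her bits."""
    # A bit is flipped end-to-end exactly when an odd number of the two
    # channels flips it: r = p(1-q) + (1-p)q. The cascade is a BSC(r).
    r = p * (1 - q) + (1 - p) * q
    return 1 - h2(r)

p, q = 0.1, 0.05  # illustrative parameter values
print(cascade_capacity(p, q), 1 - h2(p), 1 - h2(q))
```

The cascade capacity never exceeds the capacity of either individual channel, which points toward part (c): Eve can do better than blind forwarding (e.g., decoding and re-encoding).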

(c) How can you improve the capacity of end-to-end channel from Alice to Bob? Can Eve be more helpful?

Problem 3

Compute the differential entropy for the following probability density functions:

(a) Exponential distribution: f(x) = λ exp(−λx) for x ≥ 0 and λ > 0.


(b) Pareto distribution: f(x) = a k^a / x^(a+1) for x ≥ k > 0 and a > 0.

(c) Laplace distribution: f(x) = (λ/2) e^(−λ|x−θ|) for −∞ < x < ∞ and λ > 0.
Hint: You can start with θ = 0.
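A numerical cross-check for all three parts (a sketch with arbitrary parameter choices, not the worked solutions), comparing crude numerical integration of −∫ f ln f against the standard closed forms in nats: 1 − ln λ for the exponential, ln(k/a) + 1 + 1/a for the Pareto, and 1 + ln(2/λ) for the Laplace with θ = 0.

```python
from math import exp, log

def diff_entropy(f, a, b, steps=200_000):
    """Midpoint-rule estimate of h(f) = -∫ f(x) ln f(x) dx over [a, b], in nats."""
    dx = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * dx
        fx = f(x)
        if fx > 0:
            total -= fx * log(fx) * dx
    return total

lam, k, a = 2.0, 1.5, 3.0  # illustrative parameter values

exp_pdf = lambda x: lam * exp(-lam * x)                 # exponential, x >= 0
pareto_pdf = lambda x: a * k**a / x**(a + 1)            # Pareto, x >= k
laplace_pdf = lambda x: (lam / 2) * exp(-lam * abs(x))  # Laplace with theta = 0

h_exp = diff_entropy(exp_pdf, 0, 40)
h_par = diff_entropy(pareto_pdf, k, 5000)
h_lap = diff_entropy(laplace_pdf, -40, 40)
print(h_exp, 1 - log(lam))
print(h_par, log(k / a) + 1 + 1 / a)
print(h_lap, 1 + log(2 / lam))
```

The truncated integration intervals are chosen so the neglected tails are negligible for these parameter values; divide by ln 2 to convert the results to bits.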
