ME/MATH 577
Stochastic Systems in Science and Engineering
Homework Set #04
Problem 04.01
Let X be a random variable, and let $\lambda \in \mathbb{R}$ and $t \in [0, \infty)$.
(i) Show that $P[X \geq \lambda] \leq e^{-\lambda t}\,\theta_X(t)$, where $\theta_X(t)$ is the moment generating function associated with X. Find the Chernoff bound on the probability $P[X \geq \lambda]$ by minimizing the right-hand side with respect to the continuous parameter t.
(ii) Let X be the Gaussian random variable $N(m, \sigma^2)$. Find the corresponding Chernoff bound.
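A minimal numerical sanity check for parts (i)-(ii), offered as a sketch rather than a solution: it evaluates $e^{-\lambda t}\,\theta_X(t)$ on a grid of $t$ for a Gaussian X and confirms that the grid minimum dominates the exact tail. The parameter values and the grid range are illustrative assumptions.
\begin{verbatim}
# Numerical check for Problem 04.01 (i)-(ii): for X ~ N(m, sigma^2),
# evaluate e^{-lambda t} * theta_X(t) on a grid of t >= 0 and compare
# the grid minimum with the exact tail P[X >= lambda]. The Gaussian MGF
# theta_X(t) = exp(m t + sigma^2 t^2 / 2) is standard.
import math
import numpy as np

m, sigma, lam = 0.0, 1.0, 2.0        # illustrative parameters, lam > m

t = np.linspace(0.0, 10.0, 100_001)  # truncation of [0, infinity)
log_bound = -lam * t + m * t + 0.5 * sigma**2 * t**2
chernoff = math.exp(log_bound.min())

exact = 0.5 * math.erfc((lam - m) / (sigma * math.sqrt(2.0)))
print(f"exact tail      = {exact:.6e}")
print(f"Chernoff (grid) = {chernoff:.6e}")
assert exact <= chernoff             # the bound must dominate the tail
\end{verbatim}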
(iii) Let X be the Poisson random variable with pmf $P_X(k) = \frac{e^{-\lambda}\lambda^k}{k!}$, $k = 0, 1, 2, \dots$. Find the corresponding Chernoff bound for $P[X \geq k]$, where $k > \lambda$.
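The same grid-minimization sketch can be repeated for part (iii); the Poisson MGF $\theta_X(t) = \exp[\lambda(e^t - 1)]$ is standard, while the values of $\lambda$, $k$, and the grid are illustrative assumptions.
\begin{verbatim}
# Numerical check for Problem 04.01 (iii): X ~ Poisson(lambda), tail P[X >= k].
import math
import numpy as np

lam, k = 3.0, 10                        # illustrative values with k > lambda

t = np.linspace(0.0, 10.0, 100_001)
log_bound = -k * t + lam * np.expm1(t)  # log of e^{-kt} theta_X(t)
chernoff = math.exp(log_bound.min())

# exact tail: 1 - sum_{j < k} e^{-lam} lam^j / j!
exact = 1.0 - sum(math.exp(-lam) * lam**j / math.factorial(j) for j in range(k))
print(f"exact tail      = {exact:.6e}")
print(f"Chernoff (grid) = {chernoff:.6e}")
assert exact <= chernoff
\end{verbatim}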
(iv) Let $\{X_k\}$ be a sequence of i.i.d. random variables with mean m and standard deviation $\sigma$. Let m be estimated by the sample mean $\hat{m}(N) \triangleq \frac{1}{N}\sum_{k=1}^{N} X_k$. Having computed the mean and variance of the random variable $\hat{m}(N)$, identify the smallest N that satisfies $P[|\hat{m}(N) - m| > 0.1\sigma] \leq 0.0001$.
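For part (iv), the bound on N rests on the mean and variance of $\hat{m}(N)$; the Monte Carlo sketch below only verifies those two moments empirically, under an assumed Gaussian population and an illustrative N, and does not give the required smallest N.
\begin{verbatim}
# Monte Carlo check for Problem 04.01 (iv): the sample mean of N i.i.d.
# draws should have mean m and variance sigma^2 / N. The Gaussian
# population and the values of m, sigma, N, trials are illustrative.
import numpy as np

rng = np.random.default_rng(0)
m, sigma, N, trials = 2.0, 3.0, 400, 20_000

m_hat = rng.normal(m, sigma, (trials, N)).mean(axis=1)  # one m_hat per trial
print(f"mean of m_hat: {m_hat.mean():+.4f}  (expect {m})")
print(f"var  of m_hat: {m_hat.var():.4f}  (expect {sigma**2 / N:.4f})")
\end{verbatim}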
Problem 04.02
Let $\{X_k\}$ be a sequence of random variables and let $u(\cdot)$ be the unit step function.
(i) Let the sequence $\{X_k\}$ of continuous random variables be jointly independent, with respective pdfs
$$f_X(x; n) = \left(1 - \frac{1}{n}\right)\frac{1}{\sqrt{2\pi}\,\sigma}\exp\left[-\frac{1}{2\sigma^2}\left(x - \frac{n-1}{n}\,\sigma\right)^2\right] + \frac{\sigma}{n}\,e^{-\sigma x}\,u(x)$$
Determine whether or not the random sequence $\{X_k\}$ converges:
• in mean square;
• in probability;
• in distribution.
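A sampling sketch for the pdf in part (i): each $X_n$ is a two-component mixture (a Gaussian component with weight $1 - 1/n$ and an exponential component with weight $1/n$), so its moments can be examined empirically as n grows. The value of $\sigma$, the sample size, and the chosen n are illustrative assumptions.
\begin{verbatim}
# Sampling sketch for Problem 04.02 (i): draw from the mixture
#   (1 - 1/n) * N((n-1)sigma/n, sigma^2)  +  (1/n) * Exp(rate = sigma)
# and print empirical moments for increasing n.
import numpy as np

rng = np.random.default_rng(1)
sigma, size = 1.0, 100_000

def sample_xn(n: int) -> np.ndarray:
    """Draw `size` samples of X_n from the two-component mixture."""
    pick_gauss = rng.random(size) < (1.0 - 1.0 / n)       # component flag
    gauss = rng.normal((n - 1) * sigma / n, sigma, size)  # Gaussian branch
    expo = rng.exponential(1.0 / sigma, size)             # scale = 1/rate
    return np.where(pick_gauss, gauss, expo)

for n in (2, 10, 100, 1000):
    x = sample_xn(n)
    print(f"n={n:5d}: mean={x.mean():+.4f}, var={x.var():.4f}")
\end{verbatim}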
(ii) Let the members of the sequence $\{X_k\}$ of continuous random variables have joint pdfs of the following form, with $m, n \in \mathbb{N}$ and $\rho \in (0, 1)$:
$$f_X(\alpha, \beta; m, n) = \frac{mn}{2\pi\sqrt{1-\rho^2}}\,\exp\left[-\frac{1}{2(1-\rho^2)}\left(m^2\alpha^2 - 2\rho\,mn\,\alpha\beta + n^2\beta^2\right)\right]$$
• Show that $\{X_k\}$ converges in the mean square sense for all $\rho \in (0, 1)$.
• Identify the probability distribution function of the mean-square limit of $\{X_k\}$.
• State the conditions under which the mean-square limit of a sequence of Gaussian random variables is also Gaussian.
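One hedged way to sample pairs for part (ii): under the stated joint pdf, the change of variables $u = m\alpha$, $v = n\beta$ yields a standard bivariate normal with correlation $\rho$, so $(X_m, X_n)$ can be simulated as $(U/m, V/n)$ and the mean-square increment estimated directly. The value of $\rho$ and the index pairs are illustrative assumptions.
\begin{verbatim}
# Sampling sketch for Problem 04.02 (ii): simulate (X_m, X_n) = (U/m, V/n)
# with (U, V) standard bivariate normal, correlation rho, and estimate
# E[(X_m - X_n)^2] for growing indices.
import numpy as np

rng = np.random.default_rng(2)
rho, size = 0.6, 500_000

cov = [[1.0, rho], [rho, 1.0]]
u, v = rng.multivariate_normal([0.0, 0.0], cov, size).T

for m, n in ((1, 2), (10, 20), (100, 200)):
    xm, xn = u / m, v / n
    print(f"m={m:3d}, n={n:3d}: E[(X_m - X_n)^2] ~ {np.mean((xm - xn)**2):.6f}")
\end{verbatim}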
(iii) This problem demonstrates that convergence in probability implies convergence in distribution even when the limiting pdf does not exist. Let the random sequence $\{X_k\}$ converge to the random variable X in probability.
• Show that, for any $x \in \mathbb{R}$ and any $\varepsilon \in (0, \infty)$,
$$P[X \leq x - \varepsilon] \leq P[X_k \leq x] + P[|X_k - X| \geq \varepsilon]$$
• Show that, for any $x \in \mathbb{R}$ and any $\varepsilon \in (0, \infty)$,
$$P[X > x + \varepsilon] \leq P[X_k > x] + P[|X_k - X| \geq \varepsilon]$$
• Show that, as $k \to \infty$, $F_X(x; k) \to F_X(x)$ at every point of continuity of $F_X$.
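An illustration of the phenomenon in part (iii), under an assumed construction: $X_k = X + Z_k/k$ with $X \sim$ Bernoulli(1/2) and standard Gaussian $Z_k$ converges in probability to a discrete X whose pdf does not exist, yet the empirical CDFs still converge at continuity points of $F_X$.
\begin{verbatim}
# Illustration for Problem 04.02 (iii): X_k = X + Z_k / k -> X in
# probability, where X ~ Bernoulli(1/2) has no pdf; check the empirical
# CDF of X_k at a continuity point of F_X.
import numpy as np

rng = np.random.default_rng(3)
size = 200_000

x = (rng.random(size) < 0.5).astype(float)       # discrete limit X
for k in (1, 5, 50, 500):
    xk = x + rng.normal(0.0, 1.0, size) / k
    print(f"k={k:4d}: F_Xk(0.2) ~ {np.mean(xk <= 0.2):.4f}  (F_X(0.2) = 0.5)")
\end{verbatim}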
Problem 04.03
Let $\{W_k\}$ be an i.i.d. random sequence with mean 0 and variance $\sigma_W^2$. Let $\{X_n\}$ be the random sequence defined, for $n \in \mathbb{N}$, by
$$X_0 = 0 \quad \text{and} \quad X_n = \rho X_{n-1} + W_n$$
(i) Find the mean of $X_n$ for $n \geq 0$.
(ii) Find the covariance of $X_n$, denoted $K_{XX}[m, n]$, for $m, n \geq 0$.
(iii) Determine for what values of $\rho$ the covariance $K_{XX}[m, n]$ tends to a finite-valued function $g[m - n]$ as m and n become simultaneously large. This situation is called asymptotic stationarity.
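A simulation sketch for this problem: generate sample paths of the recursion and estimate $K_{XX}[m, n]$ at a fixed lag for growing m and n, so the asymptotic-stationarity effect can be seen numerically. The values of $\rho$, $\sigma_W$, the horizon, and the number of paths are illustrative assumptions.
\begin{verbatim}
# Simulation sketch for Problem 04.03: X_0 = 0, X_n = rho X_{n-1} + W_n
# with i.i.d. zero-mean W_n; estimate K_XX[m, n] from independent paths.
import numpy as np

rng = np.random.default_rng(4)
rho, sigma_w, horizon, paths = 0.8, 1.0, 200, 50_000

w = rng.normal(0.0, sigma_w, (paths, horizon))
x = np.zeros((paths, horizon + 1))               # column 0 holds X_0 = 0
for n in range(1, horizon + 1):
    x[:, n] = rho * x[:, n - 1] + w[:, n - 1]

def k_xx(m: int, n: int) -> float:
    """Monte Carlo estimate of K_XX[m, n]."""
    xm, xn = x[:, m], x[:, n]
    return float(np.mean((xm - xm.mean()) * (xn - xn.mean())))

# With |rho| < 1, estimates at a fixed lag should stabilize for large m, n.
for m in (50, 100, 150):
    print(f"K_XX[{m}, {m + 5}] ~ {k_xx(m, m + 5):+.4f}")
\end{verbatim}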
Problem 04.04
Let $\beta_t$ be the standard Wiener process on $[0, \infty)$, with distribution $N(0, t)$ at time t.
(i) Find the joint density $f_\beta(a_1, a_2; t_1, t_2)$ for $0 < t_1 < t_2$.
(ii) Find the conditional density $f_\beta(a_1 \mid a_2; t_1, t_2)$ for $0 < t_1 < t_2$.
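A Monte Carlo sketch for this problem: sample the pair $(\beta_{t_1}, \beta_{t_2})$ via independent increments and check the second moments $E[\beta_t^2] = t$ and $E[\beta_{t_1}\beta_{t_2}] = \min(t_1, t_2)$ that the joint density in (i) is built from. The times and the sample size are illustrative assumptions.
\begin{verbatim}
# Monte Carlo sketch for Problem 04.04: sample (beta_t1, beta_t2) using
# the independent-increment property of the standard Wiener process.
import numpy as np

rng = np.random.default_rng(5)
t1, t2, size = 1.0, 3.0, 1_000_000                 # 0 < t1 < t2

b1 = rng.normal(0.0, np.sqrt(t1), size)            # beta_t1 ~ N(0, t1)
b2 = b1 + rng.normal(0.0, np.sqrt(t2 - t1), size)  # independent increment

print(f"E[b1^2]  ~ {np.mean(b1 * b1):.4f}  (expect {t1})")
print(f"E[b2^2]  ~ {np.mean(b2 * b2):.4f}  (expect {t2})")
print(f"E[b1 b2] ~ {np.mean(b1 * b2):.4f}  (expect {min(t1, t2)})")
\end{verbatim}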
Problem 04.05
Let $\{X_k\}$ be a Martingale sequence on $k \geq 0$.
(i) For all $k \geq 0$ and $m \geq 0$, show that $E[X_{k+m} \mid X_m, \dots, X_0] = X_m$.
(ii) Let $\{Y_k\}$ be a random sequence and let X be a random variable. Let $G_k \triangleq E[X \mid Y_0, \dots, Y_k]$. Show that $\{G_k\}$ is a Martingale sequence on $k \geq 0$.
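A concrete special case of part (ii), under assumed data: with i.i.d. coin flips $Y_0, \dots, Y_{T-1}$ of known bias p and X the total number of heads, $G_k = E[X \mid Y_0, \dots, Y_k]$ has the closed form used below. This setup is an illustrative assumption, not part of the problem statement.
\begin{verbatim}
# Illustration for Problem 04.05 (ii): a Doob martingale
# G_k = E[X | Y_0..Y_k] for coin flips with known bias p and
# X = total heads in T flips.
import numpy as np

rng = np.random.default_rng(6)
p, T = 0.5, 20                               # known bias, horizon

y = rng.random(T) < p                        # one realization of the flips
heads = np.cumsum(y)

# G_k = heads seen through flip k + expected heads in the remaining flips
g = np.array([heads[k] + p * (T - k - 1) for k in range(T)])
print(g)                                     # G_{T-1} equals X = heads[-1]
\end{verbatim}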
(iii) This problem expands the concept of a Martingale sequence that was introduced in the context of the Martingale convergence theorem. Let g be any measurable function and let $G_n \triangleq g(X_0, \dots, X_n)$ for all $n \geq 0$. Then G is called a Martingale with respect to X if
$$E[G_{n+1} \mid X_n, \dots, X_0] = G_n$$
• Show that $P\left[\max_{0 \leq k \leq n} |G_k| \geq \varepsilon\right] \leq \dfrac{E[|G_n|^2]}{\varepsilon^2}$ for all $\varepsilon > 0$ and all $n \in \mathbb{N}$.
• Show that the Martingale convergence theorem holds for a process G that is a Martingale with respect to X.
[Hint: Follow the proof of the Martingale convergence theorem in Chapter 02 of the class notes.]
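A Monte Carlo check of the maximal inequality in part (iii), using the simplest Martingale available (a symmetric $\pm 1$ random walk) rather than a general $g(X_0, \dots, X_n)$; the horizon, threshold, and number of paths are illustrative assumptions.
\begin{verbatim}
# Monte Carlo check for Problem 04.05 (iii): for a symmetric random walk
# G_n = sum of i.i.d. +/-1 steps, compare P[max |G_k| >= eps] with
# E[|G_n|^2] / eps^2.
import numpy as np

rng = np.random.default_rng(7)
n, eps, paths = 100, 25.0, 200_000

steps = rng.choice([-1.0, 1.0], (paths, n))
g = np.cumsum(steps, axis=1)                     # G_1 ... G_n for each path

lhs = np.mean(np.max(np.abs(g), axis=1) >= eps)  # P[max |G_k| >= eps]
rhs = np.mean(g[:, -1] ** 2) / eps**2            # E[|G_n|^2] / eps^2
print(f"P[max|G_k| >= {eps}] ~ {lhs:.4f}  <=  bound ~ {rhs:.4f}")
\end{verbatim}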