Quiz2 Solutions

1. Solution. We have
\begin{align*}
\frac{1}{4} \int_0^\infty \bigl( -(4x+3)e^{-x} + (x+3) \bigr)\, e^{-x} \, dx
&= \frac{1}{4} \left( - \int_0^\infty (4x+3)\, e^{-2x} \, dx + \int_0^\infty (x+3)\, e^{-x} \, dx \right) \\
&= \frac{1}{4} \left( - \left[ (4x+3)\, \frac{e^{-2x}}{-2} \right]_0^\infty - 2 \int_0^\infty e^{-2x} \, dx + \left[ (x+3)\, \frac{e^{-x}}{-1} \right]_0^\infty + \int_0^\infty e^{-x} \, dx \right) \\
&= \frac{1}{4} \left( - \frac{3}{2} - 2 \left[ \frac{e^{-2x}}{-2} \right]_0^\infty + 3 + \left[ \frac{e^{-x}}{-1} \right]_0^\infty \right) \\
&= \frac{1}{4} \left( - \frac{3}{2} - 1 + 3 + 1 \right) = \frac{3}{8} = 0.375.
\end{align*}
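The value 3/8 can be cross-checked numerically. A minimal sketch (the `simpson` helper and the truncation point x = 60 are choices for the check, not part of the quiz):

```python
import math

# Numerical check of the integral in Problem 1. The integrand decays like
# e^{-x}, so truncating the upper limit at x = 60 is safe.
def simpson(f, a, b, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

integrand = lambda x: 0.25 * (-(4 * x + 3) * math.exp(-x) + (x + 3)) * math.exp(-x)
value = simpson(integrand, 0.0, 60.0)
print(value)  # ≈ 0.375
```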
2. Let X and Y be independent random variables. Assume that X has doubly exponential distribution with parameter λ > 0, that is,
\[
f_X(x) = \frac{\lambda}{2} \, e^{-\lambda |x|}, \qquad x \in \mathbb{R},
\]
and Y has Gamma distribution with parameters 2 and λ > 0, that is,
\[
f_Y(y) = \begin{cases} \lambda^2 \, y \, e^{-\lambda y} & \text{if } y > 0; \\ 0 & \text{otherwise.} \end{cases}
\]
Solution. Since X and Y are independent, to compute the density of X + Y we can use the convolution formula
\begin{align*}
f_{X+Y}(z) &= \int_{-\infty}^{\infty} f_X(x) \, f_Y(z - x) \, dx \\
&= \int_{-\infty}^{z} \frac{\lambda}{2} \, e^{-\lambda |x|} \cdot \lambda^2 \, (z - x) \, e^{-\lambda (z - x)} \, dx.
\end{align*}
For z ≤ 0 we have
\begin{align*}
f_{X+Y}(z) &= \frac{\lambda^3}{2} \int_{-\infty}^{z} e^{\lambda x} \, (z - x) \, e^{-\lambda z + \lambda x} \, dx \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \int_{-\infty}^{z} (z - x) \, e^{2\lambda x} \, dx \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left( \left[ (z - x) \, \frac{e^{2\lambda x}}{2\lambda} \right]_{-\infty}^{z} + \int_{-\infty}^{z} \frac{e^{2\lambda x}}{2\lambda} \, dx \right) \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left[ \frac{e^{2\lambda x}}{(2\lambda)^2} \right]_{-\infty}^{z} \\
&= \frac{\lambda^3}{2} \cdot \frac{e^{\lambda z}}{4\lambda^2} = \frac{\lambda}{8} \, e^{\lambda z}.
\end{align*}
For z > 0, we split the integral as
\[
f_{X+Y}(z) = \frac{\lambda^3}{2} \int_{-\infty}^{0} e^{\lambda x} \, (z - x) \, e^{-\lambda(z - x)} \, dx + \frac{\lambda^3}{2} \int_{0}^{z} (z - x) \, e^{-\lambda z} \, dx.
\]
Now, the first integral gives
\begin{align*}
\frac{\lambda^3}{2} \int_{-\infty}^{0} e^{\lambda x} \, (z - x) \, e^{-\lambda(z - x)} \, dx
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left( z \int_{-\infty}^{0} e^{2\lambda x} \, dx - \int_{-\infty}^{0} x \, e^{2\lambda x} \, dx \right) \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left( z \left[ \frac{e^{2\lambda x}}{2\lambda} \right]_{-\infty}^{0} - \left[ x \, \frac{e^{2\lambda x}}{2\lambda} \right]_{-\infty}^{0} + \int_{-\infty}^{0} \frac{e^{2\lambda x}}{2\lambda} \, dx \right) \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left( \frac{z}{2\lambda} + \left[ \frac{e^{2\lambda x}}{(2\lambda)^2} \right]_{-\infty}^{0} \right) \\
&= \frac{\lambda^3}{2} \, e^{-\lambda z} \left( \frac{z}{2\lambda} + \frac{1}{4\lambda^2} \right) = \frac{\lambda^3}{4} \, e^{-\lambda z} \left( \frac{z}{\lambda} + \frac{1}{2\lambda^2} \right).
\end{align*}
The second integral is
\[
\frac{\lambda^3}{2} \, e^{-\lambda z} \int_{0}^{z} (z - x) \, dx = \frac{\lambda^3}{2} \, e^{-\lambda z} \cdot \frac{z^2}{2} = \frac{\lambda^3 z^2}{4} \, e^{-\lambda z},
\]
so, adding the two pieces, for z > 0,
\[
f_{X+Y}(z) = \frac{\lambda}{8} \, e^{-\lambda z} \left( 1 + 2\lambda z + 2\lambda^2 z^2 \right).
\]
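The closed form for z ≤ 0 can be cross-checked by evaluating the convolution integral numerically. A minimal sketch, with λ = 1 and z = −1 chosen for the check:

```python
import math

lam = 1.0  # take λ = 1 for the check

def f_X(x):
    # doubly exponential density (λ/2) e^{-λ|x|}
    return lam / 2 * math.exp(-lam * abs(x))

def f_Y(y):
    # Gamma(2, λ) density λ^2 y e^{-λy} for y > 0
    return lam**2 * y * math.exp(-lam * y) if y > 0 else 0.0

def simpson(f, a, b, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

z = -1.0
# The integrand vanishes for x > z and decays fast as x -> -infinity.
numeric = simpson(lambda x: f_X(x) * f_Y(z - x), -60.0, z)
exact = lam / 8 * math.exp(lam * z)   # λ/8 · e^{λz} for z ≤ 0
print(numeric, exact)
```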
3. Let X and Y be independent discrete random variables, each with Geometric distribution of parameter p ∈ (0, 1), that is, the probability mass functions p_X and p_Y of X and Y are
\[
p_X(k) = p_Y(k) = \begin{cases} (1 - p)^{k-1} \, p & \text{if } k \in \mathbb{N}; \\ 0 & \text{otherwise.} \end{cases}
\]
Mark all correct answers.
Solution. Since X and Y are independent, we can apply the discrete convolution formula, which gives, for k ≥ 2,
\begin{align*}
\mathbb{P}(X + Y = k) &= \sum_{j} p_X(j) \, p_Y(k - j) \\
&= \sum_{j=1}^{k-1} (1 - p)^{j-1} \, p \cdot (1 - p)^{k-j-1} \, p \\
&= p^2 \sum_{j=1}^{k-1} (1 - p)^{k-2} \\
&= p^2 \, (1 - p)^{k-2} \, (k - 1).
\end{align*}
Next, by independence,
\begin{align*}
\mathbb{P}(\min(X, Y) \geq k) &= \mathbb{P}(X \geq k) \, \mathbb{P}(Y \geq k)
= p^2 \left( \sum_{j=k}^{\infty} (1 - p)^{j-k+k-1} \right)^{2} \\
&= p^2 \, (1 - p)^{2(k-1)} \left( \sum_{\ell=0}^{\infty} (1 - p)^{\ell} \right)^{2} \\
&= p^2 \, (1 - p)^{2(k-1)} \left( \frac{1}{1 - (1 - p)} \right)^{2} = (1 - p)^{2(k-1)}.
\end{align*}
Hence,
\begin{align*}
\mathbb{P}(\min(X, Y) = k) &= \mathbb{P}(\min(X, Y) \geq k) - \mathbb{P}(\min(X, Y) \geq k + 1) \\
&= (1 - p)^{2(k-1)} - (1 - p)^{2k} \\
&= (1 - p)^{2(k-1)} \left( 1 - (1 - p)^2 \right) \\
&= \left( (1 - p)^2 \right)^{k-1} \left( 1 - (1 - p)^2 \right),
\end{align*}
so min(X, Y) has Geometric distribution with parameter 1 − (1 − p)².
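Both identities can be sanity-checked by brute force for particular values (here p = 0.3 and k = 5, chosen for the check), reading the second computation as the pmf of min(X, Y):

```python
# Finite-truncation check of the two formulas in Problem 3, with p = 0.3.
p = 0.3
geom = lambda k: (1 - p) ** (k - 1) * p   # Geometric(p) pmf on k = 1, 2, ...

k = 5
# Discrete convolution: P(X + Y = k) = sum_j P(X = j) P(Y = k - j).
conv = sum(geom(j) * geom(k - j) for j in range(1, k))

# P(min(X, Y) = k), truncating the (negligible) geometric tails at 200.
pmin = sum(geom(i) * geom(j)
           for i in range(1, 200) for j in range(1, 200) if min(i, j) == k)

q = 1 - (1 - p) ** 2   # claimed parameter of min(X, Y)
print(conv, p**2 * (1 - p)**(k - 2) * (k - 1))
print(pmin, (1 - q) ** (k - 1) * q)
```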
4. Let X be a continuous random variable with log-normal distribution, that is,
\[
f_X(x) = \begin{cases} \frac{1}{x \sqrt{2\pi}} \, e^{-\frac{1}{2} (\log x)^2} & \text{if } x > 0; \\ 0 & \text{otherwise.} \end{cases}
\]
Determine the value of A = √(2π) · E[|log X|].
Solution. Substituting y = log x (so that dy = dx/x), we have
\begin{align*}
A &= \sqrt{2\pi} \cdot \mathbb{E}\bigl[\,|\log X|\,\bigr] = \sqrt{2\pi} \int_0^\infty \frac{|\log x|}{x \sqrt{2\pi}} \, e^{-\frac{1}{2} (\log x)^2} \, dx \\
&= \int_{-\infty}^{\infty} |y| \, e^{-\frac{1}{2} y^2} \, dy \\
&= \int_0^{\infty} y \, e^{-\frac{1}{2} y^2} \, dy - \int_{-\infty}^{0} y \, e^{-\frac{1}{2} y^2} \, dy \\
&= \int_0^{\infty} z \, e^{-\frac{1}{2} z^2} \, dz + \int_0^{\infty} z \, e^{-\frac{1}{2} z^2} \, dz \\
&= 2 \int_0^{\infty} z \, e^{-\frac{1}{2} z^2} \, dz \\
&= \int_0^{\infty} e^{-w/2} \, dw = \left[ \frac{e^{-w/2}}{-1/2} \right]_0^{\infty} = 2,
\end{align*}
where in the last line we substituted w = z².
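The value A = 2 can be confirmed by integrating |y| e^{−y²/2} numerically; by symmetry it suffices to integrate over [0, ∞) and double (truncation at y = 40 is a choice for the check):

```python
import math

# Numerical check that ∫_R |y| e^{-y^2/2} dy = 2 (the A of Problem 4).
def simpson(f, a, b, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

# By symmetry, integrate y e^{-y^2/2} on [0, 40] and double.
A = 2 * simpson(lambda y: y * math.exp(-y * y / 2), 0.0, 40.0)
print(A)  # ≈ 2.0
```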
5. Let X be a discrete random variable with Binomial distribution with parameters 4 and p ∈ (0, 1). Consider the random variable Y defined as
\[
Y = \cos\bigl( \tfrac{1}{2} \pi X \bigr).
\]
but both p = 0 and p = 1 are excluded. Hence, there is no value of p ∈ (0, 1) such that Y
is discrete uniform on [−1, 1].
Next, we compute
Alternative (6) is not true for all values of p. To see this, note that
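The non-uniformity claim can be verified numerically. Since X takes values 0, …, 4, Y = cos(πX/2) takes only the values −1, 0, 1; reading "discrete uniform on [−1, 1]" as uniform on these three attainable values (an interpretation assumed here), a scan over p shows the pmf of Y always stays bounded away from (1/3, 1/3, 1/3):

```python
import math

def pmf_Y(p):
    """pmf of Y = cos(pi X / 2) for X ~ Binomial(4, p), on values -1, 0, 1."""
    binom = [math.comb(4, k) * p**k * (1 - p) ** (4 - k) for k in range(5)]
    return {
        -1: binom[2],              # cos(pi)   = -1 at X = 2
         0: binom[1] + binom[3],   # cos(pi/2) = cos(3pi/2) = 0
         1: binom[0] + binom[4],   # cos(0)    = cos(2pi)   = 1
    }

# Largest deviation from uniform, minimized over a grid of p in (0, 1).
dev = min(max(abs(q - 1 / 3) for q in pmf_Y(i / 1000).values())
          for i in range(1, 1000))
print(dev)  # stays well above 0, so Y is never uniform on {-1, 0, 1}
```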
6. Let X be a continuous random variable with probability density function given by
\[
f_X(x) = \begin{cases} \frac{3}{4} (1 - x^2) & \text{if } x \in (-1, 1); \\ 0 & \text{otherwise.} \end{cases}
\]
Hence,
\[
\mathbb{E}[X] = 0
\]
and
\[
\mathrm{Var}(X) = \mathbb{E}[X^2] - \mathbb{E}[X]^2 = \frac{3}{4} \int_{-1}^{1} x^2 (1 - x^2) \, dx = \frac{3}{4} \left( \frac{2}{3} - \frac{2}{5} \right) = \frac{3}{4} \cdot \frac{10 - 6}{15} = \frac{1}{5} = 0.2.
\]
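A quick numerical confirmation of the total mass, the mean, and the variance:

```python
# Numerical check for f(x) = (3/4)(1 - x^2) on (-1, 1).
def simpson(f, a, b, n=2000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

f = lambda x: 0.75 * (1 - x * x)
mass = simpson(f, -1.0, 1.0)
mean = simpson(lambda x: x * f(x), -1.0, 1.0)
var = simpson(lambda x: x * x * f(x), -1.0, 1.0) - mean**2
print(mass, mean, var)  # ≈ 1, 0, 0.2
```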
7. Let X and Y be discrete random variables with joint mass function
\[
p_{X,Y}(x, y) = \begin{cases} \frac{4}{25} \, x^2 (y + 2) & \text{if } x = \frac{1}{2}, 1 \text{ and } y = -1, 2; \\ 0 & \text{otherwise.} \end{cases}
\]
Solution.
(e) We have
\[
p_Y(y) = \frac{p_{X,Y}(x, y)}{p_X(x)} = \frac{1}{5} (y + 2).
\]
Then,
\[
\mathbb{E}[Y] = -1 \cdot \frac{1}{5}(-1 + 2) + 2 \cdot \frac{1}{5}(2 + 2) = -\frac{1}{5} + \frac{8}{5} = \frac{7}{5}.
\]
(f) Since X and Y are independent, and p_X(x) = \frac{4}{5} x^2 gives \mathbb{E}[X] = \frac{1}{2} \cdot \frac{1}{5} + 1 \cdot \frac{4}{5} = \frac{9}{10}, we obtain
\[
\mathbb{E}[XY] = \mathbb{E}[X] \cdot \mathbb{E}[Y] = \frac{9}{10} \cdot \frac{7}{5} = \frac{63}{50}.
\]
Correct alternatives: a, b, d, f
Grading: +2.5 for marking each correct alternative, −2.5 for marking each incorrect alternative (but the grade cannot be negative).
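The marginals, the independence claim, and both expectations can be checked exactly with rational arithmetic:

```python
from fractions import Fraction as F

# Exact check of Problem 7 using rational arithmetic.
xs = [F(1, 2), F(1)]
ys = [F(-1), F(2)]
p = {(x, y): F(4, 25) * x**2 * (y + 2) for x in xs for y in ys}
assert sum(p.values()) == 1                       # valid joint pmf

pX = {x: sum(p[(x, y)] for y in ys) for x in xs}  # marginal of X
pY = {y: sum(p[(x, y)] for x in xs) for y in ys}  # marginal of Y
assert all(p[(x, y)] == pX[x] * pY[y] for x in xs for y in ys)  # independent

EY = sum(y * pY[y] for y in ys)
EXY = sum(x * y * p[(x, y)] for x in xs for y in ys)
print(EY, EXY)  # 7/5 63/50
```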
8. Let X and Y be jointly continuous random variables with joint density
\[
f_{X,Y}(x, y) = \begin{cases} \frac{1}{3} (x + y) & \text{if } x \in (0, 1) \text{ and } y \in (0, 2); \\ 0 & \text{otherwise.} \end{cases}
\]
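As a sanity check that this is a valid density, it integrates to 1 over the rectangle (0, 1) × (0, 2); a minimal numerical confirmation via the midpoint rule (grid size n = 400 is a choice for the check, and the rule is exact per cell since the integrand is linear):

```python
# Check that f(x, y) = (x + y)/3 integrates to 1 over (0, 1) x (0, 2).
# Exactly: ∫_0^1 ∫_0^2 (x + y)/3 dy dx = ∫_0^1 (2x + 2)/3 dx = (1 + 2)/3 = 1.
n = 400
h1, h2 = 1 / n, 2 / n
total = sum((h1 * (i + 0.5) + h2 * (j + 0.5)) / 3 * h1 * h2
            for i in range(n) for j in range(n))
print(total)  # ≈ 1.0
```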
9. Let X_1, \ldots, X_{10} be independent discrete random variables such that for all n,
\[
p_{X_n}(k) = \begin{cases}
\frac{1}{16} & \text{if } k = 0; \\
\frac{1}{4} & \text{if } k = 1; \\
\frac{1}{2} & \text{if } k = 2; \\
\frac{3}{16} & \text{if } k = 3.
\end{cases}
\]
Compute E[Y], where Y = X_1 + \cdots + X_{10}.
Solution. We have, for each n,
\[
\mathbb{E}[X_n] = 0 \cdot \frac{1}{16} + 1 \cdot \frac{1}{4} + 2 \cdot \frac{1}{2} + 3 \cdot \frac{3}{16} = \frac{1}{4} + 1 + \frac{9}{16} = \frac{29}{16}.
\]
Then,
\[
\mathbb{E}[Y] = \mathbb{E}\left[ \sum_{n=1}^{10} X_n \right] = \sum_{n=1}^{10} \mathbb{E}[X_n] = 10 \cdot \frac{29}{16} = \frac{290}{16} = 18.125.
\]
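An exact arithmetic check of E[X_n] and E[Y] (taking Y = X_1 + ⋯ + X_10, as in the solution):

```python
from fractions import Fraction as F

# Exact check of Problem 9 with rational arithmetic.
pmf = {0: F(1, 16), 1: F(1, 4), 2: F(1, 2), 3: F(3, 16)}
assert sum(pmf.values()) == 1      # valid pmf

EX = sum(k * q for k, q in pmf.items())
print(EX, 10 * EX)  # 29/16 145/8  (and 145/8 = 18.125)
```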
10. Let X be a Gaussian random variable with mean 0 and variance 4, that is,
\[
f_X(x) = \frac{1}{\sqrt{8\pi}} \cdot e^{-x^2/8}, \qquad x \in \mathbb{R}.
\]
Let Y = X².
Recall that if Z is a Gaussian random variable with mean µ and variance σ² then
\[
1 = \int_{-\infty}^{\infty} f_Z(z) \, dz = \frac{1}{\sqrt{2\pi\sigma^2}} \int_{-\infty}^{\infty} e^{-\frac{(z - \mu)^2}{2\sigma^2}} \, dz.
\]
Mark all the correct answers.
(a) The moment-generating function M_Y(t) of Y is well defined and finite for all t ∈ ℝ.
(b) The moment-generating function M_Y(t) of Y is well defined and finite for t < 1/8.
(c) E[Y] = 4.
(d) Var(X) = 16.
(e) None of the above.
Solution. For t < 1/8 we have
\[
M_Y(t) = \mathbb{E}\bigl[e^{tX^2}\bigr] = \int_{-\infty}^{\infty} \frac{1}{\sqrt{8\pi}} \, e^{tx^2 - x^2/8} \, dx = \frac{1}{\sqrt{1 - 8t}} \int_{-\infty}^{\infty} \sqrt{\frac{1 - 8t}{8\pi}} \, e^{-\frac{x^2 (1 - 8t)}{8}} \, dx.
\]
Now note that the expression inside the integral is the density function of a Gaussian with mean 0 and variance σ² = 4/(1 − 8t), so the integral equals 1. Therefore,
\[
M_Y(t) = \frac{1}{\sqrt{1 - 8t}}.
\]
(For t ≥ 1/8 the integrand no longer decays, so the integral diverges and M_Y(t) = +∞.)
To determine the moments, we compute the derivatives of M_Y:
\[
M_Y'(t) = \frac{1}{2} \cdot \frac{8}{(1 - 8t)^{3/2}} = \frac{4}{(1 - 8t)^{3/2}},
\qquad
M_Y''(t) = 4 \cdot \frac{3}{2} \cdot \frac{8}{(1 - 8t)^{5/2}} = \frac{48}{(1 - 8t)^{5/2}}.
\]
Then,
\[
\mathbb{E}[Y] = M_Y'(0) = 4, \qquad \mathbb{E}[Y^2] = M_Y''(0) = 48,
\]
so
\[
\mathrm{Var}(Y) = \mathbb{E}[Y^2] - \mathbb{E}[Y]^2 = 48 - 16 = 32.
\]
Correct alternatives: b, c
Grading: 0 points if "None of the above" was marked. Marking (b) and (c) is worth 5 points each, marking (a) and (d) is worth −5 points each (but the grade cannot be negative).
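The MGF formula and the moments can be cross-checked numerically (with X ~ N(0, 4) one has E[X²] = 4 and E[X⁴] = 3σ⁴ = 48; the value t = 0.05 and the truncation at |x| = 60 are choices for the check):

```python
import math

# Numerical check of Problem 10: X ~ N(0, 4), Y = X^2,
# M_Y(t) = E[e^{tX^2}] should equal 1/sqrt(1 - 8t) for t < 1/8.
def simpson(f, a, b, n=20000):
    """Composite Simpson's rule with n (even) subintervals."""
    h = (b - a) / n
    s = f(a) + f(b)
    s += 4 * sum(f(a + (2 * i - 1) * h) for i in range(1, n // 2 + 1))
    s += 2 * sum(f(a + 2 * i * h) for i in range(1, n // 2))
    return s * h / 3

f_X = lambda x: math.exp(-x**2 / 8) / math.sqrt(8 * math.pi)

t = 0.05
M = simpson(lambda x: math.exp(t * x**2) * f_X(x), -60.0, 60.0)
EY = simpson(lambda x: x**2 * f_X(x), -60.0, 60.0)
EY2 = simpson(lambda x: x**4 * f_X(x), -60.0, 60.0)
print(M, 1 / math.sqrt(1 - 8 * t))  # both ≈ 1.291
print(EY, EY2)                      # ≈ 4 and ≈ 48
```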