

UNIVERSITY OF WARWICK

Paper Details

Paper code: ST119

Paper Title: PROBABILITY 2

Exam Period: Summer 2024

Exam Rubric

Time Allowed: 2 hours plus 15 minutes reading time


During the 15 minutes reading time, you are not permitted to begin answering
questions in the answer booklet. You may read and make notes on the question
paper.

Exam Type: Standard Examination

Approved calculators may be used.

Additional Stationery

Cambridge Statistical Tables

Instructions

Full marks may be obtained by correctly answering all questions.


There are a total of 100 marks available.
A guideline to the number of marks available is shown for each question section.
Be careful in crossing out work. Crossed out work will NOT be marked.


1. (a) Let X and Y be two random variables with E(X) = 50, Var(X) = 100, E(Y ) =
60, Var(Y ) = 400 and Cov(X, Y ) = 150. Two other random variables, W and
V , are defined by W = X − Y and V = X + Y . Find E(W ), Var(W ), E(V ),
Var(V ) and Cov(W, V ).
[9 marks]

E(W) = E(X) − E(Y) = 50 − 60 = −10
[2 marks]
Var(W) = Var(X) + Var(Y) − 2 Cov(X, Y) = 100 + 400 − 300 = 200
[2 marks]
E(V) = E(X) + E(Y) = 50 + 60 = 110
[1 mark]
Var(V) = Var(X) + Var(Y) + 2 Cov(X, Y) = 100 + 400 + 300 = 800
[2 marks]
Cov(W, V) = Cov(X − Y, X + Y) = Var(X) − Var(Y) = 100 − 400 = −300
[2 marks]
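A quick simulation can sanity-check these identities. The sketch below is not part of the model solution; it assumes a bivariate Normal with the stated moments purely for illustration, though any joint distribution with these moments would behave the same way.

import numpy as np

# Assumed for illustration: bivariate Normal with the moments given in (a).
rng = np.random.default_rng(0)
mean = [50, 60]
cov = [[100, 150], [150, 400]]   # [[Var(X), Cov(X,Y)], [Cov(X,Y), Var(Y)]]
X, Y = rng.multivariate_normal(mean, cov, size=1_000_000).T
W, V = X - Y, X + Y

print(W.mean(), W.var())         # approx -10 and 200
print(V.mean(), V.var())         # approx 110 and 800
print(np.cov(W, V)[0, 1])        # approx Var(X) - Var(Y) = -300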

(b) F is the distribution function of a random variable X and is defined by


F(x) = 0     if x < 0
       1/4   if 0 ≤ x < 2/3
       7/12  if 2/3 ≤ x < 1
       2/3   if 1 ≤ x < 2
       1     if x ≥ 2

(i) Define the term “distribution function”.

If X is a random variable then the distribution function of X is defined by


F (x) = P(X ≤ x) for x ∈ R.
[2 marks]

(ii) Find f , the probability mass function of X.

f(0) = 1/4, f(2/3) = 1/3, f(1) = 1/12, f(2) = 1/3.
[4 marks]

(iii) Compute P(X > 2/3) and E(X).

P(X > 2/3) = 1 − F(2/3) = 1 − 7/12 = 5/12.
[2 marks]
E(X) = 0 × 1/4 + (2/3) × 1/3 + 1 × 1/12 + 2 × 1/3 = 2/9 + 1/12 + 2/3 = 35/36 ≈ 0.97.
[2 marks]


[10 marks]
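The jump sizes of F are exactly the pmf values, which a few lines of numpy (an illustrative check, not part of the paper) can confirm, along with the value of E(X):

import numpy as np

xs = np.array([0, 2/3, 1, 2])        # jump points of F
F  = np.array([1/4, 7/12, 2/3, 1])   # value of F at each jump
pmf = np.diff(F, prepend=0.0)        # jump sizes: 1/4, 1/3, 1/12, 1/3
print(pmf, pmf.sum())                # pmf sums to 1
print(xs @ pmf)                      # E(X) = 35/36, approx 0.972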
(c) Let Y be a random variable with probability density function f given by
f(y) = c/(1+y)⁴ if y > 0, and f(y) = 0 otherwise,

for some constant c.


(i) What is the distribution function of Y [give your answer in terms of c]?
For t ≥ 0: F(t) = ∫₀ᵗ c/(1+x)⁴ dx = (c/3)[1 − 1/(1+t)³]; for t < 0, F(t) = 0.
[2 marks]

(ii) What is the value of c?

F(+∞) = c/3 = 1 and so c = 3.
[2 marks]

(iii) Compute P(1 < Y < 3), E(Y + 1) and E((Y + 1)²). Deduce Var(Y).

P(1 < Y < 3) = ∫₁³ 3/(1+x)⁴ dx = [−1/(1+x)³]₁³ = 1/8 − 1/64 = 7/64.
[2 marks]
E(Y + 1) = ∫₀^∞ 3/(1+x)³ dx = 3/2 and E((Y + 1)²) = ∫₀^∞ 3/(1+x)² dx = 3.
[4 marks]
Hence Var(Y) = Var(Y + 1) = 3 − 9/4 = 3/4.

[11 marks]
[Total Marks 30]
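Since F(t) = 1 − (1+t)⁻³ for t ≥ 0, Y can be sampled by inverse transform: if U is Uniform(0, 1), then U^(−1/3) − 1 has distribution function F. The sketch below (an illustration, not part of the model solution) checks the answers by simulation; the E((Y+1)²) estimate converges slowly because (Y+1)² is heavy-tailed.

import numpy as np

rng = np.random.default_rng(1)
u = rng.uniform(size=1_000_000)
y = u ** (-1/3) - 1                  # inverse of F(t) = 1 - (1+t)^(-3)
print(np.mean((1 < y) & (y < 3)))    # approx 7/64, about 0.109
print((y + 1).mean())                # approx 3/2
print(((y + 1) ** 2).mean())         # approx 3 (slow convergence: heavy tail)
print(y.var())                       # approx 3/4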


2. (a) The random variable X is discrete, with probability mass function f , while
the random variable Y is continuous with probability density function g. Give
expressions for the expectations of ϕ(X) and ϕ(Y).
[4 marks]

E(ϕ(X)) = Σ_{x ∈ supp(X)} ϕ(x) f(x)

E(ϕ(Y)) = ∫_{−∞}^{∞} ϕ(y) g(y) dy
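As a concrete (purely illustrative) instance of the continuous formula, take g to be the Exp(1) density and ϕ(y) = y², so the integral should return E(Y²) = 2:

import numpy as np
from scipy.integrate import quad

phi = lambda y: y ** 2
g = lambda y: np.exp(-y)                        # Exp(1) density on [0, infinity)
val, err = quad(lambda y: phi(y) * g(y), 0, np.inf)
print(val)                                      # approx 2.0 = E(Y^2) for Y ~ Exp(1)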

(b) Suppose X1, . . . , Xn are independent exponential random variables with Xi being Exp(θi) for i = 1, . . . , n. The random variables Y and S are defined by

Y = min(X1, . . . , Xn),
S = Σ_{i=1}^{n} Xi.

(i) Give the probability density function, fi , of Xi and calculate P(Xi > y).
fi(x) = θi e^(−θi x) if x ≥ 0, and fi(x) = 0 otherwise.

Hence P(Xi > y) = ∫_y^∞ θi e^(−θi x) dx = e^(−θi y).
[4 marks]

(ii) Find P(Y > y) and hence show that Y is an exponential random variable.
P(Y > y) = P(∩_{i=1}^n {Xi > y}). Since the Xi are independent this is ∏_{i=1}^n e^(−θi y) = e^(−(Σ_{i=1}^n θi) y). Hence Y ∼ Exp(θ) where θ = Σ_{i=1}^n θi.
[4 marks]
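A short simulation (illustration only; the rates below are arbitrary) makes the result concrete: the minimum of independent exponentials behaves like a single exponential with rate Σθi.

import numpy as np

rng = np.random.default_rng(2)
theta = np.array([0.5, 1.0, 2.5])                    # arbitrary rates; sum is 4.0
X = rng.exponential(1 / theta, size=(1_000_000, 3))  # numpy uses scale = 1/rate
Y = X.min(axis=1)
print(Y.mean())                                      # approx 1/4 = 1/(sum of rates)
print(np.mean(Y > 0.5), np.exp(-4 * 0.5))            # both approx P(Y > 0.5)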

(iii) Find E(Y), Var(Y) and the moment generating function of Y [Recall that the moment generating function, M, of Y is given by M(t) = E(e^(tY))].
E(Y) = ∫₀^∞ θy e^(−θy) dy = ∫₀^∞ e^(−θy) dy = 1/θ.
E(Y²) = ∫₀^∞ θy² e^(−θy) dy = ∫₀^∞ 2y e^(−θy) dy = 2/θ². Hence Var(Y) = 1/θ².

M(t) = ∫₀^∞ θ e^(−θy) e^(ty) dy = θ/(θ − t) if t < θ, and M(t) = ∞ if t ≥ θ.
[4 marks]
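The MGF formula is easy to check empirically for any t < θ (a sketch only; θ and t below are arbitrary choices for illustration):

import numpy as np

rng = np.random.default_rng(3)
theta, t = 2.0, 0.7                           # any t < theta
y = rng.exponential(1 / theta, size=1_000_000)
print(np.exp(t * y).mean())                   # empirical E(e^{tY})
print(theta / (theta - t))                    # exact value, approx 1.538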

(iv) Calculate E(S).


E(S) = E(Σ_{i=1}^n Xi) = Σ_{i=1}^n E(Xi) = Σ_{i=1}^n 1/θi.
[2 marks]

(v) Suppose that θ1 = . . . = θn = λ. State the distribution of S.

S is a Γ(n, λ) random variable.


[2 marks]

[16 marks]
[Total Marks 20]
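One way to see this numerically (an illustrative check, with arbitrary n and λ) is to compare simulated sums against scipy's Gamma distribution, which is parameterised by shape a = n and scale = 1/λ:

import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, lam = 5, 2.0
S = rng.exponential(1 / lam, size=(200_000, n)).sum(axis=1)
ks = stats.kstest(S, stats.gamma(a=n, scale=1 / lam).cdf)
print(ks.statistic)                  # close to 0: S matches Gamma(n, lambda)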

3. (a) Define the covariance, Cov(X, Y ), and the correlation coefficient, ρ(X, Y ),
between two random variables X and Y .
[5 marks]

Cov(X, Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X)E(Y) and then

ρ = ρ(X, Y) = Cov(X, Y) / √(Var(X) Var(Y)).

(b) Prove the Cauchy-Schwarz inequality:

Cov(X, Y) ≤ √(Var(X) · Var(Y)),

and deduce that |ρ| ≤ 1.
[5 marks]

Define
φ(t) := Var(X − tY), t ∈ R.
We have

φ(t) = Cov(X − tY, X − tY) = Cov(X, X) − 2t Cov(X, Y) + t² Cov(Y, Y)
     = Var(X) − 2t Cov(X, Y) + t² Var(Y).

Note that φ(t) ≥ 0 for all t, since a variance is always non-negative. This is a quadratic function in t, so the discriminant, (−2 Cov(X, Y))² − 4 Var(Y) Var(X), must be less than or equal to zero. This gives the desired inequality; dividing both sides by √(Var(X) Var(Y)) gives ρ ≤ 1, and applying the same argument with −Y in place of Y gives ρ ≥ −1, so |ρ| ≤ 1.
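The inequality is easy to confirm on simulated data (a sketch only; the dependent pair below is an arbitrary choice):

import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(size=100_000)
y = 0.3 * x + rng.normal(size=100_000)    # an arbitrary dependent pair
c = np.cov(x, y)
lhs = abs(c[0, 1])                        # |Cov(X, Y)|
rhs = np.sqrt(c[0, 0] * c[1, 1])          # sqrt(Var(X) Var(Y))
print(lhs <= rhs, lhs / rhs)              # True, and the ratio |rho| <= 1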

(c) The random variables X1, . . . , Xn are identically distributed (but not independent) with common mean 0 and variance 1. For i ̸= j, Cov(Xi, Xj) = ρ. The sum of the Xi, Σ_{i=1}^n Xi, is denoted Sn. Give an expression for Var(Sn) and deduce (using the non-negativity of variance) that ρ ≥ −1/(n − 1).


[5 marks]
vn = Var(Sn) = Σ_{i=1}^n Var(Xi) + 2 Σ_{1≤i<j≤n} Cov(Xi, Xj) = n + n(n − 1)ρ. Now vn ≥ 0 so

n(n − 1)ρ ≥ −n  ⇒  ρ ≥ −1/(n − 1).

(d) X and Y are continuous random variables. X is Exp(1). Y has conditional density

fY|X(y|x) = x e^(−xy) for y > 0, and fY|X(y|x) = 0 for y ≤ 0.
What is the joint density function for X and Y , fX,Y ? Deduce the probability
density function for Y , fY .
[5 marks]

Since fY|X(y|x) = fX,Y(x, y)/fX(x), and fX(x) = e^(−x) for positive x,

fX,Y(x, y) = x e^(−x(1+y)) for x, y > 0 and 0 otherwise.

It follows that, for y > 0, integrating by parts:

fY(y) = ∫_{−∞}^{∞} fX,Y(x, y) dx = ∫₀^∞ x e^(−x(1+y)) dx
      = [−x e^(−x(1+y))/(1 + y)]_{x=0}^{∞} + ∫₀^∞ e^(−x(1+y))/(1 + y) dx
      = [−e^(−x(1+y))/(1 + y)²]_{x=0}^{∞} = 1/(1 + y)².
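The marginal can be verified by simulating the hierarchy directly (an illustration, not part of the model solution): draw X ~ Exp(1), then Y | X = x ~ Exp(x), and compare with P(Y ≤ 1) = ∫₀¹ (1 + y)⁻² dy = 1/2.

import numpy as np

rng = np.random.default_rng(6)
x = rng.exponential(1.0, size=1_000_000)   # X ~ Exp(1)
y = rng.exponential(1 / x)                 # Y | X = x ~ Exp(x) (scale = 1/rate)
print(np.mean(y <= 1))                     # approx 1/2 = P(Y <= 1) under f_Y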

[Total Marks 20]


4. (a) Let Y be a discrete random variable taking values in the non-negative integers
Z+ .
(i) Prove Markov's Inequality: for any k > 0,

P(Y ≥ k) ≤ E(Y)/k.

Let f(y) be the probability mass function of Y. Then

P(Y ≥ k) = Σ_{y=k}^∞ f(y) ≤ Σ_{y=k}^∞ (y/k) f(y) ≤ (1/k) Σ_{y=0}^∞ y f(y) = E[Y]/k.
[5 marks]
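A quick empirical look at the bound (illustration only; the Poisson(3) choice is an arbitrary non-negative integer-valued law):

import numpy as np

rng = np.random.default_rng(7)
y = rng.poisson(3.0, size=1_000_000)       # E(Y) = 3
for k in [2, 5, 10]:
    print(k, np.mean(y >= k), 3.0 / k)     # empirical P(Y >= k) vs bound E(Y)/k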

(ii) Suppose P(Y = 0) < 1 and k is a positive integer. Show that

P(Y ≥ k) = E(Y)/k  if and only if  P(Y = k) = 1 − P(Y = 0).

E[Y]/k = f(k) + Σ_{y=k+1}^∞ (y/k) f(y) + Σ_{y=0}^{k−1} (y/k) f(y). Note that (y/k) f(y) > f(y) for y > k whenever f(y) > 0, and that Σ_{y=0}^{k−1} (y/k) f(y) > 0 unless f(y) = 0 for 1 ≤ y < k. Thus P(Y ≥ k) = f(k) + Σ_{y=k+1}^∞ f(y) equals E[Y]/k if and only if f(y) = 0 for y > k and for 1 ≤ y < k. It follows that P(Y ≥ k) = E[Y]/k if and only if f(k) = 1 − f(0).
[5 marks]

[10 marks]
(b) Suppose X1, . . . , Xn are independent and identically distributed, each with mean µ and variance σX² = 16. Moreover, Y1, . . . , Yn are independent and identically distributed and independent of X1, . . . , Xn, also with mean µ but with variance σY² = 9. Let Ui = Xi − Yi and Zn = (1/n) Σ_{i=1}^n Ui.
(i) Determine E(Ui ) and Var(Ui ).

E(Ui ) = E(Xi ) − E(Yi ) = 0.


[2 marks]
Var(Ui ) = Var(Xi ) + Var(Yi ) = 25.
[3 marks]

(ii) Quote Chebychev’s inequality and use it to prove P(|Ui | < 10) ≥ 0.75.

Let µ = E(X). Then Chebychev's inequality states P(|X − µ| ≥ ϵ) ≤ Var(X)/ϵ².
[2 marks]
P(|Ui | ≥ 10) ≤ Var(Ui )/100 = 0.25 and so P(|Ui | < 10) = 1 − P(|Ui | ≥
10) ≥ 0.75.
[3 marks]
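The bound can be compared with the truth for a concrete choice of distribution (a sketch; Normal Xi and Yi with the stated variances are an assumption, and the common mean cancels in Ui):

import numpy as np

rng = np.random.default_rng(8)
mu = 1.0                                   # hypothetical common mean
x = rng.normal(mu, 4.0, size=1_000_000)    # Var = 16
y = rng.normal(mu, 3.0, size=1_000_000)    # Var = 9
u = x - y                                  # mean 0, variance 25
print(np.mean(np.abs(u) < 10))             # approx 0.95, above the 0.75 bound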

(iii) Determine E(Zn) and Var(Zn). Use Chebychev's inequality to give an upper bound for N, the minimal sample size n for which P(|Zn| ≥ 0.5) ≤ 0.01.


E(Zn) = (1/n) Σ_{i=1}^n E(Ui) = 0 and Var(Zn) = (1/n²) Σ_{i=1}^n Var(Ui) = 25/n.
[2 marks]
P(|Zn| ≥ 0.5) ≤ (25/n) × 4 = 100/n ≤ 0.01 if n ≥ 10,000, so N ≤ 10,000.
[3 marks]

(iv) State the Central Limit Theorem for U1, . . . and use it to approximate N (defined in part (iii) above). You may use the fact that Φ(1.64) = 0.95, Φ(2.33) = 0.99, and Φ(2.58) = 0.995, where Φ is the distribution function of a standard Normal random variable.
Σ_{i=1}^n Ui / √(n Var(U1)) = √n Zn / 5 is asymptotically Normally distributed with mean 0 and variance 1. P(|Zn| ≥ 0.5) ≈ 2[1 − Φ(0.5√n / 5)] ≤ 0.01 iff Φ(0.5√n / 5) ≥ 0.995 iff 0.5√n / 5 ≥ 2.58 iff n ≥ 666, so N ≈ 666.
[5 marks]

[20 marks]
[Total Marks 30]
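Solving 2[1 − Φ(0.5√n/5)] = 0.01 numerically, and simulating Zn under its Normal approximation, confirms the answer (a sketch only; note that the exact quantile 2.5758, rather than the tabulated 2.58, gives a slightly smaller n):

import numpy as np
from scipy.stats import norm

n_star = (10 * norm.ppf(0.995)) ** 2
print(n_star)                              # approx 663.5 (tabulated 2.58 gives 666)

rng = np.random.default_rng(9)
n = 666
z = rng.normal(0, np.sqrt(25 / n), size=1_000_000)  # Zn approx N(0, 25/n)
print(np.mean(np.abs(z) >= 0.5))           # approx 0.01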

End
