
Chapter 6

More than one variable

6.1 Bivariate discrete distributions


Suppose that the r.v.s X and Y are discrete and take on the values x_i and y_j, i, j ≥ 1, respectively. Then the joint p.d.f. of X and Y, to be denoted by f_{X,Y}, is defined
by: f_{X,Y}(x_i, y_j) = P(X = x_i, Y = y_j) and f_{X,Y}(x, y) = 0 when (x, y) ≠ (x_i, y_j)
(i.e., at least one of x or y is not equal to some x_i or y_j, respectively).
The marginal distribution of X is defined by the probability function

P(X = x_i) = Σ_j P(X = x_i, Y = y_j),

and similarly the marginal distribution of Y by

P(Y = y_j) = Σ_i P(X = x_i, Y = y_j).

Note that P(X = x_i) ≥ 0 and Σ_i P(X = x_i) = 1. The mean and variance of
X can be defined in the usual way.
The conditional distribution of X given Y = y_j is defined by the probability
function

P(X = x_i | Y = y_j) = P(X = x_i, Y = y_j) / P(Y = y_j).

The conditional mean of X given Y = y_j is defined by

E[X | Y = y_j] = Σ_i x_i P(X = x_i | Y = y_j),

and similarly for the variance:

Var[X | Y = y_j] = E[X² | Y = y_j] - (E[X | Y = y_j])².


Although E[X | Y = y_j] depends on the particular value y_j of Y, it turns out
that its average does not and, indeed, is the same as E[X]. More precisely, it
holds:

E[E(X | Y)] = E[X] and E[E(Y | X)] = E[Y].


That is, the expectation of the conditional expectation of X is equal to its
expectation, and likewise for Y.
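
As a numeric illustration of this identity, here is a minimal Python sketch (not part of the original notes; the joint table in it is made up for the example). It computes E[X] directly, then the conditional means E[X | Y = y] and their average over the distribution of Y, and the two results agree.

# A minimal sketch (hypothetical joint table) verifying E[E(X|Y)] = E[X].
joint = {                      # keys: (x, y), values: P(X = x, Y = y)
    (0, 0): 0.1, (0, 1): 0.2,
    (1, 0): 0.3, (1, 1): 0.4,
}

# Marginal distribution of Y: P(Y = y) = sum over x of P(X = x, Y = y)
pY = {}
for (x, y), p in joint.items():
    pY[y] = pY.get(y, 0.0) + p

# E[X] computed directly from the joint distribution
EX = sum(x * p for (x, y), p in joint.items())

# Conditional means E[X | Y = y] = sum_x x * P(X = x, Y = y) / P(Y = y)
cond_mean = {y: sum(x * p for (x, yy), p in joint.items() if yy == y) / pY[y]
             for y in pY}

# Average of the conditional means, weighted by P(Y = y)
E_of_condE = sum(cond_mean[y] * pY[y] for y in pY)

print(EX, E_of_condE)   # both equal 0.7 for this table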
The covariance of X and Y is defined by

Cov[X, Y] = E[(X - E[X])(Y - E[Y])] = E[XY] - E[X]E[Y],

where

E[XY] = Σ_i Σ_j x_i y_j P(X = x_i, Y = y_j).

The result obtained next provides the range of values of the covariance of two
r.v.s; it is also referred to as a version of the Cauchy-Schwarz inequality.

Theorem 6.1 (Cauchy-Schwarz inequality)

1. Consider the r.v.s X and Y with E[X] = E[Y] = 0 and Var[X] = Var[Y] = 1.
Then always -1 ≤ E[XY] ≤ 1, and E[XY] = 1 if and only if P(X = Y) = 1,
and E[XY] = -1 if and only if P(X = -Y) = 1.

2. For any r.v.s X and Y with finite expectations and positive variances σ_X² and
σ_Y², it always holds: -σ_X σ_Y ≤ Cov(X, Y) ≤ σ_X σ_Y, and Cov(X, Y) = σ_X σ_Y
if and only if P[Y = E[Y] + (σ_Y/σ_X)(X - E[X])] = 1, Cov(X, Y) = -σ_X σ_Y if and
only if P[Y = E[Y] - (σ_Y/σ_X)(X - E[X])] = 1.

The correlation coefficient between X and Y is defined by


  
corr[X, Y] = E[((X - E[X])/σ_X) · ((Y - E[Y])/σ_Y)] = Cov[X, Y]/(σ_X σ_Y) = (E[XY] - E[X]E[Y])/(σ_X σ_Y).

The correlation always lies between -1 and +1.
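
For joint distributions given by a table, the covariance and correlation can be evaluated directly from the definitions above. The Python sketch below is an illustration, not part of the original notes, and the joint table passed to it is hypothetical; the printed correlation always falls in [-1, 1], as stated.

import math

def cov_corr(joint):
    """Covariance and correlation of (X, Y) from a dict {(x, y): P(X = x, Y = y)}."""
    EX  = sum(x * p for (x, y), p in joint.items())
    EY  = sum(y * p for (x, y), p in joint.items())
    EXY = sum(x * y * p for (x, y), p in joint.items())
    VX  = sum((x - EX) ** 2 * p for (x, y), p in joint.items())
    VY  = sum((y - EY) ** 2 * p for (x, y), p in joint.items())
    cov = EXY - EX * EY
    return cov, cov / math.sqrt(VX * VY)

# Hypothetical joint table, just to exercise the function
cov, rho = cov_corr({(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4})
print(cov, rho)          # rho always lies in [-1, 1]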

Example 6.1

Let X and Y be two r.v.s with finite expectations and equal (finite) variances,
and set U = X + Y and V = X - Y. Determine whether the r.v.s U and V are correlated.

Solution

E[UV] = E[(X + Y)(X - Y)] = E[X² - Y²] = E[X²] - E[Y²]

E[U]E[V] = E[X + Y] · E[X - Y] = (E[X] + E[Y])(E[X] - E[Y]) = (E[X])² - (E[Y])²

Cov(U, V) = E[UV] - E[U]E[V] = (E[X²] - E[X]²) - (E[Y²] - E[Y]²)
= Var(X) - Var(Y) = 0

Hence U and V are uncorrelated.
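
A quick simulation also confirms this conclusion. The construction below is illustrative only and not from the text: X = D1 + D2 and Y = D2 + D3 for three independent fair dice is an assumed choice that makes X and Y dependent but with equal variances, so the sample covariance of U and V should stay near zero.

import random

# Simulation sketch for Example 6.1: X and Y are dependent (they share D2)
# but have equal variances, so Cov(U, V) should be (approximately) zero.
random.seed(1)
n = 200_000
us, vs = [], []
for _ in range(n):
    d1, d2, d3 = (random.randint(1, 6) for _ in range(3))
    x, y = d1 + d2, d2 + d3
    us.append(x + y)
    vs.append(x - y)

mu_u = sum(us) / n
mu_v = sum(vs) / n
cov_uv = sum((u - mu_u) * (v - mu_v) for u, v in zip(us, vs)) / n
print(cov_uv)    # close to 0, as the example predicts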

For two r.v.s X and Y with finite expectations and (positive) standard deviations σ_X and σ_Y, it holds:

Var(X + Y) = σ_X² + σ_Y² + 2Cov(X, Y),

and

Var(X + Y) = σ_X² + σ_Y²

if X and Y are uncorrelated.


Proof

Var(X + Y) = E[(X + Y) - E(X + Y)]² = E[(X - E[X]) + (Y - E[Y])]²

= E(X - E[X])² + E(Y - E[Y])² + 2E[(X - E[X])(Y - E[Y])]

= σ_X² + σ_Y² + 2Cov(X, Y).

Random variables X and Y are said to be independent if

P(X = x_i, Y = y_j) = P(X = x_i)P(Y = y_j)

for all i and j.
If X and Y are independent then Cov[X, Y ] = 0. The converse is NOT true.
There exist many pairs of random variables with Cov[X, Y ] = 0 which are not
independent.
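
A standard counterexample (not taken from the text) is X uniform on {-1, 0, 1} and Y = X²: Y is completely determined by X, yet Cov[X, Y] = 0. The short Python check below enumerates this joint distribution.

# Classic illustration: Cov(X, Y) = 0 does not imply independence.
from fractions import Fraction

joint = {}
for x in (-1, 0, 1):
    joint[(x, x * x)] = Fraction(1, 3)       # P(X = x, Y = x^2) = 1/3

EX  = sum(x * p for (x, y), p in joint.items())        # = 0
EY  = sum(y * p for (x, y), p in joint.items())        # = 2/3
EXY = sum(x * y * p for (x, y), p in joint.items())    # = 0
print("Cov =", EXY - EX * EY)                          # 0

# ...but P(X = 1, Y = 1) = 1/3, while P(X = 1) P(Y = 1) = (1/3)(2/3) = 2/9
pX1 = sum(p for (x, y), p in joint.items() if x == 1)
pY1 = sum(p for (x, y), p in joint.items() if y == 1)
print(joint[(1, 1)], "vs", pX1 * pY1)                  # 1/3 vs 2/9: not independent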

Example 6.2

A fair die is thrown three times. The first throw is scored as X1 = 1
if the die shows 5 or 6 and X1 = 0 otherwise; X2 and X3 are scored likewise for
the second and third throws.
Let Y1 = X1 + X2 and Y2 = X1 - X3.
Show that P(Y1 = 0, Y2 = -1) = 4/27. Calculate the remaining probabilities in
the bivariate distribution of the pair (Y1, Y2) and display the joint probabilities in
an appropriate table.

1. Find the marginal probability distributions of Y1 and Y2 .

2. Calculate the means and variances of Y1 and Y2 .

3. Calculate the covariance of Y1 and Y2 .

4. Find the conditional distribution of Y1 given Y2 = 0.

5. Find the conditional mean of Y1 given Y2 = 0.



Solution

P(X1 = 1) = P({5, 6}) = 1/3, P(X2 = 1) = 1/3, P(X3 = 1) = 1/3.


For Y1 to be 0, both X1 and X2 must be 0. Then for Y2 to be -1, X3 must be 1.
P(Y1 = 0, Y2 = -1) = P(X1 = 0)P(X2 = 0)P(X3 = 1) = (2/3) · (2/3) · (1/3) = 4/27.

                Y1
            0       1       2       P
     -1    4/27    2/27     0      6/27
Y2    0    8/27    6/27    1/27   15/27
      1     0      4/27    2/27    6/27
      P   12/27   12/27    3/27     1

1. Marginal probability distribution of Y1 :

y1            0       1       2
P(Y1 = y1)   12/27   12/27   3/27
Marginal probability distribution of Y2 :

y2           -1      0       1
P(Y2 = y2)   6/27   15/27   6/27

2.

E[Y1] = 0 · (12/27) + 1 · (12/27) + 2 · (3/27) = 2/3
E[Y1²] = 0² · (12/27) + 1² · (12/27) + 2² · (3/27) = 8/9
Var[Y1] = E[Y1²] - (E[Y1])² = 8/9 - (2/3)² = 4/9

E[Y2] = (-1) · (6/27) + 0 · (15/27) + 1 · (6/27) = 0
E[Y2²] = (-1)² · (6/27) + 0² · (15/27) + 1² · (6/27) = 4/9
Var[Y2] = E[Y2²] - (E[Y2])² = 4/9

3. Cov[Y1, Y2] = E[Y1 Y2] - E[Y1]E[Y2] = E[Y1 Y2] (since E[Y2] = 0)
= 1 · (-1) · (2/27) + 1 · 1 · (4/27) + 2 · 1 · (2/27) = 2/9

4.

P(Y1 = 0 | Y2 = 0) = P(Y1 = 0, Y2 = 0)/P(Y2 = 0) = (8/27)/(15/27) = 8/15,
P(Y1 = 1 | Y2 = 0) = P(Y1 = 1, Y2 = 0)/P(Y2 = 0) = (6/27)/(15/27) = 6/15,
P(Y1 = 2 | Y2 = 0) = P(Y1 = 2, Y2 = 0)/P(Y2 = 0) = (1/27)/(15/27) = 1/15.

5.

E[Y1 | Y2 = 0] = 0 · (8/15) + 1 · (6/15) + 2 · (1/15) = 8/15
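
As a cross-check of the table and the quantities above, one can enumerate all 6³ = 216 equally likely outcomes of the three throws. The Python sketch below is not part of the original solution; it reproduces P(Y1 = 0, Y2 = -1) = 4/27, Cov[Y1, Y2] = 2/9 and E[Y1 | Y2 = 0] = 8/15.

# Brute-force check of Example 6.2: enumerate all 6^3 equally likely throws.
from fractions import Fraction
from itertools import product

joint = {}                     # keys: (y1, y2), values: P(Y1 = y1, Y2 = y2)
for d1, d2, d3 in product(range(1, 7), repeat=3):
    x1, x2, x3 = (1 if d >= 5 else 0 for d in (d1, d2, d3))
    y1, y2 = x1 + x2, x1 - x3
    joint[(y1, y2)] = joint.get((y1, y2), Fraction(0)) + Fraction(1, 216)

print(joint[(0, -1)])                                    # 4/27
EY1 = sum(y1 * p for (y1, y2), p in joint.items())       # 2/3
EY2 = sum(y2 * p for (y1, y2), p in joint.items())       # 0
EY1Y2 = sum(y1 * y2 * p for (y1, y2), p in joint.items())
print(EY1Y2 - EY1 * EY2)                                 # Cov = 2/9

# Conditional mean E[Y1 | Y2 = 0]
pY2_0 = sum(p for (y1, y2), p in joint.items() if y2 == 0)
print(sum(y1 * p for (y1, y2), p in joint.items() if y2 == 0) / pY2_0)   # 8/15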

Exercises

Exercise 6.1

The random variables X and Y have a joint probability function given by

f(x, y) = c(x²y + x),   x = -2, -1, 0, 1, 2;  y = 1, 2, 3,
f(x, y) = 0,            otherwise.

Determine the value of c.


Find P (X > 0) and P (X + Y = 0)
Find the marginal distributions of X and Y .
Find E[X] and V ar[X].
Find E[Y ] and V ar[Y ].
Find the conditional distribution of X given Y = 1 and E[X|Y = 1].
Find the probability function for Z = X + Y and show that E[Z] = E[X] + E[Y ]
Find Cov[X, Y ] and show that V ar[Z] = V ar[X] + V ar[Y ] + 2Cov[X, Y ].
Find the correlation between X and Y .
Are X and Y independent?

Solution

Table of the joint probabilities:



              X
          -2     -1     0     1      2
      1   2c     0      0     2c     6c    10c
  Y   2   6c     c      0     3c    10c    20c
      3  10c     2c     0     4c    14c    30c
         18c     3c     0     9c    30c    60c

Since the probabilities must sum to one, c = 1/60.
P(X > 0) = 39/60
P(X + Y = 0) = P(X = -2, Y = 2) + P(X = -1, Y = 1) = 1/10

Marginal distribution for X:

x           -2      -1     0     1      2
P(X = x)   18/60   3/60    0    9/60   30/60

Marginal distribution for Y:

y            1       2      3
P(Y = y)   10/60   20/60   30/60
E[X] = (-2) · 18/60 + (-1) · 3/60 + 0 · 0 + 1 · 9/60 + 2 · 30/60 = 30/60 = 1/2
E[X²] = (-2)² · 18/60 + (-1)² · 3/60 + 0 · 0 + 1² · 9/60 + 2² · 30/60 = 3.4
Var[X] = E[X²] - E[X]² = 3.4 - 0.5² = 3.15
E[Y] = 1 · 1/6 + 2 · 1/3 + 3 · 1/2 = 14/6 = 7/3
E[Y²] = 1² · 1/6 + 2² · 1/3 + 3² · 1/2 = 36/6 = 6.0
Var[Y] = 6.0 - (7/3)² = 5/9

P(X = -2 | Y = 1) = 0.2, P(X = -1 | Y = 1) = 0, P(X = 0 | Y = 1) = 0,
P(X = 1 | Y = 1) = 0.2, P(X = 2 | Y = 1) = 0.6
E[X | Y = 1] = (-2) · 0.2 + (-1) · 0 + 0 · 0 + 1 · 0.2 + 2 · 0.6 = 1

Z = X + Y

z           -1     0      1      2      3      4      5
P(Z = z)   2/60   6/60  11/60   4/60   9/60  14/60  14/60

E[Z] = (1/60)(-1 · 2 + 0 · 6 + 1 · 11 + 2 · 4 + 3 · 9 + 4 · 14 + 5 · 14) = 170/60
= 17/6 = 1/2 + 7/3 = E[X] + E[Y]

E[XY] = (-2)(1)(2/60) + (-2)(2)(6/60) + (-2)(3)(10/60) + (-1)(2)(1/60) + (-1)(3)(2/60)
+ (1)(1)(2/60) + (1)(2)(3/60) + (1)(3)(4/60) + (2)(1)(6/60) + (2)(2)(10/60) + (2)(3)(14/60) = 1
Cov[X, Y] = E[XY] - E[X]E[Y] = 1 - (1/2)(7/3) = -1/6
E[Z²] = (1/60)(1 · 2 + 0 · 6 + 1 · 11 + 4 · 4 + 9 · 9 + 16 · 14 + 25 · 14) = 684/60
Var[Z] = 684/60 - (170/60)² = 3.3722
Var[X] + Var[Y] + 2Cov[X, Y] = 3.15 + 5/9 + 2 · (-1/6) = 3.3722
corr[X, Y] = Cov[X, Y] / √(Var[X] · Var[Y]) = (-1/6) / √(3.15 · 5/9) = -0.126

X and Y are not independent: for example, P(X = 1, Y = 1) ≠ P(X = 1)P(Y = 1).
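
The arithmetic in this solution can be verified by enumerating the fifteen points of the joint probability function. The Python sketch below is not part of the original solution; it recovers c = 1/60, the moments, Cov[X, Y] = -1/6 and corr[X, Y] ≈ -0.126.

# Sketch verifying the main quantities in Exercise 6.1.
from fractions import Fraction
from math import sqrt

# Unnormalised weights c(x^2 y + x); c is fixed so that they sum to 1.
pts = [(x, y) for x in (-2, -1, 0, 1, 2) for y in (1, 2, 3)]
w = {(x, y): x * x * y + x for x, y in pts}
c = Fraction(1, sum(w.values()))
joint = {k: c * v for k, v in w.items()}
print(c)                                             # 1/60

EX  = sum(x * p for (x, y), p in joint.items())      # 1/2
EY  = sum(y * p for (x, y), p in joint.items())      # 7/3
EX2 = sum(x * x * p for (x, y), p in joint.items())
EY2 = sum(y * y * p for (x, y), p in joint.items())
VX, VY = EX2 - EX ** 2, EY2 - EY ** 2                # 3.15 and 5/9
EXY = sum(x * y * p for (x, y), p in joint.items())  # 1
cov = EXY - EX * EY                                  # -1/6
print(VX, VY, cov, float(cov) / sqrt(VX * VY))       # corr is about -0.126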


Exercise 6.2

The following experiment is carried out. Three fair coins are tossed. Any coins
showing heads are removed and the remaining coins are tossed. Let X be the
number of heads on the first toss and Y the number of heads on the second toss.
Note that if X = 3 then Y = 0. Find the joint probability function and marginal
distributions of X and Y .

Solution

We have that P (Y = y, X = x) = P (Y = y|X = x)P (X = x).


Suppose X = 0; this has probability 0.5³. Then Y | X = 0 has a binomial distribution
with parameters n = 3 and p = 0.5. Similarly, Y | X = 1 has a binomial distribution
with parameters n = 2 and p = 0.5. In this way we can produce a table of the joint
probabilities:

              X
           0       1       2       3
      0   1/64    6/64   12/64    8/64   27/64
  Y   1   3/64   12/64   12/64    0      27/64
      2   3/64    6/64    0       0       9/64
      3   1/64    0       0       0       1/64
          1/8     3/8     3/8     1/8      1
Marginal distribution for X:

x           0      1      2      3
P(X = x)   1/8    3/8    3/8    1/8

Marginal distribution for Y:

y            0       1       2      3
P(Y = y)   27/64   27/64   9/64   1/64
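
As with the previous exercise, the table can be cross-checked by enumerating all equally likely head/tail patterns for the two rounds of tosses. The Python sketch below is not part of the original solution.

# Sketch checking the joint table in Exercise 6.2 by enumerating the
# equally likely head/tail patterns of the two rounds of tosses.
from fractions import Fraction
from itertools import product

joint = {}
for first in product((0, 1), repeat=3):          # 1 = heads on the first toss
    x = sum(first)                               # heads removed after round 1
    for second in product((0, 1), repeat=3 - x): # remaining coins tossed again
        y = sum(second)
        p = Fraction(1, 8) * Fraction(1, 2 ** (3 - x))
        joint[(x, y)] = joint.get((x, y), Fraction(0)) + p

print(joint[(0, 1)])                                      # 3/64
print(sum(p for (x, y), p in joint.items() if y == 1))    # 27/64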
