COMPLEX ANALYSIS, PROBABILITY AND STATISTICAL METHODS (18MAT41)
MODULE - 05
JOINT PROBABILITY DISTRIBUTION
INTRODUCTION
We have discussed probability distributions associated with a single random
variable. The same ideas can be generalized to two or more random variables.
Here we discuss probability distributions associated with two random variables,
referred to as joint distributions.
JOINT DISTRIBUTION AND JOINT PROBABILITY DISTRIBUTION
If X and Y are two discrete random variables, we define the joint probability
function of X and Y by
P(X = x, Y = y) = f(x, y)
where f(x, y) satisfies the conditions
f(x, y) ≥ 0 and ∑x ∑y f(x, y) = 1
The second condition means that the sum over all the values of x and y is equal
to one.
Suppose X = {x1, x2, ..., xm} and Y = {y1, y2, ..., yn}; then P(X = xi, Y = yj) is
denoted by Jij.
It should be observed that f is a function on the Cartesian product of the sets X
and Y, since
X × Y = {(xi, yj) : i = 1, 2, ..., m; j = 1, 2, ..., n}
f is also referred to as the joint probability density function of X and Y, in that
order. The set of values f(xi, yj) = Jij for i = 1, 2, ..., m and j = 1, 2, ..., n is
called the joint probability distribution of X and Y. These values are presented
in the form of a two-way table called the joint probability table.
 X \ Y   y1      y2      ...     yn      Sum
 x1      J11     J12     ...     J1n     f(x1)
 x2      J21     J22     ...     J2n     f(x2)
 ...     ...     ...     ...     ...     ...
 xm      Jm1     Jm2     ...     Jmn     f(xm)
 Sum     g(y1)   g(y2)   ...     g(yn)   1
MARGINAL PROBABILITY DISTRIBUTION
In the joint probability table, {f(x1), f(x2), ..., f(xm)} are the sums of the
horizontal (row) entries and {g(y1), g(y2), ..., g(yn)} are the sums of the
vertical (column) entries. These are called the marginal probability distributions
of X and Y respectively.
INDEPENDENT RANDOM VARIABLES
The discrete random variables X and Y are said to be independent random
variables if P(X = xi, Y = yj) = P(X = xi) ∙ P(Y = yj) for all i and j,
i.e. f(xi) g(yj) = Jij
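As a computational aside (not part of the original notes), this condition can be checked mechanically. The following Python sketch assumes the joint distribution is stored as a dictionary mapping (x, y) pairs to probabilities; the function name and data layout are illustrative.

```python
# Minimal sketch: test whether a discrete joint distribution factorises into
# its marginals, i.e. whether J(x, y) = f(x) * g(y) for every pair (x, y).
from math import isclose

def is_independent(J):
    xs = sorted({x for (x, _) in J})
    ys = sorted({y for (_, y) in J})
    f = {x: sum(J.get((x, y), 0) for y in ys) for x in xs}   # row sums: marginal of X
    g = {y: sum(J.get((x, y), 0) for x in xs) for y in ys}   # column sums: marginal of Y
    return all(isclose(J.get((x, y), 0), f[x] * g[y]) for x in xs for y in ys)
```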
Expectation, Variance, Covariance and Correlation
Expectation
𝜇𝑋 = E(X) = ∑𝑥 ∑𝑦 𝑥 𝑓(𝑥, 𝑦) = ∑𝑖 𝑥𝑖 𝑓(𝑥𝑖 )
𝜇𝑌 = E(Y) = ∑𝑥 ∑𝑦 𝑦 𝑓(𝑥, 𝑦) = ∑𝑗 𝑦𝑗 𝑔(𝑦𝑗 )
𝜇𝑋𝑌 = E(XY) = ∑𝑖 ∑𝑗 𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗
Variance
𝜎𝑋2 = E ( X2) – [𝐸(𝑋)]2
𝜎𝑌2 = E ( Y2) – [𝐸(𝑌)]2
Covariance
COV (X,Y) = E (XY) – E(X)∙ E(Y)
Correlation
Correlation of X and Y = ρ(X, Y) = COV(X, Y) / (σX σY)
NOTE:
If X and Y are independent, E(XY) = E(X) ∙ E(Y) and hence
COV(X, Y) = 0 = ρ(X, Y)
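All of the quantities above can be computed directly from the joint probability table. The following is a minimal Python sketch (illustrative only; the function name and dictionary layout are assumptions, not part of the notes).

```python
# Minimal sketch: expectations, variances, covariance and correlation
# of a discrete joint distribution stored as {(x, y): probability}.
from math import sqrt

def joint_moments(J):
    EX  = sum(x * p for (x, y), p in J.items())
    EY  = sum(y * p for (x, y), p in J.items())
    EXY = sum(x * y * p for (x, y), p in J.items())
    var_x = sum(x * x * p for (x, y), p in J.items()) - EX**2
    var_y = sum(y * y * p for (x, y), p in J.items()) - EY**2
    cov = EXY - EX * EY
    rho = cov / (sqrt(var_x) * sqrt(var_y))
    return EX, EY, EXY, var_x, var_y, cov, rho
```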
PROBLEMS
1. The joint probability distribution of two random variables X and Y is as
follows.
 X \ Y   -4     2      7
 1       1/8    1/4    1/8
 5       1/4    1/8    1/8
Compute the following:
(a) E(X) and E(Y)  (b) E(XY)  (c) σX and σY  (d) COV(X, Y)  (e) ρ(X, Y)
Solution: The marginal distributions are obtained by adding all the respective
row entries and the respective column entries.
Distribution of X:
 xi      1      5
 f(xi)   1/2    1/2
Distribution of Y:
 yj      -4     2      7
 g(yj)   3/8    3/8    1/4
a) E(X) = ∑ 𝑥𝑖 𝑓(𝑥𝑖 ) = (1)(1/2) + 5 (1/2) = 3 = 𝜇𝑥
E(Y) = ∑ 𝑦𝑗 𝑔(𝑦𝑗 ) = (-4)(3/8) + 2( 3/8) + 7(1/4) = 1 = 𝜇𝑦
b) E (XY) = ∑ 𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗 = (1)(-4)(1/8) + (1) (2) ( 1/4) + (1) (7) ( 1/8)
+(5) (-4) ( 1/4)+ (5) (2) ( 1/8)+ (5) (7) ( 1/8)
= 3/2
c) 𝜎𝑋2 = E ( X2) – [𝐸(𝑋)]2 and 𝜎𝑌2 = E ( Y2) – [𝐸(𝑌)]2
Now E ( X2) = ∑ 𝑥 2 𝑓(𝑥𝑖 ) = (1)(1/2)+25(1/2) = 13
E ( Y2) = ∑ 𝑦 2 𝑔(𝑦𝑗 ) = (16)(3/8) + (4)(3/8) + (49)(1/4) = 79/4
Hence 𝜎𝑋2 = 13 – (3)2 = 4 and 𝜎𝑌2 = (79/4) – (1)2= 75/4
Thus σX = 2 and σY = √(75/4) = 4.33
d) COV (X,Y) = E (XY) – E(X)∙ E(Y)
= (3/2) – 3 (1) = - 3/2
e) ρ(X, Y) = COV(X, Y) / (σX σY) = (−3/2) / (2 × √(75/4)) = −0.1732
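As a check on the arithmetic (a sketch only, not part of the original solution), the joint table of this problem can be fed to a few one-line sums in Python:

```python
# Joint table of Problem 1: keys are (x, y), values are the probabilities Jij.
J = {(1, -4): 1/8, (1, 2): 1/4, (1, 7): 1/8,
     (5, -4): 1/4, (5, 2): 1/8, (5, 7): 1/8}

EX  = sum(x * p for (x, y), p in J.items())                        # 3.0
EY  = sum(y * p for (x, y), p in J.items())                        # 1.0
EXY = sum(x * y * p for (x, y), p in J.items())                    # 1.5
sx  = (sum(x * x * p for (x, y), p in J.items()) - EX**2) ** 0.5   # 2.0
sy  = (sum(y * y * p for (x, y), p in J.items()) - EY**2) ** 0.5   # 4.33
rho = (EXY - EX * EY) / (sx * sy)                                  # -0.1732
```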
2. The joint probability distribution table for two random variables X and Y is
as follows.
 X \ Y   -2     -1     4      5
 1       0.1    0.2    0      0.3
 2       0.2    0.1    0.1    0
Determine the marginal probability distributions of X and Y. Also compute
(a) Expectations of X , Y and XY
(b) S.D’s of X,Y
(c) covariance of X and Y (d) Correlation of X and Y
Further verify that X and Y are dependent random variables
Solution: The marginal distributions of X and Y are obtained by adding all the
respective row entries and the respective column entries.
Distribution of X:
 xi      1      2
 f(xi)   0.6    0.4
Distribution of Y:
 yj      -2     -1     4      5
 g(yj)   0.3    0.3    0.1    0.3
(a)
𝜇𝑥 = E(X) = ∑ 𝑥𝑖 𝑓(𝑥𝑖 ) = (1)(0.6) + (2)(0.4) = 1.4
𝜇𝑦 = E(Y) = ∑ 𝑦𝑗 𝑔(𝑦𝑗 ) = (-2) (0.3) + (-1)(0.3) + 4 (0.1) + 5 (0.3) = 1
E (XY) = ∑ 𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗 = (1)(-2)(0.1) + (1) (-1) (0.2) + (1) (4) (0) + (1) (5) (0.3)
+ (2) (-2) (0.2) + (2) (-1) ( 0.1) + (2) (4) (0.1) + (2)(5) (0)
= 0.9
b) 𝜎𝑋2 = E ( X2) – [𝐸(𝑋)]2 and 𝜎𝑌2 = E ( Y2) – [𝐸(𝑌)]2
Now E ( X2) = ∑ 𝑥 2 𝑓(𝑥𝑖 ) = (1)(0.6) + (4) (0.4) = 2.2
E ( Y2) = ∑ 𝑦 2 𝑔(𝑦𝑗 ) = (4)(0.3) + 1(0.3) + 16(0.1) + 25(0.3) = 10.6
Hence 𝜎𝑋2 = 2.2 – (1.4)2 = 0.24 and 𝜎𝑌2 = 10.6 – (1)2 = 9.6
Thus 𝜎𝑋 = 0.49 and 𝜎𝑦 = 3.1
c) COV (X,Y) = E (XY) – E(X)∙ E(Y)
= 0.9 – 1.4(1) = - 0.5
d) ρ(X, Y) = COV(X, Y) / (σX σY) = (−0.5) / ((0.49)(3.1)) = −0.33
If X and Y are independent random variables we must have
f(xi) g(yj) = Jij for all i and j.
It can be seen that f(x1) g(y1) = (0.6)(0.3) = 0.18 whereas J11 = 0.1,
i.e. f(x1) g(y1) ≠ J11.
Hence we conclude that X and Y are dependent random variables.
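A quick numerical check of these values (a sketch, not part of the notes; the dictionary layout is an assumption):

```python
# Joint table of Problem 2.
J = {(1, -2): 0.1, (1, -1): 0.2, (1, 4): 0.0, (1, 5): 0.3,
     (2, -2): 0.2, (2, -1): 0.1, (2, 4): 0.1, (2, 5): 0.0}

EX  = sum(x * p for (x, y), p in J.items())            # 1.4
EY  = sum(y * p for (x, y), p in J.items())            # 1.0
EXY = sum(x * y * p for (x, y), p in J.items())        # 0.9
cov = EXY - EX * EY                                    # -0.5
fx1 = sum(p for (x, y), p in J.items() if x == 1)      # 0.6
gy1 = sum(p for (x, y), p in J.items() if y == -2)     # 0.3
print(fx1 * gy1, J[(1, -2)])                           # ≈ 0.18 vs 0.1 -> dependent
```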
3. The joint probability distribution of two discrete random variables X and Y is
given by f(x,y)= k(2x + y) where x and y are integers such that 0 ≤ x ≤ 2 , 0 ≤
y≤3.
(a) Find the value of the constant k
(b) Find the marginal probability distributions of X and Y
(c) Show that the random variables X and Y are dependent.
Solution: X = {xi} = {0, 1, 2} and Y = {yj} = {0, 1, 2, 3}
f( x,y) = k(2x+y) and the joint probability distribution table is
formed as follows.
 X \ Y   0      1      2      3      Sum
 0       0      k      2k     3k     6k
 1       2k     3k     4k     5k     14k
 2       4k     5k     6k     7k     22k
 Sum     6k     9k     12k    15k    42k
a) We must have 42k = 1
∴ k = 1/42
b) The marginal probability distributions are as follows.
Distribution of X:
 xi      0       1       2
 f(xi)   6/42    14/42   22/42
         = 1/7   = 1/3   = 11/21
Distribution of Y:
 yj      0       1       2       3
 g(yj)   6/42    9/42    12/42   15/42
         = 1/7   = 3/14  = 2/7   = 5/14
c) It can be seen that f(xi) g(yj) ≠ Jij in general; for instance, f(x1) g(y1) = (1/7)(1/7) = 1/49 whereas J11 = 0.
Hence the random variables are dependent.
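The normalising constant and the marginals can also be generated from f(x, y) = k(2x + y) with a short loop; the sketch below (not part of the notes; names are illustrative) uses exact fractions.

```python
# Minimal sketch for Problem 3: find k, the marginals, and check dependence.
from fractions import Fraction

xs, ys = range(3), range(4)                               # x = 0,1,2 and y = 0,1,2,3
k = Fraction(1, sum(2 * x + y for x in xs for y in ys))   # total weight 42, so k = 1/42

f = {x: sum(k * (2 * x + y) for y in ys) for x in xs}     # 1/7, 1/3, 11/21
g = {y: sum(k * (2 * x + y) for x in xs) for y in ys}     # 1/7, 3/14, 2/7, 5/14
print(f[0] * g[0])                                        # 1/49, but J(0,0) = 0 -> dependent
```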
4. A fair coin is tossed thrice. The random variables X and Y are defined as
follows. X = 0 or 1 according as head or tail occurs on the first toss.
Y = Number of heads
(a) Determine the distribution of X and Y
(b)Determine the joint distribution of X and Y
(c) Obtain the expectations of X, Y and XY. Also find the S.D.s of X and Y.
(d) Compute Covariance and Correlation of X and Y.
Solution: The sample space S and the associated values of the random variables
X and Y are given in the following table.
 S    HHH   HHT   HTH   HTT   THH   THT   TTH   TTT
 X    0     0     0     0     1     1     1     1
 Y    3     2     2     1     2     1     1     0
(a) The probability distribution of X and Y is found as follows.
X = {𝑥𝑖 } = {0,1} and Y = {𝑦𝑗 } = {0,1,2,3}
P(X=0) is 4/8 = 1/2, P( X = 1) is 4/8 = 1/2
P(Y=0) is 1/8 , P( Y = 1) is 3/8
P(Y=2) is 3/8 , P( Y = 3) is 1/8
Thus we have the following probability distribution of X and Y
 xi      0      1
 f(xi)   1/2    1/2

 yj      0      1      2      3
 g(yj)   1/8    3/8    3/8    1/8
(b) The joint distribution of X and Y is found by computing
Jij = P( X = xi , Y = yj ) where we have
x1 = 0, x2 = 1 and y1 = 0, y2 = 1, y3 = 2, y4 = 3.
J11 = P(X = 0, Y = 0) = 0
(X = 0 means a head occurs on the first toss, so Y, the total number of heads, cannot be 0.)
J12 = P(X = 0, Y = 1) = 1/8; corresponding outcome is HTT
J13 = P(X = 0, Y = 2) = 2/8 = 1/4; outcomes are HHT and HTH
J14 = P(X = 0, Y = 3) = 1/8; outcome is HHH
J21 = P(X = 1, Y = 0) = 1/8; outcome is TTT
J22 = P(X = 1, Y = 1) = 2/8 = 1/4; outcomes are THT and TTH
J23 = P(X = 1, Y = 2) = 1/8; outcome is THH
J24 = P(X = 1, Y = 3) = 0, since this outcome is impossible.
(These values can be read off quickly from the table of S, X and Y above.)
The required joint probability distribution of X and Y is as follows.
 X \ Y   0      1      2      3      Sum
 0       0      1/8    1/4    1/8    1/2
 1       1/8    1/4    1/8    0      1/2
 Sum     1/8    3/8    3/8    1/8    1
(c) 𝜇𝑥 = E(X) = ∑ 𝑥𝑖 𝑓(𝑥𝑖 ) = (0) (1/2) + (1)(1/2) = 1/2
𝜇𝑦 = E(Y) = ∑ 𝑦𝑗 𝑔(𝑦𝑗 ) = (0) (1/8) + (1)(3/8) + 2 (3/8) + 3 (1/8) = 12/8 =3/2
E (XY) = ∑ 𝑥𝑖 𝑦𝑗 𝐽𝑖𝑗 = 0 + (0+ ¼+ 2/8 + 0) = 1/2
𝜎𝑋2 = E ( X2) – [𝐸(𝑋)]2 and 𝜎𝑌2 = E ( Y2) – [𝐸(𝑌)]2
𝜎𝑋2 = (0 + ½ ) – ¼ = ¼ 𝜎𝑌2 = (0+3/8 + 3/2 + 9/8) – ( 9/4) = 3- (9/4) = 3/4
Thus 𝜎𝑋 = 1/2 and 𝜎𝑦 = √3/2
(d) COV (X,Y) = E (XY) – E(X)∙ E(Y)
= ½ - ¾ = - 1/4
ρ(X, Y) = COV(X, Y) / (σX σY) = (−1/4) / ((1/2)(√3/2)) = (−1/4) / (√3/4) = −1/√3
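The joint distribution of this problem can also be generated by enumerating the eight equally likely outcomes; a minimal Python sketch (not part of the notes; names are illustrative):

```python
# Minimal sketch for Problem 4: build the joint table from the sample space.
from itertools import product
from collections import Counter

J = Counter()
for toss in product("HT", repeat=3):              # 8 equally likely outcomes
    X = 0 if toss[0] == "H" else 1                # X depends on the first toss
    Y = toss.count("H")                           # Y = total number of heads
    J[(X, Y)] += 1 / 8

EX  = sum(x * p for (x, y), p in J.items())       # 0.5
EY  = sum(y * p for (x, y), p in J.items())       # 1.5
EXY = sum(x * y * p for (x, y), p in J.items())   # 0.5
cov = EXY - EX * EY                               # -0.25
```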
SAMPLING THEORY
INTRODUCTION
Test for significance for large samples