8·9. MULTINOMIAL DISTRIBUTION
This distribution can be regarded as a generalisation of the Binomial distribution. When there are more than two mutually exclusive outcomes of a trial, the observations lead to the multinomial distribution. Suppose $E_1, E_2, \ldots, E_k$ are $k$ mutually exclusive and exhaustive outcomes of a trial with respective probabilities $p_1, p_2, \ldots, p_k$. The probability that $E_1$ occurs $x_1$ times, $E_2$ occurs $x_2$ times, ..., and $E_k$ occurs $x_k$ times in $n$ independent observations is given by
$$p(x_1, x_2, \ldots, x_k) = c\, p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k},$$
where $\sum x_i = n$ and $c$ is the number of permutations of the events $E_1, E_2, \ldots, E_k$.
To determine $c$, we have to find the number of permutations of $n$ objects of which $x_1$ are of one kind, $x_2$ of another kind, ..., $x_k$ of the $k$th kind, which is given by
$$c = \frac{n!}{x_1!\, x_2! \cdots x_k!}$$
Hence
$$p(x_1, x_2, \ldots, x_k) = \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}, \qquad 0 \le x_i \le n$$
$$= \frac{n!}{\prod_{i=1}^{k} x_i!} \prod_{i=1}^{k} p_i^{x_i}, \qquad \sum_{i=1}^{k} x_i = n \qquad \ldots(8\cdot 30)$$
which is the required probability function of the multinomial distribution. It is so called since (8·30) is the general term in the multinomial expansion
$$(p_1 + p_2 + \cdots + p_k)^n, \qquad \sum_{i=1}^{k} p_i = 1.$$
Since the total probability is 1, we have
$$\sum_x p(x) = \sum \left[ \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k} \right] = (p_1 + p_2 + \cdots + p_k)^n = 1 \qquad \ldots(8\cdot 30a)$$
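As a numerical illustration of (8·30) and (8·30a), the following Python sketch (the values n = 5 and p = (0.2, 0.3, 0.5) are illustrative choices, not part of the text) evaluates the probability function from the factorial formula and checks that it sums to unity over all admissible (x_1, ..., x_k):

```python
from math import factorial, prod
from itertools import product

def multinomial_pmf(x, p):
    """p(x1, ..., xk) = n!/(x1!...xk!) * p1^x1 ... pk^xk, with n = sum(x)."""
    n = sum(x)
    coeff = factorial(n) // prod(factorial(xi) for xi in x)
    return coeff * prod(pi ** xi for pi, xi in zip(p, x))

n, p = 5, (0.2, 0.3, 0.5)          # illustrative values only
total = sum(multinomial_pmf(x, p)
            for x in product(range(n + 1), repeat=len(p))
            if sum(x) == n)
print(total)                        # close to 1.0, as asserted in (8.30a)
```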
8·9·1. Moments of Multinomial Distribution. The moment generating function is given by:
$$M_X(t) = M_{X_1, X_2, \ldots, X_k}(t_1, t_2, \ldots, t_k) = E\left[\exp\left\{\sum_{i=1}^{k} t_i X_i\right\}\right]$$
$$= \sum \left[ \frac{n!}{x_1!\, x_2! \cdots x_k!}\; p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k} \exp\left(\sum_{i=1}^{k} t_i x_i\right) \right]$$
$$= (p_1 e^{t_1} + p_2 e^{t_2} + \cdots + p_k e^{t_k})^n \qquad \text{[On using (8·30)]} \qquad \ldots(8\cdot 31)$$
where $X = (X_1, X_2, \ldots, X_k)$.
Now $M_{X_1}(t_1) = M_X(t_1, 0, 0, \ldots, 0) = (p_1 e^{t_1} + p_2 + p_3 + \cdots + p_k)^n = \left[(1 - p_1) + p_1 e^{t_1}\right]^n$
$\Rightarrow X_1 \sim B(n, p_1)$ [By uniqueness theorem of m.g.f.]
Similarly, we shall get $X_i \sim B(n, p_i)$; $i = 1, 2, \ldots, k$.
$\Rightarrow E(X_i) = n p_i$ and $\mathrm{Var}(X_i) = n p_i (1 - p_i)$; $i = 1, 2, \ldots, k$.
$$E(X_i X_j) = \left[ \frac{\partial^2 M_X}{\partial t_i\, \partial t_j} \right]_{t=0}, \quad i \ne j; \qquad \text{where } t = (t_1, t_2, \ldots, t_k)$$
$$= \left[ n(n - 1)\, (p_1 e^{t_1} + p_2 e^{t_2} + \cdots + p_k e^{t_k})^{n-2}\, p_i e^{t_i}\, p_j e^{t_j} \right]_{t=0}$$
$$= n(n - 1)\, p_i\, p_j \qquad (\because\ p_1 + p_2 + \cdots + p_k = 1)$$
$$\mathrm{Cov}(X_i, X_j) = E(X_i X_j) - E(X_i)\, E(X_j) = n(n - 1)\, p_i p_j - n^2 p_i p_j = -n p_i p_j$$
$$\rho(X_i, X_j) = \frac{\mathrm{Cov}(X_i, X_j)}{\sigma_{X_i}\, \sigma_{X_j}} = \frac{-n p_i p_j}{\sqrt{n p_i (1 - p_i)}\, \sqrt{n p_j (1 - p_j)}} = -\left\{ \frac{p_i\, p_j}{(1 - p_i)(1 - p_j)} \right\}^{1/2}$$
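The moment results above can be checked by simulation; a minimal sketch (using numpy's multinomial sampler, with illustrative values n = 10 and p = (0.2, 0.3, 0.5)) compares sample moments with n p_i, n p_i (1 - p_i) and -n p_i p_j:

```python
import numpy as np

n, p = 10, np.array([0.2, 0.3, 0.5])           # illustrative values only
rng = np.random.default_rng(0)
sample = rng.multinomial(n, p, size=200_000)   # each row is one draw (X1, X2, X3)

print(sample.mean(axis=0), n * p)                                   # E(Xi) = n*pi
print(sample.var(axis=0), n * p * (1 - p))                          # Var(Xi) = n*pi*(1-pi)
print(np.cov(sample[:, 0], sample[:, 1])[0, 1], -n * p[0] * p[1])   # Cov(X1,X2) = -n*p1*p2
```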
Example 8·66. The trinomial distribution of two r.v.'s X and Y is given by
$$f_{X,Y}(x, y) = \frac{n!}{x!\, y!\, (n - x - y)!}\; p^x\, q^y\, (1 - p - q)^{n - x - y}$$
for $x, y = 0, 1, 2, \ldots, n$ and $x + y \le n$, where $0 \le p$, $0 \le q$ and $p + q \le 1$.
(i) Find the marginal distributions of X and Y.
(ii) Find the conditional distributions of X and Y and obtain (a) $E(Y \mid X = x)$, and (b) $E(X \mid Y = y)$.
(iii) Find the correlation coefficient between X and Y.
Solution. (i) The joint m.g.f. of X and Y is given by:
$$M_{X,Y}(t_1, t_2) = E\left(e^{t_1 X + t_2 Y}\right) = \sum_{x=0}^{n} \sum_{y=0}^{n-x} \frac{n!}{x!\, y!\, (n - x - y)!}\; (p e^{t_1})^x\, (q e^{t_2})^y\, (1 - p - q)^{n - x - y}$$
$$= \left[ p e^{t_1} + q e^{t_2} + (1 - p - q) \right]^n$$
$$M_X(t_1) = M(t_1, 0) = \{(1 - p) + p e^{t_1}\}^n \ \Rightarrow\ X \sim B(n, p)$$
$$M_Y(t_2) = M(0, t_2) = \{(1 - q) + q e^{t_2}\}^n \ \Rightarrow\ Y \sim B(n, q)$$
Observe that $M(t_1, t_2) \ne M(t_1, 0) \times M(0, t_2)$ $\Rightarrow$ X and Y are not independent.
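The marginal result X ~ B(n, p) can also be verified directly by summing the joint p.m.f. over y; a short sketch with illustrative values n = 6, p = 0.2, q = 0.3:

```python
from math import comb, factorial

n, p, q = 6, 0.2, 0.3                        # illustrative values only

for x in range(n + 1):
    marginal = sum(factorial(n) / (factorial(x) * factorial(y) * factorial(n - x - y))
                   * p**x * q**y * (1 - p - q)**(n - x - y)
                   for y in range(n - x + 1))                     # sum joint pmf over y
    binom = comb(n, x) * p**x * (1 - p)**(n - x)                  # B(n, p) pmf
    print(x, round(marginal, 6), round(binom, 6))                 # the two columns agree
```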
(ii) The conditional distribution of X given Y = y is given by:
$$f(x \mid Y = y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{\dfrac{n!}{x!\, y!\, (n - x - y)!}\; p^x\, q^y\, (1 - p - q)^{n - x - y}}{\binom{n}{y}\, q^y\, (1 - q)^{n - y}} \qquad [\because\ Y \sim B(n, q)]$$
$$= \frac{(n - y)!}{x!\, (n - y - x)!} \left( \frac{p}{1 - q} \right)^x \left( 1 - \frac{p}{1 - q} \right)^{n - y - x}; \qquad x = 0, 1, \ldots, n - y$$
$$\Rightarrow\ X \mid (Y = y) \sim B\{n - y,\; p/(1 - q)\}$$
$$\therefore\ E[X \mid (Y = y)] = (n - y)\, p/(1 - q)$$
Similarly, we shall get
$$f(y \mid X = x) = \frac{f_{X,Y}(x, y)}{f_X(x)} = \frac{f(x, y)}{\binom{n}{x}\, p^x\, (1 - p)^{n - x}} \qquad [\because\ X \sim B(n, p)]$$
$$= \binom{n - x}{y} \left( \frac{q}{1 - p} \right)^y \left( 1 - \frac{q}{1 - p} \right)^{n - x - y}; \qquad y = 0, 1, \ldots, n - x$$
$$\Rightarrow\ Y \mid (X = x) \sim B\{n - x,\; q/(1 - p)\}$$
$$\therefore\ E[Y \mid (X = x)] = (n - x)\, q/(1 - p)$$
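The conditional distributions obtained in part (ii) can be checked numerically; a minimal sketch (illustrative values n = 6, p = 0.2, q = 0.3, conditioning on Y = 2) compares the exact ratio f_{X,Y}(x, y)/f_Y(y) with the B(n - y, p/(1 - q)) probabilities:

```python
from math import comb, factorial

n, p, q = 6, 0.2, 0.3                        # illustrative values only
y = 2                                        # condition on Y = y

f_y = comb(n, y) * q**y * (1 - q)**(n - y)   # marginal: Y ~ B(n, q)

for x in range(n - y + 1):
    joint = (factorial(n) / (factorial(x) * factorial(y) * factorial(n - x - y))
             * p**x * q**y * (1 - p - q)**(n - x - y))
    conditional = joint / f_y                                      # f(x | Y = y)
    binom = comb(n - y, x) * (p / (1 - q))**x * (1 - p / (1 - q))**(n - y - x)
    print(x, round(conditional, 6), round(binom, 6))               # the two columns agree
```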
(iii) Correlation coefficient $\rho_{XY}$:
Since $X \sim B(n, p)$, $E(X) = np$ and $\mathrm{Var}(X) = np(1 - p)$; and since $Y \sim B(n, q)$, $E(Y) = nq$ and $\mathrm{Var}(Y) = nq(1 - q)$.
$$E(XY) = \left[ \frac{\partial^2 M(t_1, t_2)}{\partial t_1\, \partial t_2} \right]_{t_1 = t_2 = 0} = n(n - 1)\, pq$$
$$\mathrm{Cov}(X, Y) = E(XY) - E(X)\, E(Y) = n(n - 1)\, pq - n^2 pq = -npq$$
$$\rho_{XY} = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\, \sigma_Y} = \frac{-npq}{\sqrt{np(1 - p)\, nq(1 - q)}} = -\left[ \frac{pq}{(1 - p)(1 - q)} \right]^{1/2}$$
Note. Here $p + q \ne 1$.
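Likewise, the value of $\rho_{XY}$ can be checked by simulating the trinomial distribution with numpy (illustrative values n = 10, p = 0.2, q = 0.3):

```python
import numpy as np

n, p, q = 10, 0.2, 0.3                                   # illustrative values only
rng = np.random.default_rng(1)
counts = rng.multinomial(n, [p, q, 1 - p - q], size=200_000)
X, Y = counts[:, 0], counts[:, 1]

print(np.corrcoef(X, Y)[0, 1])                           # sample correlation
print(-np.sqrt(p * q / ((1 - p) * (1 - q))))             # theoretical rho_XY
```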
Example 8·67. If $X_1, X_2, \ldots, X_k$ are $k$ independent Poisson variates with parameters $\lambda_1, \lambda_2, \ldots, \lambda_k$ respectively, prove that the conditional distribution $P(X_1 \cap X_2 \cap \cdots \cap X_k \mid X)$, where $X = X_1 + X_2 + \cdots + X_k$ is fixed, is multinomial.
Solution.
$$P[X_1 \cap X_2 \cap \cdots \cap X_k \mid X = n] = P[X_1 = r_1 \cap X_2 = r_2 \cap \cdots \cap X_k = r_k \mid X = n]$$
$$= \frac{P[X_1 = r_1 \cap X_2 = r_2 \cap \cdots \cap X_k = r_k \cap X = n]}{P(X = n)}$$
$$= \frac{P[X_1 = r_1 \cap \cdots \cap X_{k-1} = r_{k-1} \cap X_k = n - r_1 - r_2 - \cdots - r_{k-1}]}{P(X = n)}$$
$$= \frac{P(X_1 = r_1)\, P(X_2 = r_2) \cdots P(X_{k-1} = r_{k-1})\, P(X_k = n - r_1 - \cdots - r_{k-1})}{P(X = n)}$$
$(\because\ X_1, X_2, \ldots, X_k$ are independent.)
Further, since $X_i$ $(i = 1, 2, \ldots, k)$ are independent Poisson variates with parameters $\lambda_i$ respectively, $X = X_1 + X_2 + \cdots + X_k$ is also a Poisson variate with parameter $\lambda_1 + \lambda_2 + \cdots + \lambda_k = \lambda$ (say).
Hence
$$P[X_1 \cap X_2 \cap \cdots \cap X_k \mid X = n] = \frac{\dfrac{e^{-\lambda_1} \lambda_1^{r_1}}{r_1!} \cdots \dfrac{e^{-\lambda_{k-1}} \lambda_{k-1}^{r_{k-1}}}{r_{k-1}!} \cdot \dfrac{e^{-\lambda_k} \lambda_k^{\,n - r_1 - \cdots - r_{k-1}}}{(n - r_1 - \cdots - r_{k-1})!}}{\dfrac{e^{-\lambda} \lambda^n}{n!}}$$
$$= \left\{ \frac{n!}{r_1!\, r_2! \cdots r_{k-1}!\, (n - r_1 - \cdots - r_{k-1})!} \right\} \times \left\{ \left( \frac{\lambda_1}{\lambda} \right)^{r_1} \left( \frac{\lambda_2}{\lambda} \right)^{r_2} \cdots \left( \frac{\lambda_k}{\lambda} \right)^{n - r_1 - \cdots - r_{k-1}} \right\}$$
$$= \frac{n!}{r_1!\, r_2! \cdots r_k!}\; p_1^{r_1}\, p_2^{r_2} \cdots p_k^{r_k}, \qquad \text{where } \sum_{i=1}^{k} r_i = n \ \text{ and } \ \sum_{i=1}^{k} p_i = \sum_{i=1}^{k} \frac{\lambda_i}{\lambda} = \frac{1}{\lambda} \sum_{i=1}^{k} \lambda_i = 1$$
Thus the conditional distribution $P(X_1 \cap X_2 \cap \cdots \cap X_k \mid X = n)$ is multinomial with probabilities $p_i = (\lambda_i / \lambda)$; $i = 1, 2, \ldots, k$, in $k$ classes.
Remark. If $X_i$'s are identically distributed independent Poisson variates with parameter $m$ (say), then $\lambda_i = m$; $i = 1, 2, \ldots, k$ and $\lambda = \sum_{i=1}^{k} \lambda_i = km$, so that $p_i = \dfrac{\lambda_i}{\lambda} = \dfrac{m}{km} = \dfrac{1}{k}$. Hence in this case the conditional distribution of $X_1, X_2, \ldots, X_k$, given that their sum $X_1 + X_2 + \cdots + X_k = n$, is a multinomial distribution with index $n$ and the probability in each class being equal to $1/k$.
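The result of Example 8·67, and the remark above, can be confirmed numerically; a minimal sketch (with illustrative parameters λ = (1.0, 2.0, 3.0) and n = 4) compares the exact conditional probabilities, computed from the Poisson p.m.f.s, with the multinomial p.m.f. having p_i = λ_i/λ:

```python
from math import exp, factorial, prod
from itertools import product

lam = (1.0, 2.0, 3.0)                      # illustrative Poisson parameters
lam_total = sum(lam)
n = 4                                      # condition on X1 + X2 + X3 = n

def poisson_pmf(r, a):
    return exp(-a) * a**r / factorial(r)

def multinomial_pmf(r, p):
    coeff = factorial(sum(r)) // prod(factorial(ri) for ri in r)
    return coeff * prod(pi**ri for pi, ri in zip(p, r))

p = [a / lam_total for a in lam]           # p_i = lambda_i / lambda

for r in product(range(n + 1), repeat=len(lam)):
    if sum(r) != n:
        continue
    # exact conditional probability: product of Poisson pmfs / Poisson pmf of the sum
    conditional = prod(poisson_pmf(ri, a) for ri, a in zip(r, lam)) / poisson_pmf(n, lam_total)
    print(r, round(conditional, 6), round(multinomial_pmf(r, p), 6))   # the two agree
```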