Lecture 13 : The Exponential Distribution

Definition
A continuous random variable X is said to have the exponential distribution with
parameter λ (with λ > 0) if the pdf of X is

f(x) = λe^{−λx},  x > 0
f(x) = 0,         otherwise        (*)

Remarks
Very often the independent variable will be time t rather than x. The exponential
distribution is the special case of the gamma distribution with α = 1 and β = 1/λ.
We will see that X is closely tied to the Poisson process; that is why λ is used
above.
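As a quick sanity check (my addition, not part of the lecture), here is a short Python sketch comparing the two pdfs. It assumes scipy's parametrization, where the gamma distribution takes a shape a = α and a scale = β, so the claim above reads a = 1, scale = 1/λ.

    import numpy as np
    from scipy.stats import expon, gamma

    lam = 2.0                               # arbitrary rate λ
    x = np.linspace(0.01, 5.0, 200)

    f_exp = expon(scale=1/lam).pdf(x)       # exponential pdf λe^{−λx}
    f_gam = gamma(a=1, scale=1/lam).pdf(x)  # gamma pdf with α = 1, β = 1/λ

    assert np.allclose(f_exp, f_gam)        # the two pdfs agree pointwise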


[Graph of f omitted: the pdf λe^{−λx} decays exponentially from height λ at x = 0.]

Proposition (cdf; prove this)


If X has the exponential distribution then

F(x) = P(X ≤ x) = 1 − e^{−λx},  x ≥ 0  (and F(x) = 0 for x < 0)

Corollary (Prove this)

If X has the exponential distribution

P(X > x) = e^{−λx},  x ≥ 0

This is a very useful formula.
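To see the formula in action, here is a minimal simulation sketch (mine, with an arbitrary λ and x) checking P(X > x) = e^{−λx} against samples:

    import numpy as np

    rng = np.random.default_rng(0)
    lam, x = 1.5, 0.8                        # arbitrary rate and threshold
    samples = rng.exponential(scale=1/lam, size=100_000)

    print((samples > x).mean())              # empirical P(X > x)
    print(np.exp(-lam * x))                  # exact value e^{−λx}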


Proposition
If X has the exponential distribution
(i) E(X) = 1/λ
(ii) V(X) = 1/λ²
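A quick numerical check of (i) and (ii) (again my addition, with λ chosen arbitrarily):

    import numpy as np

    rng = np.random.default_rng(1)
    lam = 3.0
    samples = rng.exponential(scale=1/lam, size=1_000_000)

    print(samples.mean(), 1/lam)       # E(X) ≈ 1/λ
    print(samples.var(), 1/lam**2)     # V(X) ≈ 1/λ²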

The Physical Meaning of the Exponential Distribution


Recall (Lecture 8) that the binomial process (having a child, flipping a coin)
gave rise to two (actually infinitely many) more distributions:
1. X = the geometric distribution
     = the waiting time for the first girl


2. Xr = the negative binomial
      = the waiting time for the r-th girl

Remark
Here time was discrete. Also, Xr was the number of boys before the r-th girl, so
the waiting time was actually Yr = Xr + r − 1.

Now we will see that the same thing happens with a Poisson process. Now time is
continuous, as I warned you. I will switch from x to t in (*).


So suppose we have a trap to catch some species of animal. We run it forever
starting at time t = 0, so 0 ≤ t < ∞.

The Counting Random Variable


Now fix a time period t. So we have a “counting random variable” Xt :
Xt = # of animals caught in the trap in time t.
We will choose the model Xt ∼ P(λt) = Poisson with parameter λt.
N.B. We are using λ instead of α in the Poisson process.
N.B. P(Xt = 0) = e^{−λt}.
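To make the model concrete, a small sketch of my own drawing the counting variable Xt ∼ P(λt) and checking P(Xt = 0) = e^{−λt} (the rate λ and window t are arbitrary):

    import numpy as np

    rng = np.random.default_rng(2)
    lam, t = 0.7, 2.0                             # catch rate and time window
    counts = rng.poisson(lam * t, size=200_000)   # draws of X_t ~ Poisson(λt)

    print((counts == 0).mean())                   # empirical P(X_t = 0)
    print(np.exp(-lam * t))                       # theoretical e^{−λt}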


Remark
The analogue from before was
Xn = # of girls in the first n children
(so we had a discrete “time period”; the binomial random variable was the
counting random variable.)
Now we want to consider the analogue of the “waiting time” random variables,
the geometric and negative binomials for the binomial process.
Let Y = the time when the first animal is caught.


The proof of the following theorem involves such a beautifully simple idea that I
am going to give it.
Theorem
Y has the exponential distribution with parameter λ.

Proof
We will compute P(Y > t) and show P(Y > t) = e^{−λt}

(so F(t) = P(Y ≤ t) = 1 − e^{−λt}
and f(t) = F′(t) = λe^{−λt}.)


Proof (Cont.)
Here is the beautiful observation. You have to wait longer than t units for the first
animal to be caught
⇔ there are no animals in the trap at time t.
In symbols this says

(Y > t) = (Xt = 0)    (equality of events)

But we have seen

P(Xt = 0) = e^{−λt}, so necessarily

P(Y > t) = e^{−λt}. □
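The theorem can also be seen numerically. The sketch below (my construction, not the lecture's) approximates the Poisson process on a fine grid by independent Bernoulli(λ·dt) trials, records the first catch time in each run, and compares the empirical survival probability P(Y > t) with e^{−λt}:

    import numpy as np

    rng = np.random.default_rng(3)
    lam, dt = 1.0, 0.01            # rate and grid spacing (dt small)
    n_steps, runs = 1000, 10_000   # horizon = n_steps*dt = 10 time units

    # One Bernoulli(λ·dt) trial per grid step approximates the Poisson process
    hits = rng.random((runs, n_steps)) < lam * dt
    first_catch = hits.argmax(axis=1) * dt   # time of the first success per run
    # (runs with no catch at all in the horizon are negligible here: P ≈ e^{−10})

    t = 2.0
    print((first_catch > t).mean())          # empirical P(Y > t)
    print(np.exp(-lam * t))                  # theoretical e^{−λt}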


Now what about the analogue of the negative binomial, the waiting time for the
r-th girl?

The r-Erlang Distribution

Let Yr = the waiting time until the r-th animal is caught.
Theorem

(i) The cdf Fr of Yr is given by

Fr(t) = 1 − e^{−λt} (1 + λt + (λt)²/2! + · · · + (λt)^{r−1}/(r−1)!),  t > 0
Fr(t) = 0,  otherwise

(ii) Differentiating Fr(t) (this is tricky) to get the pdf fr(t), we get

fr(t) = λ^r t^{r−1} e^{−λt}/(r−1)!,  t > 0
fr(t) = 0,  otherwise

Remark
This distribution is called the r-Erlang distribution.
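Since Yr is the waiting time for the r-th arrival, it should also be realizable as a sum of r independent exponential waiting times. Here is a hedged check of that (scipy ships the r-Erlang distribution as scipy.stats.erlang, with integer shape r and scale 1/λ; λ and r below are arbitrary):

    import numpy as np
    from scipy.stats import erlang, kstest

    rng = np.random.default_rng(4)
    lam, r = 2.0, 3
    # Y_r realized as a sum of r independent Exp(λ) waiting times
    samples = rng.exponential(scale=1/lam, size=(100_000, r)).sum(axis=1)

    # Kolmogorov-Smirnov test against the r-Erlang cdf; expect a large p-value
    print(kstest(samples, erlang(r, scale=1/lam).cdf).pvalue)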


Proof
We use the same trick as before:

P(Yr > t) = P(Xt ≤ r − 1)

The waiting time for the r-th animal to arrive in the trap is > t ⇔ at time t
there are ≤ r − 1 animals in the trap.
Since Xt ∼ P(λt) we have

P(Xt ≤ r − 1) = e^{−λt} + e^{−λt} λt + e^{−λt} (λt)²/2! + · · · + e^{−λt} (λt)^{r−1}/(r−1)!


Proof (Cont.)
Now we have to do some hard computation.

P(Xt ≤ r − 1) = e^{−λt} (1 + λt + · · · + (λt)^{r−1}/(r−1)!)

So

Fr(t) = P(Yr ≤ t) = 1 − P(Yr > t)
      = 1 − e^{−λt} (1 + λt + · · · + (λt)^{r−1}/(r−1)!)

But fr(t) = dFr/dt (t), so we have to differentiate the expression on the
right-hand side. Of course d/dt (1) = 0.


Proof (Cont.)

A hard derivative computation

fr(t) = − d/dt (e^{−λt}) · (1 + λt + (λt)²/2! + · · · + (λt)^{r−2}/(r−2)! + (λt)^{r−1}/(r−1)!)
        − e^{−λt} · d/dt (1 + λt + (λt)²/2! + · · · + (λt)^{r−1}/(r−1)!)

      = λe^{−λt} (1 + λt + (λt)²/2! + · · · + (λt)^{r−2}/(r−2)! + (λt)^{r−1}/(r−1)!)
        − e^{−λt} (λ + λ²t + λ³t²/2! + · · · + λ^{r−1}t^{r−2}/(r−2)!)

      = e^{−λt} (λ + λ²t + λ³t²/2! + · · · + λ^{r−1}t^{r−2}/(r−2)! + λ^r t^{r−1}/(r−1)!)
        − e^{−λt} (λ + λ²t + λ³t²/2! + · · · + λ^{r−1}t^{r−2}/(r−2)!)

All the terms cancel in pairs except the last one, so

fr(t) = λ^r t^{r−1} e^{−λt}/(r−1)!  □
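The cancellation can be machine-checked for a concrete r. A sympy sketch of mine, with r = 4 (any fixed r works the same way):

    import sympy as sp

    t, lam = sp.symbols('t lambda', positive=True)
    r = 4

    # F_r(t) = 1 − e^{−λt} · (1 + λt + ... + (λt)^{r−1}/(r−1)!)
    S = sum((lam*t)**k / sp.factorial(k) for k in range(r))
    F = 1 - sp.exp(-lam*t) * S

    f = sp.simplify(sp.diff(F, t))
    print(f)   # prints lambda**4*t**3*exp(-lambda*t)/6, i.e. λ^r t^{r−1} e^{−λt}/(r−1)!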



Lifetimes and Failure Times for Components and Systems

[Figure omitted: a system in which components C1 and C2 are connected in
parallel, and that pair is connected in series with C3.]

Suppose each of the components has a lifetime that is exponentially distributed
with parameter λ (see below for a more precise statement).
Assume the components are independent. How is the system lifetime
distributed?


Solution
Define random variables X1 , X2 , X3 by

(Xi = t ) = (Ci fails at time t ), i = 1, 2, 3

Then Xi is exponentially distributed with parameter λ so

P(Xi ≤ t) = 1 − e^{−λt}, i = 1, 2, 3
P(Xi > t) = e^{−λt}, i = 1, 2, 3.

Define Y by
(Y = t ) = (system fails at time t )


Solution (Cont.)
The key step (using the geometry of the system)
Lump C1 and C2 into a single component A and let W be the corresponding
random variable so
(W = t ) = (A fails at time t )

(Y > t ) = (W > t ) ∩ (X3 > t )


(the system is working at time t ⇔ both A and C3 are working at time t)


The Golden Rule
Try to get ∩ instead of ∪; that’s why I chose (Y > t) on the left.
Hence

P(Y > t) = P((W > t) ∩ (X3 > t))
         = P(W > t) · P(X3 > t)    (#)   (by independence)

Why are (W > t ) and (X3 > t ) independent?

Answer
Suppose C1, C2, . . . , Cn are independent components, and suppose
A = a subcollection of the Ci’s and
B = another subcollection of the Ci’s.


Answer (Cont.)
Then A and B are independent ⇔ they have no common component.
So now we need P(W > t), where W is the lifetime of the lumped component A.

I should switch to P(W ≤ t) to get intersections, but I won’t, so as to show you
why unions give extra terms.


(W > t) = (X1 > t) ∪ (X2 > t)
(A is working at time t ⇔ either C1 is or C2 is)
So

P(W > t) = P(X1 > t) + P(X2 > t) − P((X1 > t) ∩ (X2 > t))
         = e^{−λt} + e^{−λt} − e^{−λt} · e^{−λt}    (by independence)
         = 2e^{−λt} − e^{−2λt}

Here −e^{−2λt} is the extra term that the union forces on us.

Now from (#)

P(Y > t) = P(W > t) · P(X3 > t)
         = (2e^{−λt} − e^{−2λt}) e^{−λt}
         = 2e^{−2λt} − e^{−3λt}

so the cdf of Y is given by

P(Y ≤ t) = 1 − P(Y > t)
         = 1 − 2e^{−2λt} + e^{−3λt}

That’s good enough.
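As a final check (my addition), simulate the system directly: the parallel pair A survives until the later of C1, C2 fails, and the series connection with C3 fails at the earlier failure, so Y = min(max(X1, X2), X3):

    import numpy as np

    rng = np.random.default_rng(5)
    lam, n = 1.0, 200_000
    X = rng.exponential(scale=1/lam, size=(3, n))  # lifetimes X1, X2, X3

    W = np.maximum(X[0], X[1])   # A fails only when both C1 and C2 have failed
    Y = np.minimum(W, X[2])      # the system fails when A or C3 fails

    t = 0.5
    print((Y > t).mean())                           # empirical P(Y > t)
    print(2*np.exp(-2*lam*t) - np.exp(-3*lam*t))    # theoretical 2e^{−2λt} − e^{−3λt}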
