Stochastic Process
Introduction
Noha Youssef
The American University in Cairo
nayoussef@aucegypt.edu
Table of Contents
Introduction
Definition of a Stochastic Process
Examples
Markov Property
Deterministic and Stochastic Models
Introduction
Stochastic processes are systems that evolve in time (usually)
while undergoing random fluctuations. Studying them helps us
understand how the world works through an understanding of chance:
by learning what can be said about unpredictable events and by
learning how to deal with the occurrence of unpredictable events.
Examples
Below is a brief list of some important areas in which stochastic
processes arise:
Economics: daily stock market quotations or monthly
unemployment figures.
Social sciences: population birth rates and school-enrollment
series have been followed for many centuries in several
countries.
Epidemiology: numbers of influenza cases are often monitored over
long periods of time.
Medicine: blood pressure measurements are traced over time to
evaluate the impact of pharmaceutical drugs used in
treating hypertension.
Markov chain Monte Carlo (MCMC): a modern statistical tool
which brings together results from stochastic processes and
simulation, enabling potentially computationally
intractable (Bayesian) statistics to be used in
practice.
Definition of a Stochastic Process
Definition 2.1
A stochastic process is a family of random variables {X_θ},
indexed by a parameter θ, where θ belongs to some index set Θ.
If Θ is a set of integers, representing specific time points, we have a
stochastic process in discrete time and we shall replace the general
subscript θ by t.
In a spatial process, θ would be a vector, representing location in
space rather than time.
Guess the definition of spatio-temporal processes!
Definitions
Definition 2.2
The index t is often interpreted as time and, thus, X(t) is referred
to as the state of the process at time t.
Definition 2.3
The set T is called the index set (or parameter set).
Definition 2.4
If T is a countable set (e.g., N0 or Z) then we are dealing with a
discrete-time process.
Definition 2.5
If T is a continuum (e.g., R) then {X(t) : t ∈ T } is said to be a
continuous-time process.
More Definitions
Definition 2.6
For each t ∈ T , X(t) is a r.v. Any realization of the stochastic
process {X(t) : t ∈ T } is called a sample path.
Definition 2.7
The set of all possible values that the r.v. X(t) can take, over all t,
say S, is said to be the state space of the stochastic process
{X(t) : t ∈ T }.
Definition 2.8
If S is a countable set (e.g., N0 or Z), {X(t) : t ∈ T } is a
discrete-value process.
Example: Gambler’s ruin
Player A has $k and Player B has $(a − k), with a > k > 0. They play a
series of games in which A has probability p of winning each game and
q = 1 − p of losing. Define the r.v. X_n as A's money after n games.
Then (x_1, x_2, · · · , x_n) is a realisation of a discrete-time random process.
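A minimal sketch of how one such realisation could be simulated, assuming Python with numpy; the parameter values k = 10, a = 20, p = 0.5 are illustrative and not part of the example above:

```python
import numpy as np

def gamblers_ruin_path(k, a, p, rng):
    """Simulate A's fortune after each game until A has $0 or $a."""
    path = [k]
    fortune = k
    while 0 < fortune < a:
        # A wins $1 with probability p, loses $1 with probability q = 1 - p
        fortune += 1 if rng.random() < p else -1
        path.append(fortune)
    return path

rng = np.random.default_rng(seed=1)  # seed chosen arbitrarily for reproducibility
# Illustrative values: A starts with $10 of a total $20, fair game (p = 0.5)
print(gamblers_ruin_path(k=10, a=20, p=0.5, rng=rng))
```

Each call with a fresh seed produces a different sample path (x_1, x_2, · · · ) of the same discrete-time process.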
Cont'd
Definition 2.9
If S is a continuum (e.g., R) then {X(t) : t ∈ T } is a
continuous-value process.
Definition 2.10
A discrete-time random process is observed only at specific time points.
More Examples
Continuous-time random process
e.g. the number of customers in the queue for an ATM.
Discrete state space
e.g. Gambler's ruin; the number of ATM customers.
Continuous state space
e.g. the closing price of a given stock each day is a discrete-time
stochastic process with a continuous state space.
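A minimal sketch of the last example, assuming Python with numpy; the random-walk form, starting price, and volatility are illustrative assumptions, not a model stated in the slides. One value per day gives discrete time, while the price can take any value on a continuum:

```python
import numpy as np

def daily_closing_prices(p0, n_days, sd, rng):
    """Discrete time (one value per day), continuous state space (any positive price)."""
    prices = [p0]
    for _ in range(n_days):
        prices.append(max(0.01, prices[-1] + rng.normal(0.0, sd)))  # keep the price positive
    return prices

rng = np.random.default_rng(seed=4)
print(daily_closing_prices(p0=100.0, n_days=5, sd=1.5, rng=rng))
```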
More Definitions
• A counting process is a process X(t), in discrete or
continuous time, for which the possible values of X(t) are the
natural numbers (0, 1, 2, · · · ), with the property that X(t) is a
non-decreasing function of t. Often, X(t) can be thought of
as counting the number of 'events' of some type that have
occurred by time t. The basic example of a counting process
is the Poisson process (see the sketch after this list).
• A sample path of a stochastic process is a particular
realization of the process, i.e. a particular set of values X(t)
for all t (which may be discrete or continuous), generated
according to the (stochastic) 'rules' of the process.
• The increments of a process are the changes X(t) − X(s)
between time points s and t, (s < t).
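A rough sketch of a Poisson counting process, assuming Python with numpy and an arbitrary rate λ = 2: inter-event times are exponential, and X(t) counts the events that have occurred by time t, a non-decreasing function of t:

```python
import numpy as np

def poisson_event_times(rate, horizon, rng):
    """Return event times of a Poisson process with the given rate on [0, horizon]."""
    times = []
    t = rng.exponential(1.0 / rate)  # first inter-event time
    while t <= horizon:
        times.append(t)
        t += rng.exponential(1.0 / rate)  # add the next exponential inter-event time
    return np.array(times)

rng = np.random.default_rng(seed=2)
events = poisson_event_times(rate=2.0, horizon=5.0, rng=rng)  # lambda = 2, T = 5 (illustrative)
# X(t): number of events by time t -- a non-decreasing counting process
X = lambda t: int(np.searchsorted(events, t, side="right"))
print(X(1.0), X(3.0), X(5.0))
```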
Markov Property
We are often interested in conditional distributions of the form
Pr(X_{t_k} | X_{t_{k−1}}, X_{t_{k−2}}, · · · , X_{t_1})
for times t_k > t_{k−1} > t_{k−2} > · · · > t_1.
In general, this conditional distribution will depend upon the values of
X_{t_{k−1}}, X_{t_{k−2}}, · · · , X_{t_1}. However, we shall focus particularly in this
module on processes that satisfy the Markov property, which says
that
Pr(X_{t_k} | X_{t_{k−1}}, X_{t_{k−2}}, · · · , X_{t_1}) = Pr(X_{t_k} | X_{t_{k−1}}).
Given the present (X_{t_{k−1}}), the future (X_{t_k}) is independent of the
past (X_{t_{k−2}}, X_{t_{k−3}}, · · · , X_{t_1}).
The Markov property is sometimes referred to as the lack of memory
property.
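A small sketch of a discrete-time Markov chain, assuming Python with numpy and an arbitrary two-state transition matrix P; each new state is drawn using only the current state, so the Markov property holds by construction:

```python
import numpy as np

# Illustrative two-state transition matrix: P[i, j] = Pr(next state = j | current state = i)
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

def simulate_markov_chain(P, start, n_steps, rng):
    """Simulate a Markov chain; each step depends only on the current state."""
    states = [start]
    for _ in range(n_steps):
        current = states[-1]
        states.append(rng.choice(len(P), p=P[current]))  # uses only the present state
    return states

rng = np.random.default_rng(seed=3)
print(simulate_markov_chain(P, start=0, n_steps=10, rng=rng))
```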
Deterministic and Stochastic Models
A deterministic model is specified by a set of equations that describe
exactly how the system will evolve over time.
In a stochastic model, the evolution is at least partially random and,
if the process is run several times, it will not give identical results.
Different runs of a stochastic process are often called realizations
of the process.
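A brief sketch contrasting the two, assuming Python with numpy; the growth recursion and noise level are chosen arbitrarily for illustration. The deterministic function returns the same trajectory every run, while repeated calls to the stochastic one give different realizations:

```python
import numpy as np

def deterministic_growth(x0, rate, n_steps):
    """Deterministic model: the same equations give the same trajectory every run."""
    x = [x0]
    for _ in range(n_steps):
        x.append(x[-1] * rate)
    return x

def stochastic_growth(x0, rate, noise_sd, n_steps, rng):
    """Stochastic model: each run (realization) differs because of random fluctuations."""
    x = [x0]
    for _ in range(n_steps):
        x.append(x[-1] * rate + rng.normal(0.0, noise_sd))
    return x

rng = np.random.default_rng()
print(deterministic_growth(1.0, 1.05, 5))          # identical on every run
print(stochastic_growth(1.0, 1.05, 0.1, 5, rng))   # a different realization each run
```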