Review of Random Processes
Lecture 14
EEE 352 Analog Communication Systems
Mansoor Khan
Electrical Engineering Dept.
CIIT Islamabad Campus
Random Variable
A random variable is a function that maps the outcomes of a random
experiment to real numbers. Each outcome occurs according to a certain
probability distribution; therefore, a random variable is completely
characterized by its probability density function (PDF).
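A minimal sketch of this mapping (a hypothetical coin-flip experiment; NumPy is assumed here and is not part of the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Random experiment: 100,000 coin flips (0 = tails, 1 = heads).
outcomes = rng.integers(0, 2, size=100_000)

# Random variable X: maps each outcome to a real number (heads -> +1, tails -> -1).
X = np.where(outcomes == 1, 1.0, -1.0)

# The empirical distribution of X characterizes it.
values, counts = np.unique(X, return_counts=True)
print(values, counts / len(X))   # both relative frequencies approach 0.5
```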
Random Process
A random variable that is a function of time is called a random (or
stochastic) process. A random process can thus be viewed as a collection
of infinitely many random variables, one for each time instant.
Examples of random processes
• Daily stream flow
• Hourly rainfall of storm events
• Stock index
Stochastic Process
Example: let X(t) be the number of people in a particular railway
station from 8 AM up to time t (t ≥ 8). Clearly, for each given t, X(t) is a
random variable. The table below lists some possible values that X(t) could take:
date     X(9)   X(10)  X(11)  X(12)  X(13)  X(14)  …
Oct. 12  1360   1412   1750   1603   1598   1821   …
Oct. 13  1362   1490   1713   1641   1601   1845   …
Oct. 14  1289   1472   1739   1593   1614   1864   …
Oct. 15  1313   1453   1721   1631   1622   1871   …
Oct. 16  1368   1481   ?      ?      ?      ?      …
Oct. 17  ?      ?      ?      ?      ?      ?      …
Stochastic Process
• Stochastic processes are processes that proceed randomly in time.
• Rather than consider fixed random variables X, Y, etc., or even
sequences of random variables, we consider sequences X0, X1,
X2, …, where Xt represents some random quantity at time t.
• In general, the value Xt might depend on the quantity Xt−1 at time t−1,
or even on the values Xs at other times s < t.
• Example: the number of people at the railway station from 8 AM onwards; a simulation sketch follows below.
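A minimal simulation sketch of such a process (a simple random walk, chosen here purely for illustration; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(1)

# Three realizations of a discrete-time process X0, X1, X2, ...,
# where each Xt depends on Xt-1 (here Xt = Xt-1 +/- 1, a random walk).
n_steps, n_realizations = 100, 3
steps = rng.choice([-1, 1], size=(n_realizations, n_steps))
X = np.cumsum(steps, axis=1)   # X[i, t]: value of realization i at time t

# For a fixed t, the column X[:, t] is a sample of the random variable Xt,
# just like a column of the railway-station table above.
print(X[:, 10])
```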
Continuous and Discrete Time Stochastic Process
• A stochastic process is a family of time-indexed random variables Xt,
where t belongs to an index set. In formal notation, $\{X_t : t \in I\}$, where I is
an index set that is a subset of $\mathbb{R}$.
• Examples of index sets:
1) $I = (-\infty, \infty)$ or $I = [0, \infty)$. In this case Xt is a continuous-time
stochastic process.
2) $I = \{0, \pm 1, \pm 2, \dots\}$ or $I = \{0, 1, 2, \dots\}$. In this case Xt is a
discrete-time stochastic process.
• We use uppercase letters, {Xt}, to describe the process. A time series
{xt} is a realization, or sample function, of a certain process.
• We use information from a time series to estimate the parameters and
properties of the process {Xt}.
Probability Distribution of a Process
• For any stochastic process with index set I, its probability
distribution function is uniquely determined by its finite
dimensional distributions.
• The k-dimensional distribution function of a process is defined by
$$F_{X_{t_1},\dots,X_{t_k}}(x_1,\dots,x_k) = P\{X_{t_1} \le x_1, \dots, X_{t_k} \le x_k\}$$
for any $t_1, \dots, t_k \in I$ and any real numbers $x_1, \dots, x_k$.
• The distribution function tells us everything we need to know about
the process {Xt }.
Statistical Averages Summary
• The first moment of the probability distribution of a random variable X
is called the mean value $m_X$, or expected value, of X:
$$m_X = E\{X\} = \int_{-\infty}^{\infty} x\, p_X(x)\, dx$$
• The second moment of the probability distribution is the mean-square
value of X:
$$E\{X^2\} = \int_{-\infty}^{\infty} x^2\, p_X(x)\, dx$$
• Central moments are the moments of the difference between X and $m_X$;
the second central moment is the variance of X:
$$\mathrm{Var}(X) = E\{(X - m_X)^2\} = \int_{-\infty}^{\infty} (x - m_X)^2\, p_X(x)\, dx$$
• The variance is equal to the difference between the mean-square value
and the square of the mean:
$$\mathrm{Var}(X) = E\{X^2\} - (E\{X\})^2$$
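A quick numerical check of these relations (a Gaussian X with $m_X = 3$ and variance 4 is assumed here purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(loc=3.0, scale=2.0, size=1_000_000)   # X ~ N(3, 4)

m1 = x.mean()                 # first moment: mean value mX
m2 = (x ** 2).mean()          # second moment: mean-square value E{X^2}
var = ((x - m1) ** 2).mean()  # second central moment: variance

# Var(X) = E{X^2} - (E{X})^2; both numbers below are ~4.
print(var, m2 - m1 ** 2)
```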
Stationary Processes
• A process is said to be strictly stationary if $(X_{t_1}, \dots, X_{t_k})$ has the same
joint distribution as $(X_{t_1+\tau}, \dots, X_{t_k+\tau})$ for every shift $\tau$. That is,
$$F_{X_{t_1},\dots,X_{t_k}}(x_1,\dots,x_k) = F_{X_{t_1+\tau},\dots,X_{t_k+\tau}}(x_1,\dots,x_k)$$
• If {Xt} is a strictly stationary process and $E\{X_t^2\} < \infty$, then the mean
function is a constant and the variance function is also a constant.
• Moreover, for a strictly stationary process with its first two moments
finite, the covariance function and the correlation function depend
only on the time difference.
Weak Stationarity
• Strict stationarity is too strong a condition in practice. It is often a
difficult assumption to assess based on an observed time series x1,…,xk.
• In time series analysis we therefore often use a weaker sense of stationarity
in terms of the moments of the process.
• A process is said to be nth-order weakly stationary if all its joint
moments up to order n exist and are time invariant, i.e., independent
of the time origin.
• For example, a second-order weakly stationary process has
constant mean and variance, with the covariance and the correlation
being functions of the time difference alone.
• A strictly stationary process with its first two moments finite is also
second-order weakly stationary. Such a process is also called a wide-sense
stationary (WSS) process.
First order stationary process
The probability density function of a first-order stationary process
satisfies
$$p_x(x_1, t_1) = p_x(x_1, t_1 + \Delta) \quad \text{for all } t_1, \Delta$$
If x(t) is a first-order stationary process, then
$$E\{x(t)\} = E\{x(t+\Delta)\} = m_x = \text{constant}$$
Second order stationary process
The second-order density function of a second-order stationary process
satisfies
$$p_x(x_1, x_2; t_1, t_2) = p_x(x_1, x_2; t_1 + \Delta, t_2 + \Delta) \quad \text{for all } t_1, t_2, \Delta$$
If x(t) is a second-order stationary process, then its autocorrelation
depends only on the lag $\tau$:
$$R_{xx}(t_1, t_1 + \tau) = E\{x(t_1)\, x(t_1 + \tau)\} = R_{xx}(\tau)$$
Wide Sense Stationary
A process is a wide-sense stationary (WSS) process if
$$E\{x(t)\} = m_x = \text{constant}$$
$$E\{x(t_1)\, x(t_1 + \tau)\} = R_{xx}(\tau)$$
A second-order stationary process is also a wide-sense stationary
process (the converse need not hold).
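A sketch that checks both WSS conditions numerically (the random-phase cosine used here is a standard textbook WSS example, chosen as an assumption of this sketch; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(3)

# x(t) = cos(2*pi*f0*t + phi), phi uniform on [0, 2*pi): a WSS process
# with E{x(t)} = 0 and Rxx(tau) = 0.5*cos(2*pi*f0*tau).
f0, dt, n_t, n_real = 5.0, 0.01, 64, 20_000
t = np.arange(n_t) * dt
phi = rng.uniform(0.0, 2.0 * np.pi, size=(n_real, 1))
x = np.cos(2.0 * np.pi * f0 * t + phi)

print(x.mean(axis=0)[:4])                  # ensemble mean ~ 0 at every t
lag = 10                                   # tau = 10*dt = 0.1 s
print((x[:, 0] * x[:, lag]).mean(),        # E{x(t1) x(t1+tau)} depends only on
      (x[:, 20] * x[:, 20 + lag]).mean())  # tau: both ~ Rxx(0.1) = -0.5
```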
Estimation of the mean
• Given a single realization {xt} of a stationary process {Xt}, a natural
estimator of the mean $E\{X_t\}$ is the sample mean
$$\bar{x} = \frac{1}{n} \sum_{t=1}^{n} x_t$$
which is the time average of the n observations.
Sample Autocovariance Function
• Given a single realization {xt} of a stationary process {Xt}, the
sample autocovariance function, given by
$$\hat{\gamma}(k) = \frac{1}{n} \sum_{t=1}^{n-k} (x_t - \bar{x})(x_{t+k} - \bar{x})$$
is an estimate of the autocovariance function.
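A minimal implementation of both estimators (white Gaussian noise is used as the test realization, an assumption of this sketch; NumPy assumed):

```python
import numpy as np

def sample_mean(x):
    # xbar = (1/n) * sum of x_t: the time average of the observations
    return x.mean()

def sample_autocovariance(x, k):
    # gamma_hat(k) = (1/n) * sum_{t=1}^{n-k} (x_t - xbar)(x_{t+k} - xbar)
    n, xbar = len(x), x.mean()
    return np.sum((x[: n - k] - xbar) * (x[k:] - xbar)) / n

rng = np.random.default_rng(4)
x = rng.normal(size=10_000)          # white-noise realization, variance 1
print(sample_mean(x))                # ~ 0
print(sample_autocovariance(x, 0))   # ~ 1 (the variance)
print(sample_autocovariance(x, 5))   # ~ 0 (white noise is uncorrelated)
```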
Correlation Functions
The correlation of $x_1(t)$ and $x_2(t)$ is
$$R_{x_1 x_2}(t_1, t_2) = E\{x_1(t_1)\, x_2(t_2)\}$$
Time Average
The time average of x(t) is
$$\langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$
The time autocorrelation function is
$$\mathcal{R}_{xx}(\tau) = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, x(t+\tau)\, dt$$
Ergodic Process
If the time average and time autocorrelation are equal to the statistical
average and statistical autocorrelation, then the process is ergodic:
$$\int_{-\infty}^{\infty} x\, p(x)\, dx = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, dt$$
$$E\{x(t)\, x(t+\tau)\} = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} x(t)\, x(t+\tau)\, dt$$
Ergodicity is very restrictive. The assumption of ergodicity is used to
simplify the problem.
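A numerical sketch of the ergodicity property (again using the random-phase cosine, for which time and ensemble averages agree; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(5)
f0, dt, n_t, n_real = 5.0, 0.001, 10_000, 500
t = np.arange(n_t) * dt
phi = rng.uniform(0.0, 2.0 * np.pi, size=(n_real, 1))
x = np.cos(2.0 * np.pi * f0 * t + phi)   # random-phase cosine process

time_avg = x[0].mean()           # time average over one long realization
ensemble_avg = x[:, 0].mean()    # statistical average across realizations

print(time_avg, ensemble_avg)    # both ~ 0: the two averages coincide
```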
Properties of PSD
Some properties of the PSD are:
• $P_x(f)$ is always real
• $P_x(f) \ge 0$
• When x(t) is real, $P_x(-f) = P_x(f)$
• If x(t) is WSS, the PSD integrates to the total normalized power P:
$$\int_{-\infty}^{\infty} P_x(f)\, df = P = \overline{x^2} = R_x(0)$$
• The PSD at zero frequency is
$$P_x(0) = \int_{-\infty}^{\infty} R_x(\tau)\, d\tau$$
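A sketch verifying the real/nonnegative properties and the total-power relation on a white-noise realization (the periodogram scaling below is one common convention, assumed for illustration; NumPy assumed):

```python
import numpy as np

rng = np.random.default_rng(6)
n, dt = 2**16, 1e-3
x = rng.normal(size=n)               # white-noise realization, E{x^2} = 1

# Periodogram estimate of the PSD: real-valued and nonnegative.
X = np.fft.rfft(x)
f = np.fft.rfftfreq(n, d=dt)
Pxf = (np.abs(X) ** 2) * dt / n      # two-sided PSD estimate at f >= 0

df = f[1] - f[0]
total_power = 2.0 * Pxf.sum() * df   # integral over both halves of the axis
print(total_power, np.mean(x ** 2))  # both ~ Rx(0) = E{x^2} = 1
```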
Multiple Random Processes
Linear Systems
• Recall that for LTI systems:
$$y(t) = h(t) * x(t) \quad \Longleftrightarrow \quad Y(f) = H(f)\, X(f)$$
• This is still valid if x and y are random processes; x might be
signal plus noise, or just noise.
• What are the autocorrelation and PSD of y(t) when x(t) is known?
[Block diagram: the input x(t), described by X(f), $R_x(\tau)$, $P_x(f)$, drives a
linear network with h(t), H(f); the output y(t) = h(t) * x(t) is described by
$Y(f) = H(f)X(f)$, $R_y(\tau) = h(\tau) * h(-\tau) * R_x(\tau)$, and $P_y(f) = |H(f)|^2 P_x(f)$.]
Output of an LTI System
• Theorem: If a WSS random process x(t) is applied to an LTI
system with impulse response h(t), the output autocorrelation is:
$$R_y(\tau) = E\{y(t)\, y(t+\tau)\} = E\left\{ \int_{-\infty}^{\infty} h(\tau_1)\, x(t-\tau_1)\, d\tau_1 \int_{-\infty}^{\infty} h(\tau_2)\, x(t+\tau-\tau_2)\, d\tau_2 \right\}$$
$$= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\tau_1)\, h(\tau_2)\, R_x(\tau+\tau_1-\tau_2)\, d\tau_1\, d\tau_2 = h(-\tau) * h(\tau) * R_x(\tau)$$
• And the output PSD is:
$$P_y(f) = |H(f)|^2\, P_x(f)$$
• The power transfer function is:
$$G_h(f) = \frac{P_y(f)}{P_x(f)} = |H(f)|^2$$
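A sketch verifying $P_y(f) = |H(f)|^2 P_x(f)$ by passing white noise through an example filter (SciPy and the 4th-order Butterworth filter are assumptions of this sketch, not part of the lecture):

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(7)
fs, n = 1000.0, 2**17
x = rng.normal(size=n)                    # white input: Px(f) ~ constant

b, a = signal.butter(4, 100.0, fs=fs)     # example LTI system, 100 Hz cutoff
y = signal.lfilter(b, a, x)               # y(t) = h(t) * x(t)

f, Px = signal.welch(x, fs=fs, nperseg=4096)   # estimated input PSD
_, Py = signal.welch(y, fs=fs, nperseg=4096)   # estimated output PSD
_, H = signal.freqz(b, a, worN=f, fs=fs)       # frequency response H(f)

# Measured PSD ratio vs. the power transfer function |H(f)|^2
for k in (10, 200, 400, 600):
    print(f[k], Py[k] / Px[k], np.abs(H[k]) ** 2)
```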