408616 DIGITAL COMMUNICATION
Lecture-4
                      Chapter 1
1.5 RANDOM VARIABLES
All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent.
 Let a random variable X(A) represent the functional relationship
 between a random event A and a real number. The random
 variable may be discrete or continuous.
(CUMULATIVE) DISTRIBUTION FUNCTION
The (cumulative) distribution function F_X(x) of the random variable X is given by

    F_X(x) = P(X ≤ x)                                              (1.24)

where P(X ≤ x) is the probability that the value taken by the random variable X is less than or equal to a real number x.
The distribution function F_X(x) has the following properties:
1. 0 ≤ F_X(x) ≤ 1
2. F_X(x1) ≤ F_X(x2)   if x1 ≤ x2
3. F_X(−∞) = 0
4. F_X(+∞) = 1
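As a quick numerical illustration, the sketch below checks these four properties on an empirical CDF built from samples. It is a minimal Python/NumPy example; the standard Gaussian distribution and the sample count are assumptions chosen only for illustration.

```python
# Minimal sketch: empirical CDF of an assumed standard-Gaussian X,
# checked against the four properties of F_X(x).
import numpy as np

rng = np.random.default_rng(0)
samples = rng.standard_normal(100_000)          # draws of the RV X

def F_X(x):
    """Empirical CDF: fraction of samples with X <= x (eq. 1.24)."""
    return np.mean(samples <= x)

xs = np.linspace(-4.0, 4.0, 9)
Fs = np.array([F_X(v) for v in xs])

assert np.all((Fs >= 0.0) & (Fs <= 1.0))        # property 1: 0 <= F_X(x) <= 1
assert np.all(np.diff(Fs) >= 0.0)               # property 2: non-decreasing
print(F_X(-10.0), F_X(10.0))                    # ~0 and ~1: properties 3 and 4
```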
PROBABILITY DENSITY FUNCTION
 Another useful function relating to the random variable X
 is the probability density function (pdf)
    p_X(x) = (d/dx) F_X(x)                                         (1.25)
The pdf has the following properties:
1. p_X(x) ≥ 0
2. ∫_{−∞}^{+∞} p_X(x) dx = F_X(+∞) − F_X(−∞) = 1
The pdf is a nonnegative function with a total area of 1.
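A small numerical sketch of (1.25) and the two pdf properties follows; the Gaussian pdf and the grid spacing are assumptions for the example.

```python
# Minimal sketch: the pdf integrates to 1, and differentiating the CDF
# recovers the pdf (eq. 1.25). Gaussian pdf assumed for illustration.
import numpy as np

x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
p_X = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)    # assumed Gaussian pdf, >= 0

print(p_X.sum() * dx)                           # total area ~1.0 (property 2)

F_X = np.cumsum(p_X) * dx                       # CDF by cumulative integration
p_rec = np.gradient(F_X, x)                     # d/dx F_X(x), eq. (1.25)
print(np.max(np.abs(p_rec - p_X)))              # small numerical error
```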
STATISTICAL AVERAGES FOR A RANDOM VARIABLE
The first moment of the probability distribution of a random variable X is called the mean value m_X, or expected value, of the random variable X (ensemble average):

    m_X = E{X} = ∫_{−∞}^{+∞} x p_X(x) dx

The second moment of the probability distribution is the mean-square value of X:

    E{X²} = ∫_{−∞}^{+∞} x² p_X(x) dx

Central moments are the moments of the difference between X and m_X; the second central moment is the variance σ² of X:

    var(X) = E{(X − m_X)²} = ∫_{−∞}^{+∞} (x − m_X)² p_X(x) dx

The variance equals the difference between the mean-square value and the square of the mean:

    var(X) = σ_X² = E{X²} − E{X}²
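These identities can be checked numerically. The sketch below is a minimal NumPy example; the Gaussian parameters (mean 2, standard deviation 3) are assumptions for illustration.

```python
# Minimal sketch: first moment, second moment, variance, and the identity
# var(X) = E{X^2} - E{X}^2, estimated from assumed Gaussian samples.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(2.0, 3.0, 1_000_000)   # assumed: m_X = 2, sigma_X = 3

m_X = X.mean()                        # first moment (ensemble average)
ms = np.mean(X**2)                    # second moment (mean-square value)
var = np.mean((X - m_X)**2)           # second central moment (variance)

print(m_X, ms, var)                   # ~2.0, ~13.0, ~9.0
print(ms - m_X**2)                    # equals the variance, ~9.0
```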
[Figure: The effect of variance on a Gaussian distribution]
1.5.2 RANDOM PROCESSES
A random process (RP) X(A, t) can be viewed as a function of two variables: an event A and time t.
The totality of all sample functions is called an ensemble.
By observing the random process at time t_k, a RV X(t_k) is obtained.
The pdf of this RV represents the density over the ensemble of events at time t_k and is denoted p_{X(t_k)}(x).
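The ensemble view can be made concrete with a toy process. Below, a sinusoid with a random phase (an assumed illustrative model, not taken from the lecture) plays the role of X(A, t): each phase draw is one event A, each row is one sample function, and slicing all rows at one index yields the RV X(t_k).

```python
# Minimal sketch of an ensemble: rows are sample functions of an assumed
# random-phase sinusoid; a column slice is the RV X(t_k).
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 500)                   # time axis
theta = rng.uniform(0, 2 * np.pi, 10_000)        # one random event A per member
ensemble = np.cos(2 * np.pi * 5 * t + theta[:, None])   # shape (members, time)

k = 100                                          # index of observation time t_k
X_tk = ensemble[:, k]                            # the RV X(t_k) over the ensemble
print(X_tk.mean(), X_tk.var())                   # ~0 and ~0.5 for this model
```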
             STATISTICAL AVERAGES OF A RP
 A random process whose distribution functions are continuous can
  be described statistically with a pdf.
Mean of the random process X(t):

    E{X(t_k)} = ∫_{−∞}^{+∞} x p_{X(t_k)}(x) dx = m_X(t_k)          (1.30)

where X(t_k) is the RV obtained by observing the random process at time t_k, and the pdf of X(t_k) is p_{X(t_k)}(x).
Autocorrelation function of the random process X(t):

    R_X(t1, t2) = E{X(t1) X(t2)}                                   (1.31)
 The autocorrelation function is a measure of the degree to which two
  time samples from the same RP are related.
A partial description consisting of the mean and autocorrelation function is often adequate for the needs of communication systems.
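A short sketch of (1.30) and (1.31) follows, reusing the assumed random-phase sinusoid from the earlier example; for that model the theoretical autocorrelation 0.5·cos(2π·5·(t1 − t2)) is known, so the estimate can be checked.

```python
# Minimal sketch: ensemble-average estimates of the mean (1.30) and the
# autocorrelation (1.31) for the assumed random-phase sinusoid.
import numpy as np

rng = np.random.default_rng(3)
t = np.linspace(0.0, 1.0, 500)
theta = rng.uniform(0, 2 * np.pi, 10_000)
X = np.cos(2 * np.pi * 5 * t + theta[:, None])   # rows = realizations

k1, k2 = 100, 150
m_tk = X[:, k1].mean()                           # (1.30): E{X(t_k)} ~ 0
R_12 = np.mean(X[:, k1] * X[:, k2])              # (1.31): E{X(t1) X(t2)}
print(m_tk, R_12)
print(0.5 * np.cos(2 * np.pi * 5 * (t[k1] - t[k2])))   # theoretical value
```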
   1.5.2.2 STATIONARITY
 A random process X(t) is said to be stationary in the strict sense if
  none of its statistics are affected by a shift in the time origin.
 A random process is said to be wide-sense stationary (WSS) if two of
  its statistics, its mean and autocorrelation function, do not vary with a
  shift in the time origin.
    E{X(t)} = m_X = a constant                                     (1.32)

    R_X(t1, t2) = R_X(t1 − t2)                                     (1.33)
For stationary processes, the autocorrelation function does not depend on absolute time but only on the difference between t1 and t2; i.e., all pairs of values of X(t) at points in time separated by τ = t1 − t2 have the same autocorrelation, which can therefore be written simply as R_X(τ).
From a practical point of view, it is not necessary for a RP to be stationary for all time, but only for some observation interval of interest.
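The WSS conditions (1.32)–(1.33) can be probed numerically: for the assumed random-phase sinusoid, the mean is constant and the autocorrelation at a fixed lag is the same wherever the pair of samples sits on the time axis.

```python
# Minimal sketch: for an assumed WSS process, E{X(t)} is constant and
# R_X(t1, t2) depends only on the lag t1 - t2, not on the time origin.
import numpy as np

rng = np.random.default_rng(4)
t = np.linspace(0.0, 1.0, 500)
X = np.cos(2 * np.pi * 5 * t + rng.uniform(0, 2 * np.pi, (10_000, 1)))

lag = 20                                          # fixed separation in samples
R_early = np.mean(X[:, 50] * X[:, 50 + lag])      # pair near the start
R_late = np.mean(X[:, 300] * X[:, 300 + lag])     # same lag, shifted origin
print(X[:, 50].mean(), X[:, 300].mean())          # both ~0: constant mean
print(R_early, R_late)                            # approximately equal
```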
WIDE SENSE STATIONARY PROCESS (WSS)
 A Wide Sense Stationary process is characterized by:
1.     A constant mean over time.
2.     A time-invariant autocovariance function, which depends
       only on the time difference (lag) between two points.
3.     A finite and constant variance.
 WSS is an important concept in signal processing and
     communication systems, as many signals and noise models
     assume WSS properties for simplification.
 White Noise: White noise is an example of a WSS process. It
     has a constant mean (usually zero), constant variance, and its
     autocovariance function depends only on the time difference,
     not on the actual time.
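The white-noise claim in the last bullet can be illustrated with discrete-time samples; the Gaussian amplitude distribution and unit variance below are assumptions for the sketch.

```python
# Minimal sketch: discrete-time white Gaussian noise as a WSS process --
# zero mean, constant variance, autocovariance ~0 at every nonzero lag.
import numpy as np

rng = np.random.default_rng(5)
n = rng.normal(0.0, 1.0, 1_000_000)               # assumed white noise samples

def autocov(x, lag):
    """Sample autocovariance at the given nonnegative lag."""
    x0 = x - x.mean()
    return np.mean(x0[:len(x0) - lag] * x0[lag:])

print(n.mean(), n.var())                          # ~0 and ~1
print([round(autocov(n, k), 4) for k in range(4)])  # ~[1, 0, 0, 0]
```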
1.5.2.3 AUTOCORRELATION OF A WIDE-SENSE STATIONARY RANDOM PROCESS
For a wide-sense stationary process, the autocorrelation function is a function only of the time difference τ = t1 − t2:

    R_X(τ) = E{X(t) X(t + τ)}   for −∞ < τ < ∞                     (1.34)
The autocorrelation function indicates how rapidly a RP varies: a slowly fluctuating process remains correlated over large values of τ, while a rapidly fluctuating one decorrelates quickly.
Properties of the autocorrelation function of a real-valued wide-sense stationary process:
1. R_X(τ) = R_X(−τ)            symmetrical in τ about zero
2. R_X(τ) ≤ R_X(0) for all τ   maximum value occurs at the origin
3. R_X(τ) ↔ G_X(f)             autocorrelation and PSD form a Fourier transform pair
4. R_X(0) = E{X²(t)}           value at the origin equals the average power of the signal
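The first, second, and fourth properties can be verified numerically for the assumed random-phase sinusoid (the Fourier-pair property with G_X(f) would need a PSD estimate and is omitted here):

```python
# Minimal sketch: symmetry of R_X(tau), maximum at the origin, and
# R_X(0) = E{X^2(t)}, for an assumed random-phase sinusoid ensemble.
import numpy as np

rng = np.random.default_rng(6)
t = np.arange(500)
X = np.cos(2 * np.pi * 0.01 * t + rng.uniform(0, 2 * np.pi, (20_000, 1)))

t0 = 250                                          # fixed reference instant

def R(tau):
    """Ensemble estimate of R_X(tau) = E{X(t0) X(t0 + tau)}."""
    return np.mean(X[:, t0] * X[:, t0 + tau])

print(R(30), R(-30))                              # property 1: symmetric in tau
print(all(R(k) <= R(0) + 0.01 for k in range(-100, 101)))  # property 2
print(R(0), np.mean(X[:, t0] ** 2))               # property 4: power ~0.5
```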
1.5.3. TIME AVERAGING & ERGODICITY
  When a random process belongs to a special class, known as an
  ergodic process, its time averages equal its ensemble averages.
  The statistical properties of such processes can be determined by time
  averaging over a single sample function of the process.
For a RP to be ergodic, it must be stationary in the strict sense (the converse is not necessary); however, for communication systems we are mainly interested in WSS processes.
A random process is ergodic in the mean if

    m_X = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(t) dt                   (1.35)
It is ergodic in the autocorrelation function if

    R_X(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} X(t) X(t + τ) dt       (1.36)
Testing for the ergodicity of a RP is usually very difficult. A reasonable assumption in the analysis of most communication systems is that the random waveforms are ergodic in the mean and in the autocorrelation function.
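A brief sketch of (1.35) and (1.36): for the assumed random-phase sinusoid, which is ergodic, time averages over a single long realization reproduce the ensemble results (mean ~0, R_X(τ) = 0.5·cos(2π f τ)).

```python
# Minimal sketch: time-average mean (1.35) and autocorrelation (1.36)
# from ONE realization of an assumed ergodic random-phase sinusoid.
import numpy as np

rng = np.random.default_rng(7)
t = np.arange(100_000)
x = np.cos(2 * np.pi * 0.01 * t + rng.uniform(0, 2 * np.pi))  # one sample function

m_time = x.mean()                                 # (1.35): time-average mean, ~0
tau = 10
R_time = np.mean(x[:-tau] * x[tau:])              # (1.36): time-average R_X(tau)
print(m_time)
print(R_time, 0.5 * np.cos(2 * np.pi * 0.01 * tau))   # matches ensemble theory
```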
ELECTRICAL PARAMETERS & ERGODIC PROCESS
For an ergodic process, time averages of a single sample function correspond to familiar electrical parameters: the mean m_X is the dc level of the signal, m_X² is the normalized power in the dc component, E{X²} = R_X(0) is the total average normalized power, σ_X² is the average normalized power in the ac (time-varying) component, and σ_X is the rms value of the ac component.
    EXAMPLES OF ERGODIC PROCESSES IN EE
Noise in Communication Systems: Noise in communication systems, such as thermal noise, shot noise, and impulse noise, can be modeled as an ergodic process.
   Random Telegraph Signals: Random Telegraph Signals (RTS) are used to model the
    behavior of digital signals in noisy channels.
   Random Processes in Power Systems: Power systems can be modeled using ergodic
    processes to analyze the behavior of random fluctuations in voltage and current.
   Ergodic Modeling of Radar Signals: Radar signals can be modeled as ergodic processes to
    analyze the effects of noise and interference on target detection and tracking.
   Random Processes in Image and Video Compression: Image and video compression
    algorithms often use ergodic processes to model the behavior of random noise and artifacts
    in the compressed data.
   Ergodic Modeling of Audio Signals: Audio signals can be modeled as ergodic processes to
    analyze the effects of noise and distortion on audio quality.
   Random Processes in Control Systems: Control systems can be modeled using ergodic
    processes to analyze the behavior of random disturbances and uncertainties in the system.
   Ergodic Modeling of Electromagnetic Interference (EMI): EMI can be modeled as an
    ergodic process to analyze the effects of electromagnetic noise on electronic systems.
   Random Processes in Biomedical Signal Processing: Biomedical signals, such as ECG and
    EEG, can be modeled as ergodic processes to analyze the behavior of random noise and
    artifacts in the signals.
   Ergodic Modeling of Optical Communication Systems: Optical communication systems
    can be modeled using ergodic processes to analyze the effects of random noise and
    interference on signal transmission and reception.