Mathematical Expectation
The expected value of a discrete random variable is a weighted average of all possible values of the random variable,
where the weights are the probabilities associated with the corresponding values.
   For Discrete Random Variable

                                                  E(X) = Σ_x x f(x)

   where x represents each possible value of the random variable, and f(x) is the probability of that value occurring.

   For Continuous Random Variable

                                                  E(X) = ∫_{−∞}^{∞} x f(x) dx

   where f(x) is the probability density function (p.d.f.) of the random variable.
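Both formulas can be checked numerically. The sketch below uses a fair six-sided die for the discrete case and the exponential p.d.f. f(x) = 2e^(−2x) for the continuous case; both distributions are illustrative choices, not taken from the text.

```python
from fractions import Fraction
import math

# Discrete case: a fair six-sided die, f(x) = 1/6 for x = 1, ..., 6.
f = Fraction(1, 6)
E_discrete = sum(x * f for x in range(1, 7))   # sum of x * f(x)
print(E_discrete)  # 7/2, i.e. 3.5

# Continuous case: exponential p.d.f. f(x) = 2e^(-2x) on [0, inf),
# approximated by a left Riemann sum on [0, 20]; the exact value is 1/2.
dx = 1e-4
E_continuous = sum(i * dx * 2 * math.exp(-2 * i * dx) * dx
                   for i in range(200_000))
print(round(E_continuous, 3))  # 0.5
```

Using exact fractions in the discrete case avoids floating-point drift in the weighted average; the Riemann sum only approximates the integral, so its answer agrees with 1/2 to the step size.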
Expected Value of a Function of a Random Variable
Consider a random variable X with p.d.f. f(x) and distribution function F(x). If g(·) is a function such that g(X) is a random variable and E(g(X)) exists, then

For a continuous random variable:

                                           E[g(X)] = ∫_{−∞}^{∞} g(x) f(x) dx

For a discrete random variable:

                                           E[g(X)] = Σ_x g(x) f(x)
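For instance, with g(x) = x² and X again a fair six-sided die (an illustrative pmf), the discrete formula gives E[X²] directly:

```python
from fractions import Fraction

# E[g(X)] with g(x) = x**2 and X a fair six-sided die (illustrative pmf).
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
E_g = sum(x**2 * p for x, p in pmf.items())   # sum of g(x) f(x)
E_X = sum(x * p for x, p in pmf.items())
print(E_g)       # 91/6
print(E_X**2)    # 49/4 -- note E[X]**2 != E[X**2] in general
```

The comparison on the last line is a useful caution: applying g after taking the expectation is not the same as taking the expectation of g(X).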
Properties of Expectation
Property 1: Addition Theorem of Expectation
The expected value of the sum of random variables is the sum of their individual expected values.
   If X and Y are random variables, then
                                                     E(X + Y ) = E(X) + E(Y )
provided all the expectations exist.
Generalization:
    The mathematical expectation of the sum of n random variables is equal to the sum of their expectations, provided all the expectations exist. Symbolically, if X1, X2, ..., Xn are random variables, then

                                    E(X1 + X2 + ... + Xn) = E(X1) + E(X2) + ... + E(Xn)

That is, E[∑_{i=1}^{n} X_i] = ∑_{i=1}^{n} E(X_i).
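Notably, the addition theorem requires no independence. The sketch below checks it on a deliberately dependent pair (X, Y) with an illustrative joint pmf:

```python
# A joint pmf under which X and Y are dependent (values are illustrative);
# the addition theorem needs no independence assumption.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 1): 0.5}

E_sum = sum((x + y) * p for (x, y), p in joint.items())
E_X = sum(x * p for (x, y), p in joint.items())
E_Y = sum(y * p for (x, y), p in joint.items())
print(abs(E_sum - (E_X + E_Y)) < 1e-12)  # True: E(X+Y) = E(X) + E(Y)
```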
Property 2: Multiplication Theorem of Expectation
The expected value of the product of independent random variables is the product of their individual expected values.
   If X and Y are independent random variables, then:

                                                E(XY) = E(X)E(Y)
   Generalization: The mathematical expectation of the product of a number of independent random variables is equal to the product of their expectations. Symbolically, if X1, X2, ..., Xn are n independent random variables, then

                                    E(X1 X2 ... Xn) = E(X1)E(X2) ... E(Xn)

That is, E[∏_{i=1}^{n} X_i] = ∏_{i=1}^{n} E(X_i).
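Here independence is essential. The sketch below verifies the theorem for two independent fair dice, then shows it failing for the fully dependent choice Y = X (both setups are illustrative):

```python
from itertools import product
from fractions import Fraction

# Two independent fair dice; the joint pmf factors as (1/6)(1/6).
faces = range(1, 7)
E_XY = sum(x * y * Fraction(1, 36) for x, y in product(faces, faces))
E_X = sum(x * Fraction(1, 6) for x in faces)
print(E_XY, E_X * E_X)   # 49/4 49/4

# Without independence the theorem can fail: take Y = X.
E_XX = sum(x * x * Fraction(1, 6) for x in faces)
print(E_XX)              # 91/6, not 49/4
```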
Property 3: Constant Multiplication
If X is a random variable and a is a constant, then

                                                E(a ψ(X)) = a E(ψ(X))

                                             E(ψ(X) + a) = E(ψ(X)) + a
   Corollary

    (i) If ψ(X) = X, then

                                                   E(aX) = aE(X)

       and

                                                  E(X + a) = E(X) + a

    (ii) If ψ(X) = 1, then

                                                     E(a) = a
Property 4: Linearity Property
If X is a random variable and a and b are constants, then

                                               E(aX + b) = aE(X) + b

provided all the expectations exist.

   Corollary:
   • If b=0, then E(aX) = aE(X)
   • If a = 1 and b = −X̄ = −E(X), then

                                          E(X − X̄) = E(X) − E(X̄) = X̄ − X̄ = 0
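Both the linearity property and the centering corollary can be checked numerically; the fair-die pmf and the constants a, b below are illustrative choices:

```python
# Check E(aX + b) = aE(X) + b and the centering corollary numerically.
pmf = {x: 1 / 6 for x in range(1, 7)}
a, b = 3.0, -2.0

E_X = sum(x * p for x, p in pmf.items())
E_lin = sum((a * x + b) * p for x, p in pmf.items())
print(abs(E_lin - (a * E_X + b)) < 1e-12)   # True

# Centering: E(X - E(X)) = 0, since E(X) is a constant.
E_centered = sum((x - E_X) * p for x, p in pmf.items())
print(abs(E_centered) < 1e-12)              # True
```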
Property 5: Expectation of a Linear Combination of Random Variables
If X1, X2, ..., Xn are n random variables and if a1, a2, ..., an are n constants, then

                                     E[∑_{i=1}^{n} a_i X_i] = ∑_{i=1}^{n} a_i E(X_i)

provided all the expectations exist.
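A sketch of this property with three Bernoulli variables and arbitrary coefficients; independence is used here only to build a joint pmf easily, since linearity itself requires none (all numeric values are illustrative):

```python
from itertools import product

# Three independent Bernoulli variables with illustrative success
# probabilities, combined with illustrative coefficients a_i.
probs = [0.2, 0.5, 0.9]
a = [2.0, -1.0, 4.0]

# Build the joint pmf over all 2**3 outcomes.
joint = {}
for xs in product([0, 1], repeat=3):
    prob = 1.0
    for x, p in zip(xs, probs):
        prob *= p if x == 1 else 1 - p
    joint[xs] = prob

E_comb = sum(sum(ai * xi for ai, xi in zip(a, xs)) * pr
             for xs, pr in joint.items())
rhs = sum(ai * pi for ai, pi in zip(a, probs))   # E(X_i) = p_i for Bernoulli
print(abs(E_comb - rhs) < 1e-12)   # True
```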
 Property 6
If X ≥ 0 then E(X) ≥ 0.
 Property 7
If X and Y are two random variables such that Y ≤ X, then E(Y ) ≤ E(X) provided all the expectations exist.
 Property 8
|E(X)| ≤ E(|X|) provided all the expectations exist.
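Property 8 is the triangle inequality for expectation: positive and negative values can cancel inside E(X) but not inside E(|X|). A two-point distribution (values chosen for illustration) makes the gap visible:

```python
# |E(X)| <= E(|X|): cancellation shrinks E(X) but not E(|X|).
pmf = {-2: 0.7, 3: 0.3}
E_X = sum(x * p for x, p in pmf.items())          # -1.4 + 0.9 = -0.5
E_absX = sum(abs(x) * p for x, p in pmf.items())  # 1.4 + 0.9 = 2.3
print(abs(E_X) <= E_absX)   # True: 0.5 <= 2.3
```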
 Property 9
If µ′r exists, then µ′s exists for all 1 ≤ s ≤ r.
 Property 10
If X and Y are two random variables, then

                                      E(h(X) + k(Y)) = E(h(X)) + E(k(Y))

provided all the expectations exist.