
CH 9

This document discusses several probability and statistics concepts including: why the probability mass function of a joint distribution equals the marginal distribution of one of the random variables; why we do not sum over all values of a random variable; the difference between independent random processes and having different independent and identically distributed random processes; and finding the covariance matrix and mean vector as the solution for a particular problem.

Uploaded by kadry mohamed

Why does the PMF of the joint distribution equal the marginal distribution of N?

Why do we not sum over all values of n?
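The two questions above are about marginalization: a marginal PMF is obtained from the joint PMF precisely by summing over all values of the other variable. As a minimal sketch (the joint PMF below is hypothetical, not the one from the chapter's problem):

```python
import numpy as np

# Hypothetical joint PMF p(m, n) on a small grid (rows index m, columns index n).
joint = np.array([[0.10, 0.20],
                  [0.30, 0.40]])

# Marginal PMF of M: sum the joint PMF over all values of n (the columns).
marginal_m = joint.sum(axis=1)

# Sanity checks: a valid joint PMF and its marginal both sum to 1.
assert np.isclose(joint.sum(), 1.0)
assert np.isclose(marginal_m.sum(), 1.0)
print(marginal_m)
```

If a derivation instead fixes a particular value of n (conditioning) rather than summing over all n, it yields a conditional, not a marginal, distribution.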

In part (d), the transformation is the identity matrix.


In Problem 1 the random variables are not independent, so why do we apply the law of large numbers?
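As a reminder of what the law of large numbers states in its basic i.i.d. form (this sketch does not address the dependent case asked about above; the Bernoulli(0.5) choice is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# i.i.d. Bernoulli(0.5) samples, so the true mean is 0.5.
x = rng.binomial(1, 0.5, size=100_000)

# The (weak) law of large numbers says the sample mean converges
# in probability to the true mean as the sample size grows.
sample_mean = x.mean()
print(sample_mean)
```

Versions of the law of large numbers do exist for dependent sequences (e.g. under suitable conditions on correlations), which may be what the problem relies on.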

Why is g(t + T) = 0 for t + T outside (0, 1)?

What is the difference between an independent random process and having different i.i.d. random processes?
Why does the process in part (b) not have independent increments?

Why is g(u) = 1 or 0? It must be continuous.


Is the solution for part (c) that I should find the covariance matrix and mean vector?
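For reference, computing a mean vector and covariance matrix from sample data looks like the following minimal sketch (the data here is hypothetical, standing in for whatever random vector part (c) concerns):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 1000 observations of a 3-dimensional random vector.
X = rng.standard_normal((1000, 3))

# Mean vector: average over observations, one entry per component.
mean_vec = X.mean(axis=0)

# Covariance matrix: rowvar=False tells np.cov that each row is an
# observation and each column is a variable.
cov_mat = np.cov(X, rowvar=False)

print(mean_vec.shape)  # (3,)
print(cov_mat.shape)   # (3, 3)
```

For a Gaussian random vector, the mean vector and covariance matrix together fully specify the distribution, which is why finding them typically constitutes the complete solution.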
