Review: More Probability Theory

  • Continuous Random Variables

  • Uniform Distribution

  • Normal Distribution

  • Moment Generating Functions

  • Characteristic Functions

  • Poisson Distribution

  • Binomial Distribution

  • Uniform Distribution

  • Normal Distribution

  • MGF Properties

  • Function of Random Variables

  • Obtaining the PDFs

Continuous Random Variables

A random variable \(X\) is continuous if \(P(X=x)=0\) for every value \(x\); probability is assigned to intervals rather than to individual points.

CDF

The cumulative distribution function of \(X\) gives \(P(X\leq x)\), denoted \(F(x)\), for every real number \(x\).

Properties of the CDF of \(X\):

  1. \(F(-\infty)\equiv \lim_{y\rightarrow -\infty}F(y)=0\)
  2. \(F(\infty)\equiv \lim_{y\rightarrow \infty}F(y)=1\)
  3. \(F(x)\) is a nondecreasing function

PDF

The probability density function of the random variable \(X\) is given by

\[ f(x)=\frac{dF(x)}{dx}=F^\prime(x) \]

wherever the derivative exists.
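
Conversely, the cdf can be recovered from the pdf by integration:

\[ F(x)=\int^x_{-\infty}f(t)dt \]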

Properties of pdfs:

  1. \(f(x)\geq 0\)
  2. \(\int^\infty_{-\infty}f(x)dx=1\)
  3. \(P(a\leq X\leq b) = P(a<X<b)=\int^b_af(x)dx\)

Expected Value

The expected value for a continuous distribution is defined as

\[ E(X)=\int^\infty_{-\infty} x f(x)dx \]

The expectation of a function \(g(X)\) is defined as

\[ E\{g(X)\}=\int^\infty_{-\infty} g(x)f(x)dx \]

Expected Value Properties

  1. \(E(c)=c\), where \(c\) is constant
  2. \(E\{cg(X)\}=cE\{g(X)\}\)
  3. \(E\{g_1(X)+g_2(X)+\cdots+g_n(X)\}=E\{g_1(X)\}+E\{g_2(X)\}+\cdots+E\{g_n(X)\}\)

Variance

The variance of a continuous random variable is defined as

\[ Var(X) = E[\{X-E(X)\}^2] = \int^\infty_{-\infty} \{x-E(X)\}^2 f(x)dx \]
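
Expanding the square and applying the expectation properties above gives the usual computational shortcut:

\[ Var(X) = E(X^2)-\{E(X)\}^2 \]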

Uniform Distribution


A random variable is said to follow a uniform distribution if its density function is constant between two parameters \(a\) and \(b\).

\[ f(x) = \left\{\begin{array}{cc} \frac{1}{b-a} & a \leq x \leq b\\ 0 & \mathrm{elsewhere} \end{array}\right. \]

Expected Value
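
Integrating \(x f(x)\) over \([a,b]\) gives the midpoint of the interval:

\[ E(X)=\int^b_a \frac{x}{b-a}dx = \frac{b^2-a^2}{2(b-a)} = \frac{a+b}{2} \]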

Normal Distribution


A random variable is said to follow a normal distribution with mean \(\mu\) and variance \(\sigma^2\) if its density is the Gaussian function

\[ f(x)=\frac{1}{\sqrt{2\pi \sigma^2}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\} \]

Expected Value
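
Substituting \(z=(x-\mu)/\sigma\) splits the integral into a part that vanishes by symmetry and a constant part:

\[ E(X)=\int^\infty_{-\infty} (\mu+\sigma z)\frac{1}{\sqrt{2\pi}}e^{-z^2/2}dz = \mu + \sigma\int^\infty_{-\infty} z\frac{1}{\sqrt{2\pi}}e^{-z^2/2}dz = \mu \]

A similar computation gives \(Var(X)=\sigma^2\).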

Moment Generating Functions


Moments

The \(k\)th moment of a random variable \(X\) is the expectation of \(X\) raised to the \(k\)th power, \(E(X^k)\).

Moment Generating Functions

The moment generating function (mgf) is used to obtain the \(k\)th moment. The mgf is defined as

\[ m(t) = E(e^{tX}) \]

The \(k\)th moment can be obtained by taking the \(k\)th derivative of the mgf, with respect to \(t\), and setting \(t\) equal to 0:

\[ E(X^k)=\frac{d^km(t)}{dt^k}\Bigg|_{t=0} \]
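
To see why this works, expand \(e^{tX}\) as a power series inside the expectation:

\[ m(t) = E\left(\sum^\infty_{k=0}\frac{t^kX^k}{k!}\right) = \sum^\infty_{k=0}\frac{t^k}{k!}E(X^k) \]

Differentiating \(k\) times removes the lower-order terms, and setting \(t=0\) removes the higher-order terms, leaving \(E(X^k)\).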

Characteristic Functions


The characteristic function of \(X\) is defined as

\[ \phi(t) = E\left(e^{itX}\right) = E\left\{\cos(tX)\right\} + iE\left\{\sin(tX)\right\} \]

Unlike the mgf, the characteristic function exists for every random variable, since \(\left|e^{itX}\right|=1\).

Poisson Distribution


MGF
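
For \(X\sim \mathrm{Poisson}(\lambda)\), with \(p(x)=\lambda^x e^{-\lambda}/x!\) for \(x=0,1,2,\ldots\), the mgf follows from the series for the exponential function:

\[ m(t)=E(e^{tX})=\sum^\infty_{x=0}e^{tx}\frac{\lambda^x e^{-\lambda}}{x!}=e^{-\lambda}\sum^\infty_{x=0}\frac{(\lambda e^t)^x}{x!}=\exp\{\lambda(e^t-1)\} \]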

Expected Value
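
Differentiating the mgf once and setting \(t=0\):

\[ E(X)=m^\prime(0)=\lambda e^t\exp\{\lambda(e^t-1)\}\Big|_{t=0}=\lambda \]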

Variance
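
Differentiating twice gives \(E(X^2)=m^{\prime\prime}(0)=\lambda+\lambda^2\), so

\[ Var(X)=E(X^2)-\{E(X)\}^2=\lambda+\lambda^2-\lambda^2=\lambda \]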


Binomial Distribution


MGF
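
For \(X\sim \mathrm{Binomial}(n,p)\), with \(p(x)=\binom{n}{x}p^x(1-p)^{n-x}\), the binomial theorem collapses the sum:

\[ m(t)=\sum^n_{x=0}e^{tx}\binom{n}{x}p^x(1-p)^{n-x}=\sum^n_{x=0}\binom{n}{x}(pe^t)^x(1-p)^{n-x}=(pe^t+1-p)^n \]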

Uniform Distribution


MGF
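
Integrating \(e^{tx}\) against the constant density:

\[ m(t)=\int^b_a \frac{e^{tx}}{b-a}dx=\frac{e^{tb}-e^{ta}}{t(b-a)}, \quad t\neq 0 \]

with \(m(0)=1\).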

Normal Distribution


MGF
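
Completing the square in the exponent of the Gaussian integrand gives

\[ m(t)=E(e^{tX})=\exp\left\{\mu t+\frac{\sigma^2t^2}{2}\right\} \]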

MGF Properties


Linearity

Let \(X\) follow a distribution \(f\) with MGF \(M_X(t)\). The MGF of \(Y=aX+b\) is given as

\[ M_Y(t) = e^{tb}M_X(at) \]

Derivation
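
\[ M_Y(t)=E\left\{e^{t(aX+b)}\right\}=e^{tb}E\left(e^{(at)X}\right)=e^{tb}M_X(at) \]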

Independence

Let \(X\) and \(Y\) be independent random variables with MGFs \(M_X(t)\) and \(M_Y(t)\), respectively. The MGF of \(U=X-Y\) is

\[ M_U(t) = M_X(t)M_Y(-t) \]

Derivation
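
By independence, the expectation factors:

\[ M_U(t)=E\left\{e^{t(X-Y)}\right\}=E\left(e^{tX}\right)E\left(e^{-tY}\right)=M_X(t)M_Y(-t) \]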

Uniqueness

Let \(X\) and \(Y\) have distribution functions \(F_X(x)\) and \(F_Y(y)\) and MGFs \(M_X(t)\) and \(M_Y(t)\), respectively, with both MGFs existing in a neighborhood of \(0\). Then \(X\) and \(Y\) have the same distribution, \(F_X(u)=F_Y(u)\) for all \(u\), if and only if \(M_X(t)=M_Y(t)\).

Example: Applying Uniqueness

Let \(X_1,\cdots, X_n\) be independent random variables, where \(X_i\sim N(\mu_i, \sigma^2_i)\), with \(M_{X_i}(t)=\exp\{\mu_i t+\sigma^2_it^2/2\}\) for \(i=1,\cdots, n\). Find the MGF of \(Y=a_1X_1+\cdots+a_nX_n\), where \(a_1, \cdots, a_n\) are constants.
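
A sketch of the solution: by independence, the mgf of the sum factors, and each factor is \(M_{X_i}(a_it)\):

\[ M_Y(t)=\prod^n_{i=1}M_{X_i}(a_it)=\exp\left\{\left(\sum^n_{i=1}a_i\mu_i\right)t+\left(\sum^n_{i=1}a_i^2\sigma_i^2\right)\frac{t^2}{2}\right\} \]

By uniqueness, \(Y\sim N\left(\sum a_i\mu_i, \sum a_i^2\sigma_i^2\right)\).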

Function of Random Variables

Given a random variable \(X\) with a known distribution and a function \(g\), a new random variable \(Y=g(X)\) can be defined. The next section covers three ways to obtain the distribution of \(Y\).


Obtaining the PDFs


Using the Distribution Function

Given a random variable \(X\) with a known distribution function \(F_X(x)\), the density function for the random variable \(Y=g(X)\) can be found with the following steps:

  1. Find the region of \(Y\) in the space of \(X\); that is, find \(g^{-1}(y)\)
  2. Find the region where \(Y\le y\)
  3. Find \(F_Y(y)=P(Y\le y)\) by integrating the probability density function of \(X\) over the region \(Y\le y\)
  4. Find \(f_Y(y)\) by differentiating \(F_Y(y)\)

Example 1

Let \(X\) have the following probability density function:

\[ f_X(x)=\left\{\begin{array}{cc} 2x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]

Find the probability density function of \(Y=3X-1\).
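
A sketch of the solution using the four steps: first, \(F_X(x)=x^2\) for \(0\le x\le 1\). Then

\[ F_Y(y)=P(3X-1\le y)=P\left(X\le \frac{y+1}{3}\right)=\left(\frac{y+1}{3}\right)^2, \quad -1\le y\le 2 \]

Differentiating, \(f_Y(y)=2(y+1)/9\) for \(-1\le y\le 2\), and \(0\) otherwise.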

Using the PDF

Given a random variable \(X\) with known density \(f_X(x)\), if the function \(g\) is either strictly increasing or strictly decreasing, then the probability density function of \(Y=g(X)\) can be found as

\[ f_Y(y) = f_X\{g^{-1}(y)\}\left|\frac{dg^{-1}(y)}{dy}\right| \]

Example 2

Let \(X\) have the following probability density function:

\[ f_X(x)=\left\{\begin{array}{cc} \frac{3}{2}x^2 + x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]

Find the probability density function of \(Y=5-(X/2)\).
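
A sketch of the solution: \(g(x)=5-(x/2)\) is strictly decreasing, with \(g^{-1}(y)=10-2y\) and \(\left|dg^{-1}(y)/dy\right|=2\). Hence

\[ f_Y(y)=\left\{\frac{3}{2}(10-2y)^2+(10-2y)\right\}\cdot 2 = 3(10-2y)^2+2(10-2y), \quad \frac{9}{2}\le y\le 5 \]

and \(0\) otherwise.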

Using the MGF

Using the uniqueness property of moment generating functions, for a random variable \(X\) with a known distribution function \(F_X(x)\) and a random variable \(Y=g(X)\), the distribution of \(Y\) can be found by:

  1. Find the moment generating function of \(Y\), \(M_Y(t)\).
  2. Compare \(M_Y(t)\) with known moment generating functions. If \(M_Y(t)=M_V(t)\) for all values of \(t\), then \(Y\) and \(V\) have identical distributions.

Example 3

Let \(X\) follow a normal distribution with mean \(\mu\) and variance \(\sigma^2\). Find the distribution of \(Z=\frac{X-\mu}{\sigma}\).
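
A sketch of the solution using the linearity property with \(a=1/\sigma\) and \(b=-\mu/\sigma\):

\[ M_Z(t)=e^{-\mu t/\sigma}M_X(t/\sigma)=e^{-\mu t/\sigma}\exp\left\{\frac{\mu t}{\sigma}+\frac{\sigma^2 t^2}{2\sigma^2}\right\}=e^{t^2/2} \]

This is the mgf of a standard normal, so by uniqueness \(Z\sim N(0,1)\).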

Example 4

Let \(Z\) follow a standard normal distribution with mean \(0\) and variance \(1\). Find the distribution of \(Y=Z^2\).
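
A sketch of the solution: combining the exponents inside the integral,

\[ M_Y(t)=E\left(e^{tZ^2}\right)=\int^\infty_{-\infty}\frac{1}{\sqrt{2\pi}}e^{-(1-2t)z^2/2}dz=(1-2t)^{-1/2}, \quad t<\frac{1}{2} \]

This is the mgf of a \(\chi^2\) distribution with one degree of freedom, so by uniqueness \(Y\sim\chi^2_1\).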