More Probability Theory
Continuous Random Variables
Uniform Distribution
Normal Distribution
Moment Generating Functions
Characteristic Functions
Poisson Distribution
Binomial Distribution
Uniform Distribution
Normal Distribution
MGF Properties
Function of Random Variables
Obtaining the PDFs
A random variable \(X\) is considered continuous if \(P(X=x)=0\) for every value \(x\), so that probability is assigned to intervals rather than to individual points.
The cumulative distribution function (CDF) of \(X\), denoted by \(F(x)\), gives \(P(X\leq x)\) for every \(x\) in the domain of \(X\).
Properties of the CDF of \(X\): \(F(x)\) is non-decreasing and right-continuous, with \(\lim_{x\to-\infty}F(x)=0\) and \(\lim_{x\to\infty}F(x)=1\).
The probability density function of the random variable \(X\) is given by
\[ f(x)=\frac{dF(x)}{dx}=F^\prime(x) \]
wherever the derivative exists.
Properties of pdfs: \(f(x)\geq 0\) for all \(x\), and \(\int_{-\infty}^{\infty} f(x)\,dx=1\).
The expected value for a continuous distribution is defined as
\[ E(X)=\int x f(x)dx \]
The expectation of a function \(g(X)\) is defined as
\[ E\{g(X)\}=\int g(x)f(x)dx \]
The variance of a continuous random variable is defined as
\[ Var(X) = E[\{X-E(X)\}^2] = \int \{x-E(X)\}^2 f(x)dx \]
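Expanding the square and using linearity of the expectation gives a convenient computational form (a short derivation added for reference):
\[ Var(X)=E\left[X^2-2XE(X)+\{E(X)\}^2\right]=E(X^2)-\{E(X)\}^2 \]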
Uniform Distribution
A random variable is said to follow a uniform distribution if its density function is constant between two parameters \(a\) and \(b\):
\[ f(x) = \left\{\begin{array}{cc} \frac{1}{b-a} & a \leq x \leq b\\ 0 & \mathrm{elsewhere} \end{array}\right. \]
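As a quick check of the expectation formulas above, the mean and variance of the uniform distribution can be worked out directly (a sketch, not part of the original slide):
\[ E(X)=\int_a^b\frac{x}{b-a}\,dx=\frac{a+b}{2},\qquad Var(X)=\int_a^b\frac{x^2}{b-a}\,dx-\left(\frac{a+b}{2}\right)^2=\frac{(b-a)^2}{12} \]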
Normal Distribution
A random variable is said to follow a normal distribution, with mean \(\mu\) and variance \(\sigma^2\), if its probability density function is the Gaussian function
\[ f(x)=\frac{1}{\sqrt{2\pi \sigma^2}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\} \]
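Setting \(\mu=0\) and \(\sigma^2=1\) gives the standard normal density, which reappears in the standardization example at the end of this section:
\[ f(z)=\frac{1}{\sqrt{2\pi}}\exp\left(-\frac{z^2}{2}\right) \]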
Moment Generating Functions
The \(k\)th moment of a random variable is defined as the expectation of the random variable raised to the \(k\)th power, \(E(X^k)\).
The moment generating function (MGF) is used to obtain the \(k\)th moment. The MGF is defined as
\[ m(t) = E(e^{tX}) \]
The \(k\)th moment can be obtained by taking the \(k\)th derivative of the mgf, with respect to \(t\), and setting \(t\) equal to 0:
\[ E(X^k)=\frac{d^km(t)}{dt^k}\Bigg|_{t=0} \]
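For example, the first two moments give the mean and variance (a short illustration added here):
\[ E(X)=m^\prime(0),\qquad Var(X)=E(X^2)-\{E(X)\}^2=m^{\prime\prime}(0)-\{m^\prime(0)\}^2 \]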
Characteristic Functions
The characteristic function of \(X\) is defined as
\[ \phi(t) = E\left(e^{itX}\right) = E\left\{\cos(tX)\right\} + iE\left\{\sin(tX)\right\} \]
Unlike the MGF, the characteristic function exists for all real \(t\) for every random variable.
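As an illustration, the characteristic function of a \(N(\mu,\sigma^2)\) random variable has the same form as its MGF (given later) with \(t\) replaced by \(it\):
\[ \phi(t)=\exp\left\{i\mu t-\frac{\sigma^2t^2}{2}\right\} \]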
MGF Properties
Let \(X\) follow a distribution \(f\) with MGF \(M_X(t)\). The MGF of \(Y=aX+b\) is given by
\[ M_Y(t) = e^{tb}M_X(at) \]
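This follows in one line from the definition of the MGF:
\[ M_Y(t)=E\left\{e^{t(aX+b)}\right\}=e^{tb}E\left\{e^{(at)X}\right\}=e^{tb}M_X(at) \]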
Let \(X\) and \(Y\) be two independent random variables with MGFs \(M_X(t)\) and \(M_Y(t)\), respectively. The MGF of \(U=X-Y\) is
\[ M_U(t) = M_X(t)M_Y(-t) \]
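Again the result is immediate from the definition, with independence used to factor the expectation:
\[ M_U(t)=E\left\{e^{t(X-Y)}\right\}=E\left(e^{tX}\right)E\left(e^{-tY}\right)=M_X(t)M_Y(-t) \]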
Let \(X\) and \(Y\) have distribution functions \(F_X(x)\) and \(F_Y(y)\) and MGFs \(M_X(t)\) and \(M_Y(t)\), respectively. Then \(X\) and \(Y\) have the same distribution, \(F_X(u)=F_Y(u)\) for all \(u\), if and only if \(M_X(t)=M_Y(t)\) for all \(t\).
Let \(X_1,\cdots, X_n\) be independent random variables, where \(X_i\sim N(\mu_i, \sigma^2_i)\), with \(M_{X_i}(t)=\exp\{\mu_i t+\sigma^2_it^2/2\}\) for \(i=1,\cdots, n\). Find the MGF of \(Y=a_1X_1+\cdots+a_nX_n\), where \(a_1, \cdots, a_n\) are constants.
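A worked sketch of this example, using independence and the linear-transformation property above:
\[ M_Y(t)=\prod_{i=1}^{n}M_{X_i}(a_it)=\prod_{i=1}^{n}\exp\left\{\mu_ia_it+\frac{\sigma^2_ia_i^2t^2}{2}\right\}=\exp\left\{t\sum_{i=1}^{n}a_i\mu_i+\frac{t^2}{2}\sum_{i=1}^{n}a_i^2\sigma^2_i\right\} \]
By the uniqueness property, \(Y\sim N\left(\sum_{i=1}^{n}a_i\mu_i,\ \sum_{i=1}^{n}a_i^2\sigma^2_i\right)\).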
Function of Random Variables
Let \(X\) be a random variable with a known distribution function \(F_X(x)\). The density function of the random variable \(Y=g(X)\) can be found with the following steps:
Write the CDF of \(Y\) as \(F_Y(y)=P\{g(X)\le y\}\).
Rewrite the event \(\{g(X)\le y\}\) in terms of \(X\) so that \(F_Y(y)\) is expressed through \(F_X\).
Differentiate to obtain \(f_Y(y)=F_Y^\prime(y)\).
Let \(X\) have the following probability density function:
\[ f_X(x)=\left\{\begin{array}{cc} 2x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]
Find the probability density function of \(Y=3X-1\).
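One way to work this example with the CDF method (a sketch of the solution): since \(F_X(x)=x^2\) on \(0\le x\le 1\),
\[ F_Y(y)=P(3X-1\le y)=P\left(X\le\frac{y+1}{3}\right)=\left(\frac{y+1}{3}\right)^2,\qquad -1\le y\le 2 \]
so \(f_Y(y)=\frac{2(y+1)}{9}\) for \(-1\le y\le 2\) and \(0\) otherwise.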
Let \(X\) be a random variable with a known distribution function \(F_X(x)\). If the function \(g\) is either strictly increasing or strictly decreasing, then the probability density function of \(Y=g(X)\) can be found as
\[ f_Y(y) = f_X\{g^{-1}(y)\}\left|\frac{dg^{-1}(y)}{dy}\right| \]
Let \(X\) have the following probability density function:
\[ f_X(x)=\left\{\begin{array}{cc} \frac{3}{2}x^2 + x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]
Find the probability density function of \(Y=5-(X/2)\).
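One way to work this example with the transformation formula (a sketch of the solution): \(g(x)=5-x/2\) is strictly decreasing, with \(g^{-1}(y)=10-2y\) and \(\left|dg^{-1}(y)/dy\right|=2\), so
\[ f_Y(y)=2\left\{\frac{3}{2}(10-2y)^2+(10-2y)\right\},\qquad \frac{9}{2}\le y\le 5 \]
and \(0\) otherwise.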
Using the uniqueness property of moment generating functions, for a random variable \(X\) with a known distribution function \(F_X(x)\) and random variable \(Y=g(X)\), the distribution of \(Y\) can be found by computing \(M_Y(t)=E\left\{e^{tg(X)}\right\}\), matching it to the MGF of a known distribution, and concluding by uniqueness that \(Y\) follows that distribution.
Let \(X\) follow a normal distribution with mean \(\mu\) and variance \(\sigma^2\). Find the distribution of \(Z=\frac{X-\mu}{\sigma}\).
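A sketch of the solution using the MGF properties above: writing \(Z=\frac{1}{\sigma}X-\frac{\mu}{\sigma}\),
\[ M_Z(t)=e^{-\mu t/\sigma}M_X\left(\frac{t}{\sigma}\right)=e^{-\mu t/\sigma}\exp\left\{\frac{\mu t}{\sigma}+\frac{t^2}{2}\right\}=e^{t^2/2} \]
which is the MGF of \(N(0,1)\); by uniqueness, \(Z\) follows a standard normal distribution.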
Let \(Z\) follow a standard normal distribution with mean \(0\) and variance \(1\). Find the distribution of \(Y=Z^2\).
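A sketch of the solution: computing the MGF of \(Y=Z^2\) directly,
\[ M_Y(t)=E\left(e^{tZ^2}\right)=\int_{-\infty}^{\infty}\frac{1}{\sqrt{2\pi}}\exp\left\{-\frac{(1-2t)z^2}{2}\right\}dz=(1-2t)^{-1/2},\qquad t<\frac{1}{2} \]
which is the MGF of a chi-square distribution with one degree of freedom, so \(Y\sim\chi^2_1\).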