More Probability Theory
Moment Generating Functions
Characteristic Functions
Poisson Distribution
Binomial Distribution
Uniform Distribution
Normal Distribution
MGF Properties
Function of Random Variables
Obtaining the PDFs
Moment Generating Functions
The \(k\)th moment of a random variable \(X\) is defined as the expectation of \(X\) raised to the \(k\)th power, \(E(X^k)\).
The moment generating function (MGF) is used to obtain the \(k\)th moment. The MGF is defined as
\[ m(t) = E(e^{tX}) \]
The \(k\)th moment can be obtained by taking the \(k\)th derivative of the MGF with respect to \(t\) and evaluating it at \(t=0\):
\[ E(X^k)=\frac{d^k m(t)}{dt^k}\Bigg|_{t=0} \]
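As a quick sanity check of this formula, here is a minimal sketch in Python using sympy; it differentiates the normal MGF \(\exp\{\mu t+\sigma^2t^2/2\}\) (which appears again later in these notes) purely as an illustration, not as part of the original material.

```python
import sympy as sp

# Symbols: t is the MGF argument; mu and sigma are the normal mean and standard deviation.
t, mu = sp.symbols("t mu", real=True)
sigma = sp.symbols("sigma", positive=True)

# MGF of a N(mu, sigma^2) random variable: m(t) = exp(mu*t + sigma^2*t^2/2).
m = sp.exp(mu * t + sigma**2 * t**2 / 2)

# k-th moment = k-th derivative of the MGF, evaluated at t = 0.
first_moment = sp.diff(m, t, 1).subs(t, 0)
second_moment = sp.diff(m, t, 2).subs(t, 0)

print(sp.simplify(first_moment))   # mu                -> E(X)
print(sp.simplify(second_moment))  # mu**2 + sigma**2  -> E(X^2)
```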
Characteristic Functions
The characteristic function of \(X\) is defined as
\[ \phi(t) = E\left(e^{itX}\right) = E\left\{\cos(tX)\right\} + iE\left\{\sin(tX)\right\} \]
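Two standard companion facts, noted here for context: unlike the MGF, the characteristic function exists for every random variable, since \(|e^{itX}|=1\); and when the MGF exists, \(\phi(t)=m(it)\). Moments are recovered analogously,
\[ E(X^k) = \frac{1}{i^k}\,\frac{d^k\phi(t)}{dt^k}\Bigg|_{t=0} \]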
MGF Properties
Let \(X\) follow a distribution \(f\) with MGF \(M_X(t)\). The MGF of \(Y=aX+b\) is given by
\[ M_Y(t) = e^{tb}M_X(at) \]
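This follows in one line from the definition:
\[ M_Y(t) = E\left\{e^{t(aX+b)}\right\} = e^{tb}\,E\left\{e^{(at)X}\right\} = e^{tb}M_X(at) \]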
Let \(X\) and \(Y\) be two independent random variables with MGFs \(M_X(t)\) and \(M_Y(t)\), respectively. The MGF of \(U=X-Y\) is
\[ M_U(t) = M_X(t)M_Y(-t) \]
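Filling in the intermediate step, which uses independence for the second equality:
\[ M_U(t) = E\left\{e^{t(X-Y)}\right\} = E\left(e^{tX}\right)E\left(e^{-tY}\right) = M_X(t)M_Y(-t) \]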
Let \(X\) and \(Y\) have distribution functions \(F_X(x)\) and \(F_Y(y)\) and MGFs \(M_X(t)\) and \(M_Y(t)\), respectively. Then \(X\) and \(Y\) have the same distribution, \(F_X(u)=F_Y(u)\) for all \(u\), if and only if \(M_X(t)=M_Y(t)\) for all \(t\) in a neighborhood of \(0\).
Let \(X_1,\cdots, X_n\) be independent random variables, where \(X_i\sim N(\mu_i, \sigma^2_i)\), with \(M_{X_i}(t)=\exp\{\mu_i t+\sigma^2_it^2/2\}\) for \(i=1,\cdots, n\). Find the MGF of \(Y=a_1X_1+\cdots+a_nX_n\), where \(a_1, \cdots, a_n\) are constants.
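A sketch of the solution, combining the linear-transformation property, independence, and the uniqueness property above:
\[ M_Y(t) = \prod_{i=1}^{n} M_{X_i}(a_i t) = \exp\left\{ t\sum_{i=1}^{n} a_i\mu_i + \frac{t^2}{2}\sum_{i=1}^{n} a_i^2\sigma_i^2 \right\}, \]
which is the MGF of a normal distribution, so \(Y\sim N\left(\sum_{i=1}^{n} a_i\mu_i,\ \sum_{i=1}^{n} a_i^2\sigma_i^2\right)\).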
Function of Random Variables
Let there be a random variable \(X\) with a known distribution function \(F_X(x)\). The density function of the random variable \(Y=g(X)\) can be found with the following steps (the method of distribution functions):
1. Write the distribution function of \(Y\) in terms of \(X\): \(F_Y(y)=P(Y\le y)=P\{g(X)\le y\}\), and evaluate this probability using \(F_X\).
2. Differentiate to obtain the density: \(f_Y(y)=\dfrac{dF_Y(y)}{dy}\).
Let \(X\) have the following probability density function:
\[ f_X(x)=\left\{\begin{array}{cc} 2x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]
Find the probability density function of \(Y=3X-1\).
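A sketch of the solution using the distribution-function steps above: here \(F_X(x)=x^2\) for \(0\le x\le 1\), and \(Y=3X-1\) ranges over \([-1,2]\), so for \(-1\le y\le 2\)
\[ F_Y(y) = P(3X-1\le y) = P\left(X\le \frac{y+1}{3}\right) = \left(\frac{y+1}{3}\right)^{2}, \qquad f_Y(y)=\frac{dF_Y(y)}{dy}=\frac{2(y+1)}{9}, \]
and \(f_Y(y)=0\) otherwise.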
Let there be a random variable \(X\) with a known distribution function \(F_X(x)\). If the function \(g\) in \(Y=g(X)\) is strictly increasing or strictly decreasing (so that \(g^{-1}\) exists), then the probability density function of \(Y\) can be found as
\[ f_Y(y) = f_X\{g^{-1}(y)\}\left|\frac{dg^{-1}(y)}{dy}\right| \]
Let \(X\) have the following probability density function:
\[ f_X(x)=\left\{\begin{array}{cc} \frac{3}{2}x^2 + x & 0\le x \le 1 \\ 0 & \mathrm{otherwise} \end{array} \right. \]
Find the probability density function of \(Y=5-(X/2)\).
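A sketch of the solution using the transformation formula above: \(g(x)=5-x/2\) is strictly decreasing, with \(g^{-1}(y)=10-2y\) and \(\left|dg^{-1}(y)/dy\right|=2\); as \(x\) ranges over \([0,1]\), \(y\) ranges over \([9/2,\,5]\), so
\[ f_Y(y) = f_X(10-2y)\cdot 2 = \left\{\frac{3}{2}(10-2y)^2 + (10-2y)\right\}\cdot 2 = 3(10-2y)^2 + 2(10-2y), \qquad \frac{9}{2}\le y\le 5, \]
and \(f_Y(y)=0\) otherwise.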
Using the uniqueness property of moment generating functions, for a random variable \(X\) with a known distribution function \(F_X(x)\) and a random variable \(Y=g(X)\), the distribution of \(Y\) can be found by computing \(M_Y(t)=E\left\{e^{tg(X)}\right\}\) and matching it to the MGF of a known distribution; by the uniqueness property, \(Y\) follows that distribution.
Let \(X\) follow a normal distribution with mean \(\mu\) and variance \(\sigma^2\). Find the distribution of \(Z=\frac{X-\mu}{\sigma}\).
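A sketch of the solution via the MGF approach, using the linear-transformation property with \(a=1/\sigma\) and \(b=-\mu/\sigma\):
\[ M_Z(t) = e^{-\mu t/\sigma} M_X(t/\sigma) = e^{-\mu t/\sigma}\exp\left\{\frac{\mu t}{\sigma} + \frac{\sigma^2 (t/\sigma)^2}{2}\right\} = e^{t^2/2}, \]
which is the MGF of a standard normal distribution, so \(Z\sim N(0,1)\).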
Let \(Z\) follow a standard normal distribution with mean \(0\) and variance \(1\). Find the distribution of \(Y=Z^2\).
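A sketch of the solution: for \(t<\tfrac{1}{2}\),
\[ M_Y(t) = E\left(e^{tZ^2}\right) = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{tz^2} e^{-z^2/2}\, dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(1-2t)z^2/2}\, dz = (1-2t)^{-1/2}, \]
which is the MGF of a chi-squared distribution with one degree of freedom, so \(Y\sim\chi^2_1\).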