Review Random Variables
Review Discrete Random Variables
A random variable is a function that maps the sample space to the real numbers.
A random variable is considered to be discrete if it can only map to a finite or countably infinite number of distinct values.
The probability mass function of a discrete random variable can be represented by a formula, a table, or a graph. The probability that a random variable \(Y\) takes the value \(y\) is written \(P(Y=y)\), defined for all values of \(y\).
The cumulative distribution function gives \(P(Y\leq y)\) for a random variable \(Y\).
The expected value is the value we expect, on average, when we randomly sample from a population that follows a specific distribution. The expected value of \(Y\) is
\[ E(Y)=\sum_y yP(y) \]
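As a quick illustration, here is a minimal Python sketch that computes the CDF \(P(Y\leq y)\) and the expected value directly from a tabulated PMF; the PMF values below are made up for the example.

```python
# Hypothetical PMF given as a table: value y -> P(Y = y)
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

# CDF: P(Y <= y) is the running sum of the PMF up to y
def cdf(y):
    return sum(p for val, p in pmf.items() if val <= y)

# Expected value: E(Y) = sum over y of y * P(y)
expected_value = sum(y * p for y, p in pmf.items())

print(cdf(1))          # 0.7
print(expected_value)  # 1.1
```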
The variance is the expected squared difference between the random variable and its expected value.
\[ Var(Y)=\sum_y\{y-E(Y)\}^2P(y) \]
\[ Var(Y) = E(Y^2) - \{E(Y)\}^2 \]
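Continuing with the same made-up PMF, both variance formulas can be checked numerically; the two expressions should agree.

```python
pmf = {0: 0.2, 1: 0.5, 2: 0.3}      # hypothetical PMF table
e_y  = sum(y * p for y, p in pmf.items())
e_y2 = sum(y**2 * p for y, p in pmf.items())

# Definition: expected squared deviation from the mean
var_def = sum((y - e_y)**2 * p for y, p in pmf.items())
# Shortcut: E(Y^2) - {E(Y)}^2
var_short = e_y2 - e_y**2

print(var_def, var_short)  # both 0.49
```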
Distribution | Parameter(s) | PMF \(P(Y=y)\) |
---|---|---|
Bernoulli | \(p\) | \(p^y(1-p)^{1-y}\) |
Binomial | \(n\) and \(p\) | \(\binom{n}{y}p^y(1-p)^{n-y}\) |
Geometric | \(p\) | \((1-p)^{y-1}p\) |
Negative Binomial | \(r\) and \(p\) | \(\binom{y-1}{r-1}p^{r}(1-p)^{y-r}\) |
Hypergeometric | \(N\), \(n\), and \(r\) | \(\frac{\binom{r}{y}\binom{N-r}{n-y}}{\binom{N}{n}}\) |
Poisson | \(\lambda\) | \(\frac{\lambda^y}{y!} e^{-\lambda}\) |
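To make the table concrete, the sketch below evaluates three of the PMFs directly from the formulas above using plain Python; the parameter values are arbitrary and chosen only for illustration.

```python
from math import comb

# Arbitrary example parameters (for illustration only)
p, y = 0.3, 4
# Geometric: P(first success on trial y) = (1-p)^(y-1) p
geom = (1 - p)**(y - 1) * p

r, p, y = 3, 0.4, 7
# Negative Binomial: P(r-th success on trial y) = C(y-1, r-1) p^r (1-p)^(y-r)
negbin = comb(y - 1, r - 1) * p**r * (1 - p)**(y - r)

N, r, n, y = 20, 8, 5, 2
# Hypergeometric: C(r, y) C(N-r, n-y) / C(N, n)
hyper = comb(r, y) * comb(N - r, n - y) / comb(N, n)

print(geom, negbin, hyper)
```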
A random variable \(X\) is said to follow a binomial distribution if it counts the number of successes in \(n\) independent trials, each with success probability \(p\):
\(P(X=x)=\binom{n}{x}p^x(1-p)^{n-x}\)
The Poisson distribution describes an experiment that counts the number of occurrences of an event over a specific region or time period.
\(P(X=x)=\frac{\lambda^x}{x!}e^{-\lambda}\)
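Both PMFs can be checked against `scipy.stats`, which uses the same parameterization for these two distributions; the hand-computed formula and the library value should agree. The parameter values below are arbitrary examples.

```python
from math import comb, exp, factorial
from scipy import stats

n, p, x = 10, 0.3, 4
# Binomial: C(n, x) p^x (1-p)^(n-x)
by_hand = comb(n, x) * p**x * (1 - p)**(n - x)
print(by_hand, stats.binom.pmf(x, n, p))       # identical values

lam, x = 2.5, 3
# Poisson: lambda^x e^(-lambda) / x!
by_hand = lam**x * exp(-lam) / factorial(x)
print(by_hand, stats.poisson.pmf(x, lam))      # identical values
```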
Review Continuous Random Variables
A random variable \(X\) is considered continuous if \(P(X=x)=0\) for every value of \(x\), so probabilities are assigned only to intervals of values.
The cumulative distribution function of \(X\), denoted by \(F(x)\), gives \(P(X\leq x)\) for all real values of \(x\).
Properties of the CDF of \(X\):

- \(F(-\infty)=\lim_{x\to-\infty}F(x)=0\)
- \(F(\infty)=\lim_{x\to\infty}F(x)=1\)
- \(F(x)\) is a nondecreasing function of \(x\)
The probability density function of the random variable \(X\) is given by
\[ f(x)=\frac{dF(x)}{dx}=F^\prime(x) \]
wherever the derivative exists.
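One way to see this relationship numerically: a centered finite difference of a CDF approximates the pdf. The sketch below uses the standard normal from `scipy.stats` as an example distribution.

```python
from scipy.stats import norm

x, h = 0.5, 1e-5
# Centered difference approximation to F'(x)
deriv_approx = (norm.cdf(x + h) - norm.cdf(x - h)) / (2 * h)

print(deriv_approx)   # approximately 0.3521
print(norm.pdf(x))    # exact pdf value at x = 0.5
```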
Properties of pdfs:

- \(f(x)\geq 0\) for all \(x\)
- \(\int_{-\infty}^{\infty} f(x)dx=1\)
The expected value for a continuous distribution is defined as
\[ E(X)=\int x f(x)dx \]
The expectation of a function \(g(X)\) is defined as
\[ E\{g(X)\}=\int g(x)f(x)dx \]
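Both expectations can be approximated by numerical integration. This sketch uses `scipy.integrate.quad` with a Uniform(0, 1) density as an assumed example, so the results should be \(E(X)=1/2\) and, taking \(g(x)=x^2\), \(E\{g(X)\}=1/3\).

```python
from scipy.integrate import quad

# Uniform(0, 1) density: f(x) = 1 on [0, 1], 0 elsewhere
f = lambda x: 1.0

# E(X) = integral of x f(x) dx
e_x, _ = quad(lambda x: x * f(x), 0, 1)
# E{g(X)} with g(x) = x^2
e_x2, _ = quad(lambda x: x**2 * f(x), 0, 1)

print(e_x)   # 0.5
print(e_x2)  # 0.3333...
```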
The variance of a continuous random variable is defined as
\[ Var(X) = E[\{X-E(X)\}^2] = \int \{x-E(X)\}^2 f(x)dx \]
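Using the same Uniform(0, 1) example, evaluating the variance integral numerically should return \(1/12\).

```python
from scipy.integrate import quad

f = lambda x: 1.0                      # Uniform(0, 1) density on [0, 1]
e_x, _ = quad(lambda x: x * f(x), 0, 1)

# Var(X) = integral of (x - E(X))^2 f(x) dx
var_x, _ = quad(lambda x: (x - e_x)**2 * f(x), 0, 1)
print(var_x)   # 0.08333... = 1/12
```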
A random variable is said to follow a uniform distribution if its density function is constant between two parameters \(a\) and \(b\).
\[ f(x) = \left\{\begin{array}{cc} \frac{1}{b-a} & a \leq x \leq b\\ 0 & \mathrm{elsewhere} \end{array}\right. \]
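A quick numerical check of the uniform density: it should integrate to 1 over \([a, b]\), and its mean should equal \((a+b)/2\). The values of \(a\) and \(b\) below are arbitrary.

```python
from scipy.integrate import quad

a, b = 2.0, 5.0                                # arbitrary example parameters
f = lambda x: 1.0 / (b - a) if a <= x <= b else 0.0

total, _ = quad(f, a, b)                       # should be 1
mean, _  = quad(lambda x: x * f(x), a, b)      # should be (a + b) / 2 = 3.5
print(total, mean)
```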
A random variable is said to follow a normal distribution if its density is the Gaussian function with mean \(\mu\) and variance \(\sigma^2\).
\[ f(x)=\frac{1}{\sqrt{2\pi \sigma^2}}\exp\left\{-\frac{(x-\mu)^2}{2\sigma^2}\right\} \]
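The Gaussian density can be checked directly against `scipy.stats.norm`, which takes the mean as `loc` and the standard deviation (not the variance) as `scale`; the values of \(\mu\), \(\sigma\), and \(x\) below are arbitrary.

```python
from math import sqrt, pi, exp
from scipy.stats import norm

mu, sigma, x = 1.0, 2.0, 0.5                  # arbitrary example values
# Density written exactly as in the formula above
by_hand = (1 / sqrt(2 * pi * sigma**2)) * exp(-(x - mu)**2 / (2 * sigma**2))

print(by_hand, norm.pdf(x, loc=mu, scale=sigma))   # identical values
```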