Statistics
Sampling Distributions
Central Limit Theorem
When collecting data to construct a sample, the sample itself is a collection of random variables.
Therefore, the sample is subject to the rules of probability.
A sample of random variables is said to be iid if the variables are independent and identically distributed.
For example, \(X\) and \(Y\) are iid if \(X\) and \(Y\) have the same distribution \(f(\theta)\) and \(X \perp Y\).
A statistic is a transformation of the sample data.
Before the data are observed, a statistic computed from a sample can take any value.
Therefore, a statistic must be a random variable.
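A small simulation sketch of this idea (the parameter values \(\mu = 5\), \(\sigma = 2\), \(n = 30\) are illustrative, not from the notes): repeatedly drawing a sample and computing its mean shows that the statistic \(\bar X\) varies from sample to sample, i.e. it is itself a random variable.

```python
import random
import statistics

random.seed(0)

# Draw many samples of size n = 30 from N(mu = 5, sigma = 2) and compute
# each sample's mean: the statistic varies from sample to sample,
# so it is itself a random variable with its own (sampling) distribution.
mu, sigma, n, reps = 5.0, 2.0, 30, 10_000
means = [
    statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
    for _ in range(reps)
]

print(statistics.fmean(means))   # near mu = 5
print(statistics.pstdev(means))  # near sigma / sqrt(n) = 2 / sqrt(30)
```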
A sampling distribution is the distribution of a statistic. Many known statistics have a known distribution.
Let \(X_1, X_2, \ldots, X_n\overset{iid}{\sim}N(\mu,\sigma^2)\); show that \(\bar X \sim N(\mu,\sigma^2/n)\). Note: the MGF of \(X_i\) is \(e^{\mu t + \frac{t^2\sigma^2}{2}}\).
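A worked sketch using the stated MGF (the standard MGF argument): since the \(X_i\) are independent,
\[
M_{\bar X}(t) = E\!\left[e^{t\bar X}\right]
= \prod_{i=1}^{n} E\!\left[e^{(t/n)X_i}\right]
= \left(e^{\mu (t/n) + \frac{(t/n)^2\sigma^2}{2}}\right)^{n}
= e^{\mu t + \frac{t^2(\sigma^2/n)}{2}},
\]
which is the MGF of \(N(\mu,\sigma^2/n)\); hence \(\bar X \sim N(\mu,\sigma^2/n)\).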
Let \(Z_1^2, \ldots, Z_n^2\) be iid \(\chi^2_1\). Find the distribution of \(Y = \sum^n_{i=1} Z_i^2\).
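An answer sketch, again via MGFs: each \(Z_i^2\sim\chi^2_1\) has MGF \((1-2t)^{-1/2}\) for \(t<1/2\), and the terms are independent, so
\[
M_Y(t) = \prod_{i=1}^{n}(1-2t)^{-1/2} = (1-2t)^{-n/2},
\]
which is the MGF of \(\chi^2_n\); hence \(Y\sim\chi^2_n\).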
Let \(Z\sim N(0,1)\), \(W\sim \chi^2_\nu\), and \(Z\perp W\); then:
\[ T=\frac{Z}{\sqrt{W/\nu}} \sim t_\nu \]
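A simulation sketch of this construction (the choice \(\nu = 10\) and the number of replications are illustrative): building \(T\) from independent \(Z\) and \(W\) draws should reproduce the known moments of \(t_\nu\), namely mean \(0\) and variance \(\nu/(\nu-2)\).

```python
import random
import statistics

random.seed(1)
nu, reps = 10, 100_000

# Build T = Z / sqrt(W / nu) with Z ~ N(0,1) and W ~ chi-square_nu
# (W simulated as a sum of nu squared standard normals, independent of Z).
t_draws = []
for _ in range(reps):
    z = random.gauss(0.0, 1.0)
    w = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(nu))
    t_draws.append(z / (w / nu) ** 0.5)

# A t_nu random variable has mean 0 and variance nu / (nu - 2).
print(statistics.fmean(t_draws))      # near 0
print(statistics.pvariance(t_draws))  # near 10 / 8 = 1.25
```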
Let \(W_1\sim\chi^2_{\nu_1}\), \(W_2\sim\chi^2_{\nu_2}\), and \(W_1\perp W_2\); then:
\[ F = \frac{W_1/\nu_1}{W_2/\nu_2}\sim F_{\nu_1,\nu_2} \]
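A useful link between the two preceding definitions (a standard identity, added here as a side note): squaring a \(t_\nu\) variable yields an \(F\) variable, because \(Z^2\sim\chi^2_1\):
\[ T^2 = \frac{Z^2/1}{W/\nu} \sim F_{1,\nu}. \]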
Let \(X_1, X_2, \ldots, X_n\) be independent and identically distributed random variables with \(E(X_i)=\mu\) and \(Var(X_i) = \sigma^2\). We define
\[ Y_n = \sqrt n \left(\frac{\bar X-\mu}{\sigma}\right) \quad \mathrm{where}\quad \bar X = \frac{1}{n}\sum^n_{i=1}X_i. \]
Then, the distribution function of \(Y_n\) converges to the standard normal distribution function as \(n\rightarrow \infty\). Equivalently, for large \(n\), approximately
\[ \bar X \sim N\left(\mu, \frac{\sigma^2}{n}\right) \]
Let \(X_1, \ldots, X_n \overset{iid}{\sim} \chi^2_p\) with MGF \(M(t)=(1-2t)^{-p/2}\). Find the distribution of \(\bar X\) as \(n \rightarrow \infty\).
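A simulation sketch of where this exercise is headed (the values \(p = 4\), \(n = 200\) are illustrative; the exercise itself asks for the limiting argument, which this only corroborates): since \(E(X_i)=p\) and \(Var(X_i)=2p\), the CLT suggests \(\bar X\) is approximately \(N(p, 2p/n)\) for large \(n\).

```python
import random
import statistics

random.seed(2)

# X_i ~ chi-square_p has E(X_i) = p and Var(X_i) = 2p, so the CLT suggests
# Xbar is approximately N(p, 2p/n) for large n. This only illustrates
# that limit numerically; it is not the requested derivation.
p, n, reps = 4, 200, 5_000

def chi2(df):
    """One chi-square draw as a sum of df squared standard normals."""
    return sum(random.gauss(0.0, 1.0) ** 2 for _ in range(df))

xbars = [statistics.fmean(chi2(p) for _ in range(n)) for _ in range(reps)]

print(statistics.fmean(xbars))      # near p = 4
print(statistics.pvariance(xbars))  # near 2p / n = 8 / 200 = 0.04
```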