Central Limit Theorem
Let \(X_1, X_2, \ldots, X_n\) be independent and identically distributed random variables with \(E(X_i)=\mu\) and \(Var(X_i) = \sigma^2\). We define
\[ Y_n = \sqrt n \left(\frac{\bar X-\mu}{\sigma}\right), \quad \text{where }\ \bar X = \frac{1}{n}\sum^n_{i=1}X_i. \]
Then the distribution function of \(Y_n\) converges to the standard normal distribution function as \(n\rightarrow \infty\).
Equivalently, for large \(n\), approximately
\[ \bar X \sim N\left(\mu, \frac{\sigma^2}{n}\right) \]
Let \(X_1, \ldots, X_n \overset{iid}{\sim} \chi^2_p\); the MGF of each \(X_i\) is \(M(t)=(1-2t)^{-p/2}\). Find the approximate distribution of \(\bar X\) as \(n \rightarrow \infty\).
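The exercise above can be checked empirically: since a \(\chi^2_p\) variable has mean \(p\) and variance \(2p\), the CLT says the standardized sample mean should be approximately \(N(0,1)\). A minimal simulation sketch with NumPy (the choices \(p=4\), \(n=200\), and the replication count are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, reps = 4, 200, 50_000  # df, sample size, number of replications

# Each row is one sample of n iid chi-square(p) random variables.
samples = rng.chisquare(df=p, size=(reps, n))
xbar = samples.mean(axis=1)

# CLT: Y_n = sqrt(n) * (xbar - mu) / sigma should be approximately N(0, 1),
# with mu = p and sigma^2 = 2p for the chi-square distribution.
y = np.sqrt(n) * (xbar - p) / np.sqrt(2 * p)
print(round(y.mean(), 2), round(y.std(), 2))  # both should be near 0 and 1
```

The empirical mean and standard deviation of \(Y_n\) land near 0 and 1, consistent with the theorem.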
Order Statistics
Order statistics are a fundamental concept in statistics and probability, dealing with the properties of sorted random variables. They provide insights into the distribution and behavior of sample data, such as minimum, maximum, and quantiles. Understanding order statistics is crucial in various fields such as risk management, quality control, and data analysis.
Let \(X_1, X_2, \ldots, X_n\) be a sample of \(n\) independent and identically distributed (i.i.d.) random variables with a common probability density function \(f(x)\) and cumulative distribution function \(F(x)\). The order statistics are the sorted values of this sample, denoted as:
\[ X_{(1)} \leq X_{(2)} \leq \cdots \leq X_{(n)} \]
Here, \(X_{(1)}\) is the minimum, and \(X_{(n)}\) is the maximum of the sample.
The distribution of the \(k\)-th order statistic \(X_{(k)}\) can be derived using combinatorial arguments. Its PDF is given by: \[ f_{X_{(k)}}(x) = \frac{n!}{(k-1)!(n-k)!} [F(x)]^{k-1} [1 - F(x)]^{n-k} f(x) \]
This formula shows how the distribution of \(X_{(k)}\) depends on the underlying distribution of the sample and its position \(k\).
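For \(X_i \sim \text{Uniform}(0,1)\), where \(F(x)=x\) and \(f(x)=1\), the formula reduces to a \(\text{Beta}(k,\, n-k+1)\) density with mean \(k/(n+1)\), which a simulation can verify. A sketch with NumPy (the choices \(n=10\), \(k=3\) are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n, k, reps = 10, 3, 100_000  # sample size, order index, replications

# Sort each sample; column k-1 then holds the k-th order statistic X_(k).
u = np.sort(rng.uniform(size=(reps, n)), axis=1)
xk = u[:, k - 1]

# For Uniform(0,1) the k-th order statistic is Beta(k, n-k+1),
# so its mean should be k / (n + 1) = 3/11.
print(xk.mean(), k / (n + 1))  # both near 0.27
```

The empirical mean of \(X_{(3)}\) agrees with the theoretical value \(3/11\).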
Estimators
An estimator is a function of the sample measurements that produces an estimate targeting the parameter of interest.
An unbiased estimator \(\hat\theta\) is an estimator that satisfies the following condition:
\[ E(\hat\theta) = \theta \]
The bias of an estimator is defined as
\[ B(\hat\theta) = E(\hat\theta)-\theta \]
The mean squared error (MSE) of an estimator \(\hat\theta\) is given by
\[ \begin{eqnarray} MSE(\hat\theta) & = & E\{(\hat\theta-\theta)^2\} \\ & = & Var(\hat\theta) + B(\hat\theta)^2 \end{eqnarray} \]
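The decomposition \(MSE = Var + B^2\) can be illustrated by simulation. A sketch with NumPy, using a deliberately biased estimator \(\hat\theta = \bar X + 0.5\) of the mean of a normal population (the shift 0.5 and the parameter values are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n, reps = 5.0, 2.0, 25, 200_000

xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
theta_hat = xbar + 0.5  # deliberately biased estimator of mu; bias = 0.5

mse = np.mean((theta_hat - mu) ** 2)                          # E{(theta_hat - mu)^2}
decomposed = theta_hat.var() + (theta_hat.mean() - mu) ** 2   # Var + B^2
print(round(mse, 3), round(decomposed, 3))  # the two quantities agree
```

Here \(Var(\hat\theta) = \sigma^2/n = 0.16\) and \(B(\hat\theta)^2 = 0.25\), so both computations land near \(0.41\).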
Let \(X_1,\ldots,X_n\overset{iid}{\sim}N(\mu,\sigma^2)\), find the bias of \(\bar X\).
Let \(X_1,\ldots,X_n\overset{iid}{\sim}N(\mu,\sigma^2)\), find the bias of \(S^2\).
Let \(X_1,X_2,X_3\) follow an exponential distribution with mean \(\lambda\) and variance \(\lambda^2\). Consider the following estimators:
\(\hat\theta_1 = X_1\)
\(\hat\theta_2 = \frac{X_1+X_2}{2}\)
\(\hat\theta_3 = \frac{X_1+2X_2}{3}\)
\(\hat\theta_4 = \frac{X_1+X_2+X_3}{3}\)
Identify which estimators are unbiased, and determine which has the smallest MSE.
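The four estimators above can be compared by simulation: each is a linear combination of the \(X_i\), so its bias and variance follow directly from \(E(X_i)=\lambda\) and \(Var(X_i)=\lambda^2\). A sketch with NumPy (the value \(\lambda = 2\) is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, reps = 2.0, 300_000  # true mean lambda and number of replications

# Three independent exponential draws per replication.
x1, x2, x3 = rng.exponential(lam, size=(3, reps))
estimators = {
    "theta1": x1,
    "theta2": (x1 + x2) / 2,
    "theta3": (x1 + 2 * x2) / 3,
    "theta4": (x1 + x2 + x3) / 3,
}
for name, t in estimators.items():
    print(f"{name}: bias ~ {t.mean() - lam:+.3f}, var ~ {t.var():.3f}")
# All four have expectation lambda (e.g. E[theta3] = (lambda + 2*lambda)/3),
# while the variances are lambda^2, lambda^2/2, 5*lambda^2/9, lambda^2/3.
```

All four estimators come out unbiased, and \(\hat\theta_4\), which uses all three observations with equal weights, has the smallest variance and hence the smallest MSE.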