An estimator is considered a consistent estimator of $\theta$ if the estimator converges in probability to $\theta$ as $n \to \infty$.
Let $X_1, \ldots, X_n$ be a random sample from a distribution with parameter $\theta$. The estimator $\hat{\theta}$ is a consistent estimator of $\theta$ if, for every $\epsilon > 0$,

$$\lim_{n \to \infty} P\left(|\hat{\theta} - \theta| \geq \epsilon\right) = 0.$$
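As a quick illustration (a standard example, not one worked in these notes): if $X_1, \ldots, X_n$ have mean $\theta$ and finite variance $\sigma^2$, then Chebyshev's inequality gives

$$P\left(|\bar{X} - \theta| \geq \epsilon\right) \leq \frac{\sigma^2}{n\epsilon^2} \to 0 \quad \text{as } n \to \infty,$$

so the sample mean $\bar{X}$ is a consistent estimator of $\theta$.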
Sufficiency evaluates whether a statistic (or estimator) captures all of the information in the sample about a parameter $\theta$. In essence, a statistic is sufficient for $\theta$ if, once its value is known, the remaining detail in the data provides no additional information about $\theta$.
Let $X_1, \ldots, X_n$ be a random sample from a distribution with parameter $\theta$. A statistic $T = t(X_1, \ldots, X_n)$ is said to be sufficient for making inferences about a parameter $\theta$ if the conditional joint distribution of $X_1, \ldots, X_n$ given $T = t$ does not depend on $\theta$.
Let $X_1, \ldots, X_n \overset{iid}{\sim} \text{Bernoulli}(p)$ and $Y_n = \sum_{i=1}^n X_i$. Show that $Y_n$ is a sufficient statistic for $p$.
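A sketch of the usual argument (one possible route, using the conditional-distribution definition above): since $Y_n \sim \text{Binomial}(n, p)$, for any $x_1, \ldots, x_n \in \{0, 1\}$ with $\sum_{i=1}^n x_i = y$,

$$P(X_1 = x_1, \ldots, X_n = x_n \mid Y_n = y) = \frac{p^{y}(1-p)^{n-y}}{\binom{n}{y} p^{y}(1-p)^{n-y}} = \frac{1}{\binom{n}{y}},$$

which does not depend on $p$, so $Y_n$ is sufficient for $p$.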
In statistics, information is thought of as how much the data tell you about a parameter $\theta$. In general, the more data that are available, the more information there is for estimating $\theta$.
Information can be quantified using Fisher's Information $I(\theta)$. For a single observation, Fisher's Information is defined as
$$I(\theta) = E\left[-\frac{\partial^2}{\partial \theta^2} \log\{f(X; \theta)\}\right],$$

where $f(X; \theta)$ is either the PMF or PDF of the random variable $X$.
Furthermore, $I(\theta)$ can equivalently be expressed as

$$I(\theta) = \text{Var}\left\{\frac{\partial}{\partial \theta} \log f(X; \theta)\right\}.$$
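As a concrete illustration (a standard example, not one taken from these notes): for a single $X \sim \text{Bernoulli}(p)$, $\log f(X; p) = X \log p + (1 - X)\log(1 - p)$, so

$$I(p) = E\left[\frac{X}{p^2} + \frac{1 - X}{(1 - p)^2}\right] = \frac{1}{p} + \frac{1}{1 - p} = \frac{1}{p(1 - p)}.$$

The variance form gives the same answer, since $\frac{\partial}{\partial p}\log f(X; p) = \frac{X - p}{p(1 - p)}$ has variance $\frac{\text{Var}(X)}{p^2(1 - p)^2} = \frac{1}{p(1 - p)}$.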
Show the following property:
$$E\left[-\frac{\partial^2}{\partial \theta^2} \log\{f(X; \theta)\}\right] = \text{Var}\left\{\frac{\partial}{\partial \theta} \log f(X; \theta)\right\}$$
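A sketch of one standard derivation (assuming the usual regularity conditions that allow differentiation under the integral sign): differentiating $\int f(x; \theta)\,dx = 1$ with respect to $\theta$ gives $E\left\{\frac{\partial}{\partial \theta}\log f(X; \theta)\right\} = 0$, so the variance of the score equals $E\left[\left\{\frac{\partial}{\partial \theta}\log f(X; \theta)\right\}^2\right]$. Differentiating a second time and using

$$\frac{\partial^2}{\partial \theta^2}\log f = \frac{\partial^2 f/\partial \theta^2}{f} - \left(\frac{\partial}{\partial \theta}\log f\right)^2$$

gives $E\left[\frac{\partial^2}{\partial \theta^2}\log f(X; \theta)\right] = -E\left[\left\{\frac{\partial}{\partial \theta}\log f(X; \theta)\right\}^2\right]$, which is the stated identity.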
Efficiency measures how close the variance of an estimator $T$ comes to the lowest possible variance for an unbiased estimator, $1/\{n I(\theta)\}$.

The efficiency of an estimator $T$, where $T$ is an unbiased estimator of $\theta$, is defined as

$$\text{efficiency of } T = \frac{1/\{n I(\theta)\}}{\text{Var}(T)}.$$
Let $X_1, \ldots, X_n \overset{iid}{\sim} \text{Unif}(0, \theta)$ and $\hat{\theta} = 2\bar{X}$. Find the efficiency of $\hat{\theta}$.
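A sketch of one possible computation (my working under the formula above, not a solution given in these notes): $E(\bar{X}) = \theta/2$, so $\hat{\theta} = 2\bar{X}$ is unbiased, and $\text{Var}(\hat{\theta}) = 4 \cdot \frac{\theta^2/12}{n} = \frac{\theta^2}{3n}$. Treating $f(x; \theta) = 1/\theta$ on $(0, \theta)$ formally, $\frac{\partial}{\partial \theta}\log f(x; \theta) = -1/\theta$, so $I(\theta) = 1/\theta^2$ and

$$\text{efficiency of } \hat{\theta} = \frac{\theta^2/n}{\theta^2/(3n)} = 3.$$

An efficiency greater than 1 signals that the usual regularity conditions behind the $1/\{n I(\theta)\}$ bound fail here, since the support of $\text{Unif}(0, \theta)$ depends on $\theta$.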