







STAT 9220 (Medical College of Georgia). Lecture notes on unbiased estimation, covering uniformly minimum variance unbiased estimators (UMVUE) and the Lehmann-Scheffé theorem, with examples of finding the UMVUE for the uniform, Poisson, exponential, and normal distributions.
Unbiased or asymptotically unbiased estimation plays an important role in point estimation theory. Unbiased estimators can be used as building blocks for the construction of better estimators, and asymptotic unbiasedness is necessary for consistency. Thus we need to learn:
Definition 8.1.1. Let X be a sample from an unknown population P ∈ P. Let
ϑ be a real-valued parameter related to P. An estimator T (X) of ϑ is unbiased if
and only if E[T (X)] = ϑ for any P ∈ P. If there exists an unbiased estimator of
ϑ, then ϑ is called an estimable parameter.
Definition 8.1.2. An unbiased estimator T(X) of ϑ is called the uniformly minimum variance unbiased estimator (UMVUE) if and only if Var(T(X)) ≤ Var(U(X)) for any P ∈ P and any other unbiased estimator U(X) of ϑ.
Note: for an unbiased estimator, MSE[T(X)] = E[(T(X) − ϑ)²] = Var(T(X)).
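As a quick numerical illustration (a minimal simulation sketch; the normal model with n = 10 and μ = 2 is an illustrative choice, not part of the notes), the sample mean is unbiased and its simulated MSE matches its simulated variance:

```python
import numpy as np

# For an unbiased estimator T, MSE[T] = E[(T - theta)^2] = Var(T).
# Illustrative check with T = Xbar for i.i.d. N(mu, 1) data.
rng = np.random.default_rng(0)
mu, n, reps = 2.0, 10, 400_000

t = rng.normal(mu, 1.0, size=(reps, n)).mean(axis=1)
print(t.mean(), mu)                     # unbiasedness: both approx. 2.0
print(((t - mu)**2).mean(), t.var())    # MSE approx. Var approx. 1/n = 0.1
```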
There are, in essence, two basic ways of finding a UMVUE when we have a complete sufficient statistic T: solving for a function h such that E[h(T)] = ϑ, and conditioning an unbiased estimator on T.
Example 8.1.1. (Solving for h)
(1) Let X_1, ..., X_n be i.i.d. from the uniform distribution on (0, θ), θ > 0. The sufficient and complete statistic X_(n) has the Lebesgue p.d.f. nθ^{−n} x^{n−1} 1_{(0,θ)}(x). Let g(θ) be any differentiable function of θ. An unbiased estimator h(X_(n)) of g(θ) must satisfy

$$\theta^n g(\theta) = n \int_0^\theta h(x)\, x^{n-1}\, dx, \qquad \theta > 0.$$

Differentiating both sides with respect to θ gives

$$n\theta^{n-1} g(\theta) + \theta^n g'(\theta) = n h(\theta)\, \theta^{n-1},$$

and hence

$$g(\theta) + \frac{\theta}{n}\, g'(\theta) = h(\theta).$$

Thus h(x) = g(x) + (x/n) g′(x). If g(x) = x, then h(x) = x(1 + 1/n), i.e. the UMVUE of θ is (1 + 1/n) X_(n).
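A quick simulation check of this conclusion (a minimal sketch; θ = 3, n = 10 and the competing moment estimator 2X̄ are illustrative choices, not from the notes):

```python
import numpy as np

# For X_1, ..., X_n i.i.d. Uniform(0, theta), the UMVUE of theta is
# (1 + 1/n) X_(n).  Compare it with the unbiased moment estimator 2 * Xbar.
rng = np.random.default_rng(1)
theta, n, reps = 3.0, 10, 400_000

x = rng.uniform(0.0, theta, size=(reps, n))
umvue = (1.0 + 1.0 / n) * x.max(axis=1)   # h(X_(n)) = (1 + 1/n) X_(n)
mom = 2.0 * x.mean(axis=1)                # 2 * Xbar, also unbiased

print(umvue.mean(), mom.mean())  # both approx. theta = 3.0
print(umvue.var(), mom.var())    # approx. 0.075 vs. 0.3: the UMVUE wins
```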
(2) Let X_1, ..., X_n be i.i.d. from the Poisson distribution P(θ) with an unknown θ > 0. Then T(X) = Σ_{i=1}^n X_i is sufficient and complete for θ and has the Poisson distribution P(nθ). Suppose that ϑ = g(θ), where g is an analytic function, i.e. g(x) = Σ_{j=0}^∞ a_j x^j, x > 0. An unbiased estimator h(T) of ϑ must satisfy

$$\sum_{t=0}^{\infty} h(t)\,\frac{n^t \theta^t}{t!}\, e^{-n\theta} = g(\theta), \qquad \theta > 0,$$

i.e.,

$$\sum_{t=0}^{\infty} h(t)\,\frac{n^t}{t!}\,\theta^t = e^{n\theta} g(\theta) = \sum_{k=0}^{\infty} \frac{n^k}{k!}\,\theta^k \sum_{j=0}^{\infty} a_j\,\theta^j = \sum_{t=0}^{\infty} \Bigg(\sum_{j,k:\, j+k=t} \frac{n^k}{k!}\, a_j\Bigg)\, \theta^t$$

for any θ > 0. Thus, a comparison of the coefficients of θ^t leads to

$$h(t) = \frac{t!}{n^t} \sum_{j,k:\, j+k=t} \frac{n^k}{k!}\, a_j,$$

i.e., h(T) is the UMVUE of ϑ.
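For example (a minimal sketch; the choice g(θ) = θ² with n = 5, θ = 2 is illustrative), taking a_2 = 1 and all other a_j = 0 gives h(t) = (t!/n^t)·n^{t−2}/(t−2)! = t(t−1)/n², and simulation confirms that h(T) is unbiased for θ²:

```python
import numpy as np

# Coefficient formula for g(theta) = theta^2 (a_2 = 1, all other a_j = 0):
# h(t) = (t!/n^t) * n^(t-2)/(t-2)! = t(t-1)/n^2.
rng = np.random.default_rng(2)
theta, n, reps = 2.0, 5, 400_000

t = rng.poisson(theta, size=(reps, n)).sum(axis=1)  # T ~ Poisson(n*theta)
h = t * (t - 1) / n**2                              # candidate UMVUE of theta^2

print(h.mean(), theta**2)  # both approx. 4.0
```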
Example 8.1.2. (Exponential distribution)
Let X_1, ..., X_n be i.i.d. from the exponential distribution with mean θ > 0, and let ϑ = P(X_1 > t) = e^{−t/θ} for a fixed t > 0. The statistic X̄ is sufficient and complete for θ. Given X̄ = x̄, the conditional distribution of X_1/(nx̄) has the Lebesgue p.d.f. (n−1)(1−x)^{n−2} 1_{(0,1)}(x), and

$$P(X_1 > t \mid \bar X = \bar x) = (n-1) \int_{t/(n\bar x)}^{1} (1-x)^{n-2}\, dx = \Big(1 - \frac{t}{n\bar x}\Big)^{n-1},$$

so the UMVUE of ϑ is T(X) = (1 − t/(nX̄))^{n−1}. Note that (1 − t/(nx̄))^{n−1} ≈ e^{−t/x̄} for large n.
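A simulation check (a minimal sketch; θ = 1.5, t = 1, n = 8 are illustrative values; the estimator is taken to be 0 when t ≥ nX̄, where the conditional probability vanishes):

```python
import numpy as np

# UMVUE of P(X_1 > t) = exp(-t/theta) is (1 - t/(n*Xbar))^(n-1),
# set to 0 when t >= n*Xbar (note X_1 <= n*Xbar always holds).
rng = np.random.default_rng(3)
theta, t, n, reps = 1.5, 1.0, 8, 400_000

x = rng.exponential(theta, size=(reps, n))
s = x.sum(axis=1)                          # n * Xbar
umvue = np.where(s > t, (1.0 - t / s)**(n - 1), 0.0)

print(umvue.mean(), np.exp(-t / theta))    # both approx. 0.513
```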
Example 8.1.3. (Normal distribution)
Let X_1, ..., X_n be i.i.d. from N(μ, σ²) with unknown μ ∈ ℝ and σ² > 0. Then T = (X̄, S²) is sufficient and complete for θ = (μ, σ²). Moreover, X̄ and S² are independent, √n(X̄ − μ)/σ ∼ N(0, 1), and (n−1)S²/σ² ∼ χ²_{n−1}.
(1) Moments
X̄ is the UMVUE of μ; X̄² − S²/n is the UMVUE of μ², since E[X̄²] = μ² + σ²/n and E[S²] = σ². Since E[(S²/σ²)^{r/2}] = k_{n−1,r}^{−1} (text p. 164), where

$$k_{n,r} = \frac{n^{r/2}\,\Gamma(n/2)}{2^{r/2}\,\Gamma\big(\tfrac{n+r}{2}\big)},$$

k_{n−1,r} S^r is the UMVUE of σ^r.
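A simulation check for r = 1 (a minimal sketch; μ = 1, σ = 2, n = 6 are illustrative; the log-gamma form of k_{n,r} is just for numerical stability):

```python
import numpy as np
from scipy.special import gammaln

# k_{n,r} = n^(r/2) * Gamma(n/2) / (2^(r/2) * Gamma((n+r)/2)), computed in logs.
def k(n, r):
    return np.exp(0.5 * r * np.log(n / 2.0) + gammaln(n / 2.0) - gammaln((n + r) / 2.0))

rng = np.random.default_rng(4)
mu, sigma, n, reps = 1.0, 2.0, 6, 400_000

x = rng.normal(mu, sigma, size=(reps, n))
s = x.std(axis=1, ddof=1)                  # sample standard deviation S

print((k(n - 1, 1) * s).mean(), sigma)     # both approx. 2.0
```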
(2) Quantiles
Suppose that ϑ satisfies P(X_1 ≤ ϑ) = p with a fixed p ∈ (0, 1). Then p = Φ((ϑ − μ)/σ) and ϑ = μ + σΦ^{−1}(p). Hence X̄ + k_{n−1,1} S Φ^{−1}(p) is the UMVUE of ϑ.
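Again a quick check by simulation (a minimal sketch; μ = 1, σ = 2, n = 6, p = 0.9 are illustrative values):

```python
import numpy as np
from scipy.special import gammaln
from scipy.stats import norm

# UMVUE of the p-th quantile mu + sigma * z_p is Xbar + k_{n-1,1} * S * z_p.
def k(n, r):
    return np.exp(0.5 * r * np.log(n / 2.0) + gammaln(n / 2.0) - gammaln((n + r) / 2.0))

rng = np.random.default_rng(5)
mu, sigma, n, p, reps = 1.0, 2.0, 6, 0.9, 400_000
zp = norm.ppf(p)

x = rng.normal(mu, sigma, size=(reps, n))
est = x.mean(axis=1) + k(n - 1, 1) * x.std(axis=1, ddof=1) * zp

print(est.mean(), mu + sigma * zp)   # both approx. 3.563
```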
(3) Probability P(X_1 ≤ c)
Let c be a fixed constant and ϑ = P(X_1 ≤ c) = Φ((c − μ)/σ). Since δ(X) = 1_{(−∞,c)}(X_1) is an unbiased estimator of ϑ, the UMVUE of ϑ is E[δ(X)|T] = P(X_1 ≤ c | T). By Basu's theorem, the ancillary statistic Z(X) = (X_1 − X̄)/S is independent of T = (X̄, S²). Then, for T = (x̄, s²),

$$P(X_1 \le c \mid T) = P\Big(\frac{X_1 - \bar X}{S} \le \frac{c - \bar x}{s} \,\Big|\, T\Big) = P\Big(Z \le \frac{c - \bar x}{s}\Big).$$

The distribution of Z is available with density f(z) (text, p. 165), hence the UMVUE of ϑ is

$$h(T) = \int_{-(n-1)/\sqrt{n}}^{(c - \bar X)/S} f(z)\, dz = P(X_1 \le c \mid T). \qquad (8.1)$$
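Because Z is ancillary, its distribution can be obtained by simulating standard normal samples, which gives a way to evaluate (8.1) numerically without the closed form of f (a minimal sketch; all numeric values are illustrative):

```python
import numpy as np
from scipy.stats import norm

# Simulate the ancillary Z = (X_1 - Xbar)/S once from N(0, 1) data, then
# evaluate h(T) = P(Z <= (c - Xbar)/S) and check E[h(T)] = Phi((c - mu)/sigma).
rng = np.random.default_rng(6)
mu, sigma, c, n, reps = 0.0, 1.0, 1.0, 6, 100_000

z0 = rng.normal(size=(reps, n))
z = np.sort((z0[:, 0] - z0.mean(axis=1)) / z0.std(axis=1, ddof=1))

x = rng.normal(mu, sigma, size=(reps, n))
u = (c - x.mean(axis=1)) / x.std(axis=1, ddof=1)   # (c - Xbar)/S per sample
h = np.searchsorted(z, u) / reps                   # empirical P(Z <= u)

print(h.mean(), norm.cdf((c - mu) / sigma))        # both approx. 0.841
```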
(4) Normal density
Suppose that we would like to estimate ϑ(x) = (1/σ) φ((x − μ)/σ), the value of the density of X_1 at a fixed point x. By (8.1), the conditional p.d.f. of X_1 given T is ϑ(x|T) = (1/S) f((x − X̄)/S). Let g be the joint p.d.f. of T = (X̄, S²). Then

$$\vartheta(x) = \int \psi(x, t)\, dt = \int \vartheta(x \mid t)\, g(t)\, dt = E\big[\vartheta(x \mid T)\big],$$

where ψ(x, t) is the joint density of X_1 and T. Thus ϑ(x|T) is the UMVUE of ϑ(x).
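The same simulation idea extends here: approximate the density f of the ancillary Z from simulated draws, then check that (1/S) f((x − X̄)/S) is approximately unbiased for the N(μ, σ²) density at x (a rough sketch; the kernel density estimate of f and all numeric values are illustrative, and the KDE is biased near the boundary of Z's support):

```python
import numpy as np
from scipy.stats import gaussian_kde, norm

# UMVUE of the density at x is (1/S) f((x - Xbar)/S), with f the density of
# Z = (X_1 - Xbar)/S.  Here f is replaced by a KDE of simulated Z values.
rng = np.random.default_rng(7)
mu, sigma, xpt, n = 0.0, 1.0, 0.5, 10

z0 = rng.normal(size=(20_000, n))
z = (z0[:, 0] - z0.mean(axis=1)) / z0.std(axis=1, ddof=1)
f = gaussian_kde(z)                                     # approximation of f(z)

x = rng.normal(mu, sigma, size=(2_000, n))
s = x.std(axis=1, ddof=1)
est = f((xpt - x.mean(axis=1)) / s) / s                 # (1/S) f((x - Xbar)/S)

print(est.mean(), norm.pdf((xpt - mu) / sigma) / sigma) # both approx. 0.352
```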
Another way of showing this is as follows. (Here X_1, ..., X_n are i.i.d. with P(X_1 > x) = θ/x for x > θ, ϑ = P(X_1 > t) for a fixed t, and the candidate estimator is h(X_(1)) with h(x) = (n−1)x/(nt) for x ≤ t and h(x) = 1 for x > t.) Note that the Lebesgue p.d.f. of X_(1) is nθ^n x^{−(n+1)} 1_{(θ,∞)}(x). If θ < t,

$$E[h(X_{(1)})] = \int_\theta^\infty h(x)\, n\theta^n x^{-(n+1)}\, dx = \int_\theta^t \frac{(n-1)x}{nt}\, n\theta^n x^{-(n+1)}\, dx + \int_t^\infty n\theta^n x^{-(n+1)}\, dx = \frac{\theta}{t} - \frac{\theta^n}{t^n} + \frac{\theta^n}{t^n} = \frac{\theta}{t} = P(X_1 > t).$$

If θ ≥ t, then P(X_1 > t) = 1 and h(X_(1)) = 1 a.s. P_θ, since P(t > X_(1)) = 0. Hence, for any θ > 0, E[h(X_(1))] = P(X_1 > t). Since h(X_(1)) is a function of the complete sufficient statistic X_(1), the result follows.
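A simulation check of this identity (a minimal sketch; θ = 1, t = 2, n = 5 are illustrative, and the population is sampled by the inverse transform X = θ/U with U uniform on (0, 1), which gives P(X > x) = θ/x):

```python
import numpy as np

# Candidate UMVUE of P(X_1 > t) = theta/t: apply h to the minimum X_(1),
# where h(x) = (n-1)x/(n t) for x <= t and h(x) = 1 for x > t.
rng = np.random.default_rng(8)
theta, t, n, reps = 1.0, 2.0, 5, 400_000

x = theta / rng.uniform(size=(reps, n))    # P(X > x) = theta/x for x > theta
m = x.min(axis=1)                          # X_(1)
h = np.where(m > t, 1.0, (n - 1) * m / (n * t))

print(h.mean(), theta / t)                 # both approx. 0.5
```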
When a complete and sufficient statistic is not available, it is usually very difficult
to derive a UMVUE. In some cases, the following result can be applied.
Theorem 8.2.1. Let U be the set of all unbiased estimators of 0 with finite variances, and let T be an unbiased estimator of ϑ with E(T²) < ∞.
(i) A necessary and sufficient condition for T(X) to be a UMVUE of ϑ is that E[T(X)U(X)] = 0 for any U ∈ U and any P ∈ P.
(ii) Suppose that T = h(T̃), where T̃ is a sufficient statistic for P ∈ P and h is a Borel function. Let U_T̃ be the subset of U consisting of Borel functions of T̃. Then a necessary and sufficient condition for T to be a UMVUE of ϑ is that E[T(X)U(X)] = 0 for any U ∈ U_T̃ and any P ∈ P.
Remark 8.2.1. The above result may be useful in order to check whether a given estimator is a UMVUE, or to show that a UMVUE does not exist.
If there is a sufficient statistic, then by Rao-Blackwell’s theorem, we only need
to focus on functions of the sufficient statistic and, hence, Theorem 8.2.1(ii) above
is more convenient to use.
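To see condition (i) in action (a minimal sketch; the Poisson model and the particular choice of U are illustrative, not from the notes), take X_1, ..., X_n i.i.d. Poisson(θ), T = X̄ (the UMVUE of θ) and U = X_1 − X_2, an unbiased estimator of 0; then E[T U] = 0 for every θ, as the theorem requires:

```python
import numpy as np

# Theorem 8.2.1(i): T = Xbar (UMVUE of theta for i.i.d. Poisson data) must be
# uncorrelated with every unbiased estimator of 0, e.g. U = X_1 - X_2.
rng = np.random.default_rng(9)
n, reps = 5, 400_000

for theta in (0.5, 2.0, 7.0):
    x = rng.poisson(theta, size=(reps, n))
    t = x.mean(axis=1)
    u = x[:, 0] - x[:, 1]
    print(theta, (t * u).mean())   # approx. 0 for each theta
```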
As a consequence of Theorem 8.2.1, we have the following useful result.
Corollary 8.2.1. (i) Let T_j be a UMVUE of ϑ_j, j = 1, ..., k, where k is a fixed positive integer. Then Σ_{j=1}^k c_j T_j is a UMVUE of ϑ = Σ_{j=1}^k c_j ϑ_j for any constants c_1, ..., c_k.
(ii) Let T_1 and T_2 be two UMVUEs of ϑ. Then T_1 = T_2 a.s. P for any P ∈ P.
Proof. For instance, for (ii): T_1 − T_2 is an unbiased estimator of 0, so by Theorem 8.2.1(i),

$$E[T_1(T_1 - T_2)] = 0 \qquad \text{and} \qquad E[T_2(T_1 - T_2)] = 0.$$

Subtracting gives E[(T_1 − T_2)²] = 0, hence T_1 = T_2 a.s. P.
Hence UMVUEs are, in essence, unique (i.e., unique a.s. P).