Fall 2011 Homework 10: Question 4
We wish to estimate an unknown parameter $\theta$, based on an r.v. $X$ that we will get to observe. Taking the Bayesian perspective, assume that $X$ and $\theta$ have a joint distribution. Let $\hat{\theta}$ be the estimator (which is a function of $X$). Then $\hat{\theta}$ is said to be unbiased if $E(\hat{\theta} \mid \theta) = \theta$, and $\hat{\theta}$ is said to be the Bayes procedure if $E(\theta \mid X) = \hat{\theta}$.
(a) Let $\hat{\theta }$ be unbiased. Find $E(\hat{\theta }-\theta )^{2}$ (the average squared difference between the estimator and the true value of $\theta$), in terms of marginal moments of $\hat{\theta }$ and $\theta$. Hint: condition on $\theta$.
(b) Repeat (a), except in this part suppose that $\hat{\theta }$ is the Bayes procedure rather than assuming that it is unbiased. Hint: condition on X.
(c) Show that it is impossible for $\hat{\theta }$ to be both the Bayes procedure and unbiased, except in silly problems where we get to know $\theta$ perfectly by observing X. Hint: if Y is a nonnegative r.v. with mean 0, then P(Y = 0) = 1.
Solution: (a) Conditioning on $\theta$ and using unbiasedness, $E(\hat{\theta}\theta) = E\big(E(\hat{\theta}\theta \mid \theta)\big) = E\big(\theta\, E(\hat{\theta} \mid \theta)\big) = E(\theta^2)$. Expanding the square then gives
$$E(\hat{\theta}-\theta)^2 = E(\hat{\theta}^2) - 2E(\hat{\theta}\theta) + E(\theta^2) = E(\hat{\theta}^2) - E(\theta^2).$$
(b) Conditioning on $X$ and using the Bayes property, $E(\hat{\theta}\theta) = E\big(E(\hat{\theta}\theta \mid X)\big) = E\big(\hat{\theta}\, E(\theta \mid X)\big) = E(\hat{\theta}^2)$, so
$$E(\hat{\theta}-\theta)^2 = E(\hat{\theta}^2) - 2E(\hat{\theta}\theta) + E(\theta^2) = E(\theta^2) - E(\hat{\theta}^2).$$
(c) Suppose that $\hat{\theta}$ is both the Bayes procedure and unbiased. By the above, we have $E(\hat{\theta}-\theta)^2 = a$ and $E(\hat{\theta}-\theta)^2 = -a$, where $a = E(\hat{\theta}^2) - E(\theta^2)$. Then $a = -a$, so $a = 0$, which gives $E(\hat{\theta}-\theta)^2 = 0$. Since $(\hat{\theta}-\theta)^2$ is a nonnegative r.v. with mean 0, it follows that $\hat{\theta} = \theta$ with probability 1, i.e., observing $X$ would reveal $\theta$ perfectly.
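As a sanity check on (a) and (b), here is a quick simulation sketch in a Normal-Normal model of our choosing (not part of the problem): $\theta \sim N(0,1)$ and $X \mid \theta \sim N(\theta, 1)$. In this model $\hat{\theta} = X$ is unbiased, and the Bayes procedure is the posterior mean $E(\theta \mid X) = X/2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Assumed model (for illustration only): theta ~ N(0,1), X | theta ~ N(theta, 1)
theta = rng.standard_normal(n)
X = theta + rng.standard_normal(n)

# Unbiased estimator: theta_hat = X, since E(X | theta) = theta.
# Part (a) predicts MSE = E(theta_hat^2) - E(theta^2).
unb = X
mse_unb = np.mean((unb - theta) ** 2)
pred_unb = np.mean(unb**2) - np.mean(theta**2)
print(mse_unb, pred_unb)  # both approximately 1

# Bayes procedure: theta_hat = E(theta | X) = X/2 for this prior.
# Part (b) predicts MSE = E(theta^2) - E(theta_hat^2).
bayes = X / 2
mse_bayes = np.mean((bayes - theta) ** 2)
pred_bayes = np.mean(theta**2) - np.mean(bayes**2)
print(mse_bayes, pred_bayes)  # both approximately 0.5
```

Note that the unbiased estimator's MSE carries a plus sign on $E(\hat{\theta}^2) - E(\theta^2)$ while the Bayes procedure's carries a minus sign, which is exactly the tension exploited in part (c).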
"Mathematics is the logic of certainty, but statistics is the logic of uncertainty."