   Fall 2011 Homework 10: Question 4
We wish to estimate an unknown parameter θ, based on an r.v. X we will get to observe. As in the Bayesian perspective, assume that X and θ have a joint distribution. Let θ̂ be the estimator (which is a function of X). Then θ̂ is said to be unbiased if E(θ̂ | θ) = θ, and θ̂ is said to be the Bayes procedure if E(θ | X) = θ̂.
(a) Let θ̂ be unbiased. Find E((θ̂ − θ)²) (the average squared difference between the estimator and the true value of θ), in terms of marginal moments of θ̂ and θ. Hint: condition on θ.
(b) Repeat (a), except in this part suppose that θ̂ is the Bayes procedure rather than assuming that it is unbiased. Hint: condition on X.
(c) Show that it is impossible for θ̂ to be both the Bayes procedure and unbiased, except in silly problems where we get to know θ perfectly by observing X. Hint: if Y is a nonnegative r.v. with mean 0, then P(Y = 0) = 1.
Solution: (a) Conditioning on θ and using unbiasedness, E(θ̂θ) = E(E(θ̂θ | θ)) = E(θ E(θ̂ | θ)) = E(θ²), so E((θ̂ − θ)²) = E(θ̂²) − 2E(θ̂θ) + E(θ²) = E(θ̂²) − E(θ²). (b) Conditioning on X and using the Bayes property, E(θ̂θ) = E(E(θ̂θ | X)) = E(θ̂ E(θ | X)) = E(θ̂²), so E((θ̂ − θ)²) = E(θ²) − E(θ̂²). (c) Suppose that θ̂ is both the Bayes procedure and unbiased. By the above, E((θ̂ − θ)²) = a and E((θ̂ − θ)²) = −a, where a = E(θ̂²) − E(θ²). But that implies a = 0, so E((θ̂ − θ)²) = 0. Since (θ̂ − θ)² is a nonnegative r.v. with mean 0, it follows that θ̂ = θ (with probability 1).
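As a sanity check, both identities can be verified by simulation in a concrete model. The Beta-Binomial setup below is an assumed illustrative example, not part of the problem: θ ~ Beta(2, 3) and X | θ ~ Binomial(10, θ), in which case X/n is unbiased for θ and the posterior mean (2 + X)/(2 + 3 + 10) is the Bayes procedure.

```python
import random

# Monte Carlo sketch of (a) and (b) in an assumed Beta-Binomial model:
#   theta ~ Beta(2, 3),  X | theta ~ Binomial(n = 10, theta).
# Then X/n is unbiased (E(X/n | theta) = theta), and the posterior mean
# theta_hat = (2 + X)/(2 + 3 + 10) is the Bayes procedure E(theta | X).
random.seed(0)
a, b, n = 2.0, 3.0, 10
N = 200_000

mse_a = mse_b = e_th2 = e_u2 = e_hat2 = 0.0
for _ in range(N):
    theta = random.betavariate(a, b)
    x = sum(random.random() < theta for _ in range(n))  # Binomial(n, theta) draw
    u = x / n                                           # unbiased estimator
    theta_hat = (a + x) / (a + b + n)                   # Bayes estimator
    mse_a += (u - theta) ** 2
    mse_b += (theta_hat - theta) ** 2
    e_th2 += theta ** 2
    e_u2 += u ** 2
    e_hat2 += theta_hat ** 2

mse_a, mse_b = mse_a / N, mse_b / N
e_th2, e_u2, e_hat2 = e_th2 / N, e_u2 / N, e_hat2 / N
print(mse_a, e_u2 - e_th2)    # (a): E((X/n - theta)^2) vs E((X/n)^2) - E(theta^2)
print(mse_b, e_th2 - e_hat2)  # (b): E((theta_hat - theta)^2) vs E(theta^2) - E(theta_hat^2)
```

Each printed pair agrees up to Monte Carlo error, matching the signs in parts (a) and (b): the unbiased MSE is E(θ̂²) − E(θ²), while the Bayes MSE is E(θ²) − E(θ̂²).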
"Mathematics is the logic of certainty, but statistics is the logic of uncertainty." 