Is an unbiased maximum likelihood estimator always the best unbiased estimator?


22

I know that, for regular problems, if we have a best regular unbiased estimator, it must be the maximum likelihood estimator (MLE). But generally, if we have an unbiased MLE, would it also be the best unbiased estimator (or perhaps I should call it the UMVUE, since it would have the smallest variance)?


3
Interesting question. MLE is a function of the sufficient statistic, and UMVUEs can be obtained by conditioning on complete and sufficient statistics. So if MLE is unbiased (and a function of the sufficient statistic), the only way possible for it to not have minimum variance is if the sufficient statistic is not complete. I tried to find an example, but was unsuccessful.
Greenparker

2
And here is some brief information about sufficient and complete statistics.
Richard Hardy

10
The real issue is more that the MLE is rarely unbiased: if θ̂ is both an unbiased estimator of θ and the MLE of θ, then f(θ̂) is the MLE of f(θ) but is biased for most bijective transforms f.
Xi'an

1
Is this relevant? "An almost unbiased estimator of population mean" Vyas Dubey Pt.Ravishankar Shukla University, Raipur, India

2
+1 for Xi'an's comment. "Best" estimator means minimal variance; "unbiased" means something else entirely. So I'm not sure you can even start trying to prove the claim, since the one has little to do with the other. But before I'd begin my own derivation, I'd like to see some serious effort at a proof. I'd say that even the proof of the first statement (that the MLE is optimal in certain cases) is not trivial.
cherub

Answers:


13

In my opinion, the question is not truly coherent, in that the maximisation of a likelihood and unbiasedness do not get along, if only because maximum likelihood estimators are equivariant, i.e., the transform of the estimator is the estimator of the transform of the parameter, while unbiasedness is not preserved under non-linear transforms. Therefore, maximum likelihood estimators are almost never unbiased, if "almost" is considered over the range of all possible parametrisations.

However, there is a more direct answer to the question: when considering the estimation of the Normal variance σ², the UMVUE of σ² is

$$\hat{\sigma}_n^2 = \frac{1}{n-1}\sum_{i=1}^n \{x_i - \bar{x}_n\}^2$$

while the MLE of σ² is

$$\check{\sigma}_n^2 = \frac{1}{n}\sum_{i=1}^n \{x_i - \bar{x}_n\}^2$$

Ergo, they differ. This implies that

if we have a best regular unbiased estimator, it must be the maximum likelihood estimator (MLE).

does not hold in general.
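The gap between these two estimators, and the resulting bias of the MLE, is easy to verify by simulation. Below is a minimal sketch (the sample size, variance, and replication count are arbitrary choices of mine, not part of the answer):

```python
# Compare the UMVUE (divides by n-1) with the MLE (divides by n)
# for the variance of Normal data. The MLE's expectation is
# (n-1)/n * sigma^2, i.e., it is biased downward.
import numpy as np

rng = np.random.default_rng(0)
n, sigma2, reps = 5, 1.0, 200_000
x = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))

umvue = x.var(axis=1, ddof=1)   # divides by n-1: unbiased
mle = x.var(axis=1, ddof=0)     # divides by n: the MLE

print(umvue.mean())  # close to sigma^2 = 1.0
print(mle.mean())    # close to (n-1)/n * sigma^2 = 0.8
```

With n = 5 the downward bias of the MLE is large (a factor of 4/5), which makes the difference easy to see even in a quick simulation.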

Note further that, even when there exist unbiased estimators of a parameter θ, there is not necessarily a best unbiased (minimum-variance) estimator, i.e., a UMVUE.


So can we say that an unbiased MLE is a (U)MVUE, but not every (U)MVUE is an MLE?
Sextus Empiricus

2
No, we have no reason to believe this is true in general.
Xi'an

13

But generally, if we have an unbiased MLE, would it also be the best unbiased estimator?

If there is a complete sufficient statistic, yes.

Proof:

  • Lehmann–Scheffé theorem: any unbiased estimator that is a function of a complete sufficient statistic is the best unbiased estimator (UMVUE).
  • The MLE is a function of any sufficient statistic. See 4.2.3 here;

Thus an unbiased MLE is necessarily the best (the UMVUE), as long as a complete sufficient statistic exists.
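One of the exceptional cases where this chain of reasoning does apply (a hedged illustration of mine, not part of the original answer): for Poisson(λ), the sum Σxᵢ is complete and sufficient, and the MLE x̄ is an unbiased function of it, hence the UMVUE by Lehmann–Scheffé; its variance attains the Cramér–Rao bound λ/n. A quick simulation sketch (λ, n, and the replication count are arbitrary):

```python
# The Poisson mean: the MLE xbar is unbiased, is a function of the
# complete sufficient statistic sum(x), and its variance matches the
# Cramer-Rao lower bound lambda/n.
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 3.0, 10, 200_000
x = rng.poisson(lam, size=(reps, n))

xbar = x.mean(axis=1)   # MLE of lambda, a function of sum(x)

print(xbar.mean())      # close to lam = 3.0   (unbiased)
print(xbar.var())       # close to lam/n = 0.3 (Cramer-Rao bound)
```

This fits the theory in the answer: the Poisson family is an exponential family, so a complete sufficient statistic exists, and here the MLE happens to be unbiased as well.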

But actually this result has almost no case of application, since a complete sufficient statistic almost never exists. This is because complete sufficient statistics exist (essentially) only in exponential families, where the MLE is most often biased (an exception being the location parameter of a Gaussian).

So the real answer is actually no.

A general counterexample can be given: any location family with likelihood $p_\theta(x) = p(x - \theta)$, where p is symmetric around 0 ($\forall t \in \mathbb{R},\ p(-t) = p(t)$). With sample size n, the following holds:

  • the MLE is unbiased
  • it is dominated by another unbiased estimator known as Pitman's equivariant estimator

Most often the domination is strict, so the MLE is not even admissible. This was proven when p is Cauchy, but I guess it is a general fact; thus the MLE cannot be UMVUE. In fact, for these families it is known that, under mild conditions, there is never a UMVUE. The example was studied in this question with references and a few proofs.
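The Cauchy location case can be probed numerically. The sketch below is my own rough illustration, not the answer's: the MLE is approximated by a grid search over θ, and the Pitman estimator is computed as the posterior mean under a flat prior (its standard characterization for squared-error loss); the grid range, sample size, and replication count are arbitrary choices.

```python
# Location estimation for Cauchy data (true theta = 0):
# approximate MLE (grid argmax of the likelihood) vs. the Pitman
# equivariant estimator (posterior mean under a flat prior).
import numpy as np

rng = np.random.default_rng(2)
n, reps = 5, 3000
grid = np.linspace(-20.0, 20.0, 2001)   # theta grid (truncation is a
                                        # numerical approximation)
mles, pitmans = [], []
for _ in range(reps):
    x = rng.standard_cauchy(n)
    # Cauchy log-likelihood at each grid value of theta (constants dropped)
    ll = -np.log1p((x[:, None] - grid[None, :]) ** 2).sum(axis=0)
    w = np.exp(ll - ll.max())           # shifted for numerical stability
    mles.append(grid[ll.argmax()])      # approximate MLE
    pitmans.append((grid * w).sum() / w.sum())  # Pitman estimator

mles, pitmans = np.array(mles), np.array(pitmans)
print(mles.mean(), pitmans.mean())      # both near 0: unbiased by symmetry
print((mles ** 2).mean(), (pitmans ** 2).mean())  # empirical MSEs
```

Both estimators come out (approximately) unbiased, as the symmetry argument predicts; per the theory cited above, the Pitman estimator's MSE should be no larger than the MLE's, which a comparison of the two printed MSEs lets you check on a given run.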


Why doesn't this have the most upvotes? I felt this answer was better than Xi'an's.
Red Floyd

0

The MLE's asymptotic variance attains the Cramér–Rao lower bound, but its finite-sample variance may not. To make sure an estimator is the UMVUE, it should be an unbiased function of a complete sufficient statistic.


0

In short, an estimator is the UMVUE if it is unbiased and a function of a complete sufficient statistic. (See Rao–Blackwell and Lehmann–Scheffé.)


Which means this is restricted to exponential families.
Xi'an
Licensed under cc by-sa 3.0 with attribution required.