by Rahul Jha
Last Updated January 14, 2018 12:19 PM

Suppose $X_{1},X_{2},\dots,X_{n}$ are independent and identically distributed random variables from $U(\theta,\theta+1)$. Clearly, any value in the interval $[X_{(n)}-1,\,X_{(1)}]$ is a maximum likelihood estimator of $\theta$. I know that maximum likelihood estimators satisfy the invariance property. Will any function of these two estimators also be a maximum likelihood estimator? In particular, why is $\frac{X_{(1)}+X_{(n)}}{2}$ not a maximum likelihood estimator, since it is a function of maximum likelihood estimators?
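To see why the whole interval maximizes the likelihood: each observation contributes density $1$ when $\theta \le X_i \le \theta+1$, so the joint likelihood is $1$ exactly when $\theta \in [X_{(n)}-1,\,X_{(1)}]$ and $0$ otherwise. A minimal numerical sketch (with a hypothetical true value $\theta = 3$ and $n = 5$, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
theta_true = 3.0  # hypothetical true parameter, for illustration only
x = rng.uniform(theta_true, theta_true + 1, size=5)

def likelihood(theta, x):
    # Each observation has density 1 on [theta, theta + 1] and 0 elsewhere,
    # so the joint likelihood is 1 iff theta lies in [max(x) - 1, min(x)].
    return float(np.all((x >= theta) & (x <= theta + 1)))

lo, hi = x.max() - 1, x.min()       # endpoints of the MLE interval
print(likelihood((lo + hi) / 2, x))  # 1.0: any point inside the interval maximizes
print(likelihood(hi + 0.1, x))       # 0.0: points outside the interval do not
```

Because the likelihood is flat at its maximum, no single point is singled out, which is where the invariance question gets delicate.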

The MLE is not unique here, so it is perhaps a matter of semantics whether the invariance property applies to quantities taken from the interval $[X_{(n)}-1,\,X_{(1)}]$, and it is difficult to say what invariance would even mean in this case.

For $n \ge 2$, your proposal $T_1 = (X_{(1)}+X_{(n)})/2$ is severely biased: $E(T_1) = \theta + \frac{1}{2}$. The midpoint $T_2 = \big(X_{(n)}-1+X_{(1)}\big)/2$ of the interval is unbiased. (I suspect it may be what you meant to propose.) You might want to explore whether $T_2$ has smaller variance than other unbiased linear functions of the interval endpoints.
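The bias claim is easy to check by simulation. A quick Monte Carlo sketch (hypothetical $\theta = 3$, $n = 5$, values chosen only for illustration) comparing $T_1$ with the interval midpoint $T_2$:

```python
import numpy as np

rng = np.random.default_rng(0)
theta = 3.0            # hypothetical true parameter
n, reps = 5, 100_000   # sample size and Monte Carlo replications

# reps independent samples of size n from U(theta, theta + 1)
x = rng.uniform(theta, theta + 1, size=(reps, n))
x_min, x_max = x.min(axis=1), x.max(axis=1)

t1 = (x_min + x_max) / 2       # proposed estimator (midrange of the data)
t2 = (x_max - 1 + x_min) / 2   # midpoint of the MLE interval

print(t1.mean() - theta)  # close to 1/2: T1 overestimates by about 0.5
print(t2.mean() - theta)  # close to 0: T2 is (approximately) unbiased
```

Comparing `t1.var()` and `t2.var()` in the same run is one way to start the variance exploration suggested above (the two have the same variance, differing only by the constant shift $\frac12$).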
