Abstract:
If $(X, Y)$ is an observation with distribution function $F(x-\theta,y)$, $\sigma^{2}=\textrm{var}(X)$, $\rho=\textrm{corr}(X,Y)$, and $I$ is the Fisher information about $\theta$ contained in $(X,Y)$, then $I\ge\{\sigma^2(1-\rho^2)\}^{-1}$. Equality holds under conditions closely related to those for linearity of the Pitman estimator of $\theta$ based on a sample from $F(x-\theta,y)$. These results extend earlier ones for the case in which only the informative component $X$ is observed.
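As a quick illustration (a bivariate normal special case assumed here, with $\tau^{2}=\textrm{var}(Y)$; it is not spelled out in the abstract), the bound is attained in the Gaussian case: if $(X,Y)$ is bivariate normal with mean $(\theta,0)$ and covariance matrix $\Sigma$, then
\[
\Sigma=\begin{pmatrix}\sigma^{2} & \rho\sigma\tau\\ \rho\sigma\tau & \tau^{2}\end{pmatrix},
\qquad
I=\bigl(\Sigma^{-1}\bigr)_{11}
 =\frac{\tau^{2}}{\sigma^{2}\tau^{2}(1-\rho^{2})}
 =\frac{1}{\sigma^{2}(1-\rho^{2})},
\]
so $I=\{\sigma^2(1-\rho^2)\}^{-1}$ and the inequality holds with equality; in this Gaussian case the Pitman estimator of $\theta$ is linear in the observations, consistent with the equality conditions described above.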