Abstract:
A nonlinear regression model $x_t=g_t(\theta_0)+\varepsilon_t$, $t\geqslant1$, is considered. Under a number of conditions on the errors $\varepsilon_t$ and the regression functions $g_t(\theta_0)$, it is proved that the distribution of the suitably normalized least squares estimate of the parameter $\theta_0$ converges, uniformly on the real axis, to the standard normal law at least as quickly as a quantity of order $T^{-1/2}$ as $T\to\infty$, where $T$ is the size of the sample from which the estimate is formed.
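In symbols, the stated result is a Berry–Esseen-type bound. The sketch below uses illustrative notation not fixed in the abstract: $\hat\theta_T$ denotes the least squares estimate built from $x_1,\dots,x_T$, $d_T$ the normalizing factor, $\Phi$ the standard normal distribution function, and $C$ a constant independent of $T$:
$$
\sup_{x\in\mathbb{R}}\;\Bigl|\,P\bigl\{d_T\,(\hat\theta_T-\theta_0)\leqslant x\bigr\}-\Phi(x)\,\Bigr|\;\leqslant\;C\,T^{-1/2}.
$$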