
Vestn. Tomsk. Gos. Univ. Mat. Mekh., 2011, Number 4(16), Pages 6–17 (Mi vtgu217)


MATHEMATICS

The James–Stein procedure for a conditionally Gaussian regression

E. A. Pchelintsev^{a,b}

a Tomsk State University
b Laboratoire de Mathématiques Raphaël Salem, Université de Rouen (France)

Abstract: The paper considers the problem of estimating a $p$-dimensional ($p\ge2$) mean vector of a multivariate conditionally normal distribution under quadratic loss. Problems of this type arise when estimating the parameters in a continuous-time regression model with a non-Gaussian Ornstein–Uhlenbeck process. We propose a modification of the James–Stein procedure of the form $\theta^*(Y)=(1-c/\|Y\|)Y$, where $Y$ is an observation and $c>0$ is a special constant. This estimator admits an explicit upper bound for its quadratic risk and has a significantly smaller risk than the usual maximum likelihood estimator for all dimensions $p\ge2$. The procedure is applied to parametric estimation in a continuous-time conditionally Gaussian regression model and to estimation of the mean vector of a multivariate normal distribution whose covariance matrix is unknown and depends on nuisance parameters.
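As an illustration of the shrinkage rule above, the following is a minimal Monte Carlo sketch (Python/NumPy) comparing the quadratic risk of the maximum likelihood estimator $\hat\theta=Y$ with that of $\theta^*(Y)=(1-c/\|Y\|)Y$ in the classical setting $Y\sim\mathcal N(\theta,I_p)$. The constant $c$ is treated here as a free parameter chosen for illustration; the special value derived in the paper is not reproduced in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def shrink(Y, c):
    """Shrinkage rule theta*(Y) = (1 - c/||Y||) Y, applied row-wise to observations Y."""
    norms = np.linalg.norm(Y, axis=-1, keepdims=True)
    return (1.0 - c / norms) * Y

def empirical_risks(theta, c, n_trials=200_000):
    """Monte Carlo estimates of the quadratic risk E||estimate - theta||^2
    for Y ~ N(theta, I_p): MLE (identity estimator) vs. the shrinkage rule."""
    p = theta.size
    Y = theta + rng.standard_normal((n_trials, p))
    mle_risk = np.mean(np.sum((Y - theta) ** 2, axis=1))           # theoretical value: p
    js_risk = np.mean(np.sum((shrink(Y, c) - theta) ** 2, axis=1))
    return mle_risk, js_risk

# p = 3, true mean at the origin; at theta = 0 any fixed c in (0, 2*E||Y||) lowers the risk
theta = np.zeros(3)
print(empirical_risks(theta, c=1.0))   # shrinkage risk falls below the MLE risk p = 3
```

Note that the rule divides by $\|Y\|$ rather than by $\|Y\|^2$ as in the classical James–Stein estimator $(1-(p-2)\sigma^2/\|Y\|^2)Y$; this is precisely the modification highlighted in the abstract.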

Keywords: conditionally Gaussian regression model, improved estimation, James–Stein procedure, non-Gaussian Ornstein–Uhlenbeck process.

UDC: 517.16+519.2

Received: 19.07.2011


