Abstract:
We consider the problem of testing the hypothesis that the parameter of a linear regression model is $0$ against an $s$-sparse alternative separated from $0$ in the $\ell_2$-distance. We show that, in the Gaussian linear regression model with $p < n$, where $p$ is the dimension of the parameter and $n$ is the sample size, the non-asymptotic minimax rate of testing is of the form $\sqrt{(s/n)\log(1 + \sqrt{p}/s)}$. We also show that this is the minimax rate of estimation of the $\ell_2$-norm of the regression parameter.
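For concreteness, a minimal LaTeX sketch of the testing problem as described above; the symbols $Y$, $X$, $\xi$, $\sigma$, $\theta$, and the separation radius $\rho$ are notation assumed here for illustration and are not fixed by the abstract:

```latex
% Assumed model: observe Y = X\theta + \sigma\xi, with design X \in \mathbb{R}^{n \times p}
% and noise \xi \sim \mathcal{N}(0, I_n); \theta is the regression parameter.
\[
  H_0:\ \theta = 0
  \qquad \text{vs.} \qquad
  H_1:\ \|\theta\|_0 \le s,\quad \|\theta\|_2 \ge \rho,
\]
% The abstract states that the minimax separation radius \rho^* satisfies
\[
  \rho^* \asymp \sqrt{\frac{s}{n}\,\log\!\Bigl(1 + \frac{\sqrt{p}}{s}\Bigr)}.
\]
```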
Keywords: linear regression, sparsity, signal detection.