Existence of an unbiased entropy estimator for the special Bernoulli measure
E. A. Timofeev
P.G. Demidov Yaroslavl State University,
14 Sovetskaya str., Yaroslavl 150003, Russia
Abstract:
Let
$\Omega = {\mathcal A}^{{\mathbb N}}$ be the space of right-sided infinite sequences
drawn from the finite alphabet
${\mathcal A} = \{0,1\}$,
${\mathbb N} = \{1,2,\dots \}$, let
$$
\rho(\boldsymbol{x},\boldsymbol{y}) =
\sum_{k=1}^{\infty}|x_{k} - y_{k}|2^{-k}
$$
be a metric on
$\Omega$,
and let
$\mu$ be a probability measure on
$\Omega$.
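In particular, if $\boldsymbol{x}$ and $\boldsymbol{y}$ agree in their first $m$ symbols and differ in symbol $m+1$, then
$$
2^{-(m+1)} \le \rho(\boldsymbol{x},\boldsymbol{y}) \le 2^{-m},
$$
so $-\log_2\rho(\boldsymbol{x},\boldsymbol{y})$ measures, up to one unit, the length of their longest common prefix.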
Let
$\boldsymbol{\xi_0}, \boldsymbol{\xi_1}, \dots, \boldsymbol{\xi_n}$
be independent identically distributed points on
$\Omega$.
We study the estimator
$\eta_n^{(k)}(\gamma)$ of the reciprocal of the entropy,
$1/h$, defined as
$$
\eta_n^{(k)}(\gamma) = k \left(r_{n}^{(k)}(\gamma) - r_{n}^{(k+1)}(\gamma)\right),
$$
where
$$
r_n^{(k)}(\gamma) =
\frac{1}{n+1}\sum_{j=0}^{n} \gamma\left(\min^{(k)}_{i:\, i \neq j}
\rho(\boldsymbol{\xi_{i}}, \boldsymbol{\xi_{j}})\right),
$$
and $\min ^{(k)}\{X_1,\dots,X_N\}= X_k$ if
$X_1\leq X_2\leq \dots\leq X_N$, i.e., $\min^{(k)}$ denotes the $k$-th smallest element.
The number
$k$ and the function
$\gamma(t)$ are auxiliary parameters.
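For orientation, the following Python sketch shows how $r_n^{(k)}(\gamma)$ and $\eta_n^{(k)}(\gamma)$ can be computed from a finite sample. The truncation depth $L$, the sample size, the value of $k$, and the choice $\gamma(t)=-\ln t$ are illustrative assumptions only; in particular, this $\gamma$ is not the special function whose existence the theorem below asserts, so the printed comparison with $1/h$ is only a rough sanity check.

```python
# Minimal numerical sketch of the estimator. All concrete choices below
# (truncation depth L, sample size n, k, and gamma(t) = -ln t) are
# illustrative assumptions; this gamma is NOT the special function whose
# existence the theorem asserts.
import numpy as np


def rho_matrix(xs):
    """Pairwise distances rho(x, y) = sum_k |x_k - y_k| * 2^{-k}, truncated at L."""
    weights = 2.0 ** -np.arange(1, xs.shape[1] + 1)
    return np.array([np.abs(xs - x).dot(weights) for x in xs])


def r_n(dist, k, gamma):
    """r_n^{(k)}(gamma): average of gamma(k-th smallest distance to the other points)."""
    vals = []
    for j in range(dist.shape[0]):
        d = np.sort(np.delete(dist[j], j))   # distances from xi_j to the other points
        vals.append(gamma(d[k - 1]))         # k-th order statistic, i.e. min^{(k)}
    return np.mean(vals)


def eta_n(dist, k, gamma):
    """eta_n^{(k)}(gamma) = k * (r_n^{(k)}(gamma) - r_n^{(k+1)}(gamma))."""
    return k * (r_n(dist, k, gamma) - r_n(dist, k + 1, gamma))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    p1 = (np.sqrt(5.0) - 1.0) / 2.0          # p_0 = p_1^2 and p_0 + p_1 = 1
    p0 = 1.0 - p1
    n, L, k = 500, 60, 2
    # xi_0, ..., xi_n: i.i.d. Bernoulli sequences, truncated to L symbols.
    xs = (rng.random((n + 1, L)) < p1).astype(np.int8)
    dist = rho_matrix(xs)

    def gamma(t):
        return -np.log(t)                    # illustrative choice of gamma(t)

    h = -(p0 * np.log2(p0) + p1 * np.log2(p1))   # entropy in bits
    print("eta_n^(k)(gamma) ~", eta_n(dist, k, gamma))
    print("1/h              ~", 1.0 / h)
```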
The main result of this paper is the following theorem.
Theorem.
Let
$\mu$ be the Bernoulli measure with probabilities
$p_0,p_1>0$,
$p_0+p_1=1$,
$p_0=p_1^2$.
There exists a function
$\gamma(t)$ such that
$$
\mathsf{E}\eta_n^{(k)}(\gamma) = \frac1h.
$$
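Note that the conditions $p_0+p_1=1$, $p_0,p_1>0$, $p_0=p_1^2$ determine the measure uniquely:
$$
p_1 = \frac{\sqrt{5}-1}{2} \approx 0.618, \qquad p_0 = p_1^2 = \frac{3-\sqrt{5}}{2} \approx 0.382,
$$
so that, for example, with base-2 logarithms $h = -p_0\log_2 p_0 - p_1\log_2 p_1 \approx 0.9594$ bits per symbol.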
Keywords:
measure, metric, entropy, estimator, unbiasedness, self-similar, Bernoulli measure.
UDC:
519.17
Received: 10.07.2017
DOI:
10.18255/1818-1015-2017-5-521-536