
Mat. Tr., 1998 Volume 1, Number 2, Pages 198–208 (Mi mt144)

Coding Low-Entropy Markov Sources with Unknown Statistics

M. P. Sharova

Sobolev Institute of Mathematics, Siberian Branch of the Russian Academy of Sciences

Abstract: We consider the problem of coding information sources with small entropy. This problem is well known in information theory since, for such sources, there exist simpler coding methods than for arbitrary sources. However, the available methods for coding low-entropy sources do not allow one to construct codes with a preassigned redundancy. In [1, 2], a new method of coding low-entropy sources was proposed which makes it possible to construct codes with any preassigned redundancy. On the basis of the code construction of [1, 2], this article suggests a universal code for a low-entropy Markov source generating letters of the binary alphabet $A=\{0,1\}$ with unknown conditional probabilities.
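
The article develops its own code construction based on [1, 2]. Purely as an illustration of what universal coding of a binary Markov source with unknown conditional probabilities involves, the sketch below estimates the conditional probabilities adaptively in each state with the Krichevsky–Trofimov (add-1/2) rule and compares the resulting per-letter ideal code length with the empirical conditional entropy; all parameters and names here are hypothetical, and this is not the construction proposed in the article.

```python
import math
import random

def kt_ideal_code_length(bits):
    """Ideal adaptive code length (in bits) for a binary first-order Markov
    source, using the Krichevsky-Trofimov (add-1/2) estimator separately in
    each of the two states (previous letter 0 or 1).  Illustrative only; not
    the construction of the article."""
    counts = {0: [0, 0], 1: [0, 0]}  # counts[state][next letter]
    prev = 0                          # assume the chain starts in state 0
    total = 0.0
    for b in bits:
        n0, n1 = counts[prev]
        p = (counts[prev][b] + 0.5) / (n0 + n1 + 1.0)  # KT estimate
        total += -math.log2(p)
        counts[prev][b] += 1
        prev = b
    return total

def empirical_conditional_entropy(bits):
    """Empirical per-letter conditional entropy H(X_t | X_{t-1}) in bits."""
    counts = {0: [0, 0], 1: [0, 0]}
    prev = 0
    for b in bits:
        counts[prev][b] += 1
        prev = b
    n = len(bits)
    h = 0.0
    for state in (0, 1):
        n0, n1 = counts[state]
        m = n0 + n1
        for c in (n0, n1):
            if c:
                h -= (c / n) * math.log2(c / m)
    return h

if __name__ == "__main__":
    # A low-entropy source: long runs of zeros, rare excursions to ones
    # (hypothetical transition probabilities, chosen only for the demo).
    random.seed(0)
    p01, p10 = 0.02, 0.4
    x, prev = [], 0
    for _ in range(100_000):
        p_one = p01 if prev == 0 else 1.0 - p10
        prev = 1 if random.random() < p_one else 0
        x.append(prev)

    length = kt_ideal_code_length(x)
    h = empirical_conditional_entropy(x)
    print(f"adaptive code length per letter: {length / len(x):.4f} bits")
    print(f"empirical conditional entropy:   {h:.4f} bits")
    print(f"per-letter redundancy:           {length / len(x) - h:.4f} bits")
```

On such a run the per-letter redundancy of the adaptive scheme is small but fixed by the estimator, whereas the point of the article is to achieve any preassigned redundancy for low-entropy sources.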

Key words: Markov source, redundancy, entropy of a source, memory size of the encoder and decoder, average coding and decoding time.

UDC: 519.722+519.723

Received: 04.06.1998


English version: Siberian Advances in Mathematics, 1999, 9:2, 72–82


