Abstract:
Let $\{X_i\}_{i=-\infty}^{\infty}$ be a sequence of random variables with $E(X_i)\equiv 0$. For $\nu\ge 1$, estimates for the $\nu$-th moment of $\max_{1\le k\le n}\bigl|\sum_{i=a+1}^{a+k}X_i\bigr|$ can be derived from known estimates for the $\nu$-th moment of $\bigl|\sum_{i=a+1}^{a+n}X_i\bigr|$. Here we generalize the Men'shov–Rademacher inequality, which holds for $\nu=2$ and orthonormal $X_i$, to the case $\nu\ge 1$ and dependent random variables. The Men'shov–Paley inequality ($\nu>2$ for orthonormal $X_i$) is likewise generalized to general random variables with $\nu>2$. A theorem is also proved that contains both the Erdös–Stechkin theorem and Serfling's theorem for dependent random variables with $\nu>2$.
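For orientation, the classical Men'shov–Rademacher bound referred to above (the case $\nu=2$) can be stated, for pairwise orthogonal $X_i$ (i.e., $E(X_iX_j)=0$ for $i\ne j$) and up to the exact logarithmic constant, which varies between sources, as
\[
E\Bigl(\max_{1\le k\le n}\Bigl|\sum_{i=1}^{k}X_i\Bigr|^{2}\Bigr)
\;\le\;\bigl(\log_2 2n\bigr)^{2}\sum_{i=1}^{n}E\bigl(X_i^{2}\bigr).
\]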