Abstract:
Let $\{X_n\}$ $(n=1,2,\dots)$ be a sequence of independent random variables having zero means and finite variances. Let us denote
\begin{gather*}
S_n=\sum_{j=1}^nX_j,\quad B_n=\sum_{j=1}^n\mathbf E(X_j^2),
\\
R_n=\sup_{-\infty<x<\infty}\biggl|\mathbf P(S_n<x\sqrt{B_n})-\frac1{\sqrt{2\pi}}\int_{-\infty}^xe^{-t^2/2}\,dt\biggr|.
\end{gather*}
The following theorem is proved.
Theorem 1. {\it Suppose that
\begin{gather*}
B_n\to\infty,\quad\frac{B_{n+1}}{B_n}\to1,
\\
R_n=O\biggl(\frac1{(\ln B_n)^{1+\delta}}\biggr)\quad\text{for some }\delta>0.
\end{gather*}
Then
$$
\mathbf P\biggl(\limsup_{n\to\infty}\frac{S_n}{(2B_n\ln\ln B_n)^{1/2}}=1\biggr)=1.
$$ }
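As an illustration not stated in the source, consider the i.i.d. special case: if the $X_j$ are identically distributed with $\mathbf E X_1=0$, $\mathbf E X_1^2=\sigma^2>0$, and $\mathbf E|X_1|^3<\infty$, then $B_n=n\sigma^2$, so $B_n\to\infty$ and $B_{n+1}/B_n\to1$ hold automatically, and the Berry--Esseen theorem gives $R_n=O(n^{-1/2})$. Since $\ln B_n=\ln n+\ln\sigma^2$, the rate hypothesis of Theorem 1 is satisfied:
$$
R_n=O\bigl(n^{-1/2}\bigr)=O\biggl(\frac1{(\ln B_n)^{1+\delta}}\biggr)\quad\text{for every }\delta>0,
$$
so in this case Theorem 1 recovers the Hartman--Wintner law of the iterated logarithm.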