Abstract:
We consider a class of queueing systems fed by an input consisting of a linear deterministic component and a random component described by a centered Gaussian process. The variance of the input is a function regularly varying at infinity with exponent $0<V<2$. We find conditions under which the maximum of the stationary workload (remaining work) over the time interval $[0,\,t]$ converges in the space $L_p$, as $t\rightarrow\infty$ and under an appropriate scaling, to an explicitly given constant $a$. The asymptotics of the workload maximum in the nonstationary regime are also given, as are the asymptotics of the hitting time of an increasing level $b$ by the workload process.
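As an illustrative sketch only (not part of the paper's results), the model can be simulated numerically: a standard example of a centered Gaussian input with regularly varying variance is fractional Brownian motion with Hurst parameter $H$, whose variance $t^{2H}$ has exponent $2H\in(0,2)$. Below, the workload is approximated by a discrete Lindley recursion and its running maximum over $[0,t]$ is tracked; all names and parameter values (`H`, `drift`, `n`, `dt`) are our own choices for the demonstration.

```python
import numpy as np

def fbm_increments(n, H, dt, rng):
    """Exact fBm increments on a regular grid via Cholesky factorization
    of the fBm covariance Cov(B_s, B_t) = 0.5*(s^{2H} + t^{2H} - |t-s|^{2H})."""
    t = dt * np.arange(1, n + 1)
    cov = 0.5 * (t[:, None] ** (2 * H) + t[None, :] ** (2 * H)
                 - np.abs(t[:, None] - t[None, :]) ** (2 * H))
    # Small diagonal jitter keeps the Cholesky factorization numerically stable.
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))
    path = L @ rng.standard_normal(n)
    return np.diff(np.concatenate(([0.0], path)))

def workload_path(increments, drift, dt):
    """Discrete Lindley recursion W_{k+1} = max(W_k + X_k - c*dt, 0),
    a standard approximation of the reflected (workload) process."""
    w = np.zeros(len(increments) + 1)
    for k, x in enumerate(increments):
        w[k + 1] = max(w[k] + x - drift * dt, 0.0)
    return w

rng = np.random.default_rng(0)
inc = fbm_increments(400, H=0.7, dt=0.05, rng=rng)  # variance exponent 2H = 1.4
w = workload_path(inc, drift=1.0, dt=0.05)          # net drift c - lambda = 1
running_max = np.maximum.accumulate(w)              # max of workload over [0, t]
```

The running maximum `running_max[k]` approximates the quantity whose scaled $L_p$ limit the paper studies; the first index where the workload exceeds a level $b$ approximates the hitting time.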