aBasic Research of Artificial Intelligence Laboratory (BRAIn Lab), Moscow, Russia
bMoscow Center for Advanced Studies, Moscow, Russia
cFederated Learning Problems Laboratory, Moscow, Russia
dInnopolis University
Abstract:
This paper investigates stochastic optimization problems under Markovian noise and the strong growth condition, a setting motivated by overparameterized machine learning models. We present an improved analysis of the Accelerated Gradient Descent algorithm of [1] in the strongly convex case, showing that in low-noise regimes the effect of Markovian noise is negligible. Furthermore, we derive the first lower bound that simultaneously depends on the Markov chain's mixing time and the problem's noise level, establishing the near-optimality of our results.