Dokl. RAN. Math. Inf. Proc. Upr., 2025 Volume 527, Pages 415–431 (Mi danma698)

SPECIAL ISSUE: ARTIFICIAL INTELLIGENCE AND MACHINE LEARNING TECHNOLOGIES

ExtraSAGA: a variance reduction hybrid method for variational inequalities

G. Chirkov (a,b), Yu. Kabikov (a,b), D. Medyakov (a,b,c), G. Molodtsov (a,b,c,d), A. Shestakov (a,b), A. Beznosikov (a,b,c,d)

(a) Basic Research of Artificial Intelligence Laboratory (BRAIn Lab), Moscow, Russia
(b) Moscow Center for Advanced Studies, Moscow, Russia
(c) Federated Learning Problems Laboratory, Moscow, Russia
(d) Innopolis University, Innopolis, Russia

Abstract: Variational inequalities (VIs) serve as a powerful tool for a wide range of optimization, machine learning (ML), and other problems. At the same time, large volumes of data are essential for high performance in ML tasks, which calls for stochastic approaches. However, the widely used SGD method suffers from a variance of the stochastic gradient that does not decrease over the course of the iterations. Variance reduction techniques were developed to resolve this issue; this approach is well studied for minimization problems but much less so for VIs. In this paper, we modify the SAGA method, known for its effectiveness in stochastic minimization, by integrating Extragradient to address VI problems. We provide a theoretical analysis of the proposed method and conduct experiments on bilinear problems and image denoising tasks.
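
The core idea outlined in the abstract is to replace the plain stochastic estimate of the VI operator inside an extragradient loop with a SAGA-type variance-reduced estimator. Below is a minimal illustrative sketch of such a combination in Python; the function name extra_saga, the step size gamma, the sampling scheme, and the table-update rule are assumptions made for illustration and may differ from the exact method and analysis in the paper.

```python
import numpy as np

def extra_saga(op, n, z0, gamma, iters, seed=0):
    """Illustrative SAGA + Extragradient loop for a variational inequality
    with a finite-sum operator F(z) = (1/n) * sum_i op(i, z).

    NOTE: a hedged sketch of the general idea, not the paper's exact
    pseudocode; the sampling and table updates may differ.
    """
    rng = np.random.default_rng(seed)
    z = z0.astype(float).copy()
    table = np.stack([op(i, z) for i in range(n)])  # stored per-sample operator values
    avg = table.mean(axis=0)                        # running average of the table
    for _ in range(iters):
        i = rng.integers(n)
        g = op(i, z) - table[i] + avg               # SAGA estimator at z
        z_half = z - gamma * g                      # extrapolation step
        j = rng.integers(n)
        g_half = op(j, z_half) - table[j] + avg     # SAGA estimator at z_half
        z = z - gamma * g_half                      # main extragradient step
        new_val = op(j, z)                          # refresh one table entry
        avg += (new_val - table[j]) / n
        table[j] = new_val
    return z

# Toy usage: bilinear saddle point min_x max_y x^T A y with A = (1/n) * sum_i A_i.
# Its VI operator is F(x, y) = (A y, -A^T x), split sample-wise over the A_i.
d, n = 5, 10
gen = np.random.default_rng(42)
A = [gen.standard_normal((d, d)) for _ in range(n)]

def op(i, z):
    x, y = z[:d], z[d:]
    return np.concatenate([A[i] @ y, -A[i].T @ x])

z = extra_saga(op, n, z0=np.ones(2 * d), gamma=0.02, iters=20000)
print(np.linalg.norm(z))  # should shrink toward 0, the unique solution here
```

The extrapolation step is what makes plain extragradient suitable for monotone operators such as the bilinear one above, while the table of stored operator values is what removes the non-vanishing variance of a naive stochastic estimate near the solution.
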

Keywords: variational inequalities, variance reduction, stochastic optimization, extragradient, SAGA.

UDC: 004.9

Received: 22.08.2025
Accepted: 29.09.2025

DOI: 10.7868/S2686954325070367
