
Beznosikov Aleksandr Nikolaevich

Publications in Math-Net.Ru

  1. Optimization with Markovian noise: towards optimal rates in strong growth case
     Dokl. RAN. Math. Inf. Proc. Upr., 527 (2025), 523–532
  2. Extrasaga: variance reduction hybrid method for variational inequalities
     Dokl. RAN. Math. Inf. Proc. Upr., 527 (2025), 415–431
  3. Sampling of semi-orthogonal matrices for the Muon algorithm
     Dokl. RAN. Math. Inf. Proc. Upr., 527 (2025), 217–228
  4. Communication-efficient solution of distributed variational inequalities using biased compression, data similarity and local updates
     Computer Research and Modeling, 16:7 (2024), 1813–1827
  5. Zero order algorithm for decentralised optimization problems
     Dokl. RAN. Math. Inf. Proc. Upr., 520:2 (2024), 295–312
  6. Local methods with adaptivity via scaling
     Uspekhi Mat. Nauk, 79:6(480) (2024), 117–158
  7. Local SGD for near-quadratic problems: Improving convergence under unconstrained noise conditions
     Uspekhi Mat. Nauk, 79:6(480) (2024), 83–116
  8. Accelerated Stochastic ExtraGradient: Mixing Hessian and gradient similarity to reduce communication in distributed and federated learning
     Uspekhi Mat. Nauk, 79:6(480) (2024), 5–38
  9. Effective method with compression for distributed and federated cocoercive variational inequalities
     Proceedings of ISP RAS, 36:5 (2024), 93–108
  10. On some works of Boris Teodorovich Polyak on the convergence of gradient methods and their development
      Zh. Vychisl. Mat. Mat. Fiz., 64:4 (2024), 587–626
  11. Optimal data splitting in distributed optimization for machine learning
      Dokl. RAN. Math. Inf. Proc. Upr., 514:2 (2023), 343–354
  12. Optimal analysis of method with batching for monotone stochastic finite-sum variational inequalities
      Dokl. RAN. Math. Inf. Proc. Upr., 514:2 (2023), 212–224
  13. Activations and gradients compression for model-parallel training
      Dokl. RAN. Math. Inf. Proc. Upr., 514:2 (2023), 126–137
  14. A unified analysis of variational inequality methods: variance reduction, sampling, quantization, and coordinate descent
      Zh. Vychisl. Mat. Mat. Fiz., 63:2 (2023), 189–217
  15. Linearly convergent gradient-free methods for minimization of parabolic approximation
      Computer Research and Modeling, 14:2 (2022), 239–255

  16. Defending against Byzantine attacks by trust-based weighting of agents
      Uspekhi Mat. Nauk, 80:6(486) (2025), 191–194