
Alkousa Mohammad Soud

Publications in Math-Net.Ru

  1. Intermediate gradient methods with relative inexactness

    J. Optim. Theory Appl., 207 (2025), 62, 42 pp.
  2. Accelerated Bregman gradient methods for relatively smooth and relatively Lipschitz continuous minimization problems

    Uspekhi Mat. Nauk, 80:6(486) (2025),  137–172
  3. Adaptive primal-dual methods with an inexact oracle for relatively smooth optimization problems and their applications to recovering low-rank matrices

    Zh. Vychisl. Mat. Mat. Fiz., 65:7 (2025),  1156–1177
  4. On some mirror descent methods for strongly convex programming problems with Lipschitz functional constraints

    Computer Research and Modeling, 16:7 (2024),  1727–1746
  5. Subgradient methods with B.T. Polyak-type step for quasiconvex minimization problems with inequality constraints and analogs of the sharp minimum

    Computer Research and Modeling, 16:1 (2024),  105–122
  6. On Quasi-Convex Smooth Optimization Problems by a Comparison Oracle

    Rus. J. Nonlin. Dyn., 20:5 (2024),  813–825
  7. Mirror Descent Methods with a Weighting Scheme for Outputs for Optimization Problems with Functional Constraints

    Rus. J. Nonlin. Dyn., 20:5 (2024),  727–745
  8. Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods

    Computer Research and Modeling, 15:2 (2023),  413–432
  9. Subgradient methods for weakly convex and relatively weakly convex problems with a sharp minimum

    Computer Research and Modeling, 15:2 (2023),  393–412
  10. Solving strongly convex-concave composite saddle-point problems with low dimension of one group of variables

    Mat. Sb., 214:3 (2023),  3–53
  11. Adaptive Subgradient Methods for Mathematical Programming Problems with Quasiconvex Functions

    Trudy Inst. Mat. i Mekh. UrO RAN, 29:3 (2023),  7–25
  12. Subgradient methods for non-smooth optimization problems with some relaxation of sharp minimum

    Computer Research and Modeling, 14:2 (2022),  473–495
  13. Adaptive first-order methods for relatively strongly convex optimization problems

    Computer Research and Modeling, 14:2 (2022),  445–472
  14. An approach for the nonconvex uniformly concave structured saddle point problem

    Computer Research and Modeling, 14:2 (2022),  225–237
  15. Numerical Methods for Some Classes of Variational Inequalities with Relatively Strongly Monotone Operators

    Mat. Zametki, 112:6 (2022),  879–894
  16. Solving convex min-min problems with smoothness and strong convexity in one group of variables and low dimension in the other

    Avtomat. i Telemekh., 2021, no. 10,  60–75
  17. Accelerated methods for saddle-point problem

    Zh. Vychisl. Mat. Mat. Fiz., 60:11 (2020),  1843–1866
  18. On some stochastic mirror descent methods for constrained online optimization problems

    Computer Research and Modeling, 11:2 (2019),  205–217
  19. Adaptive mirror descent algorithms for convex and strongly convex optimization problems with functional constraints

    Diskretn. Anal. Issled. Oper., 26:3 (2019),  88–114
  20. Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints

    Trudy Inst. Mat. i Mekh. UrO RAN, 24:2 (2018),  266–279


© Steklov Math. Inst. of RAS, 2026