Publications in Math-Net.Ru
-
Intermediate gradient methods with relative inexactness
J. Optim. Theory Appl., 207 (2025), 62–42
-
Accelerated Bregman gradient methods for relatively smooth and relatively Lipschitz continuous minimization problems
Uspekhi Mat. Nauk, 80:6(486) (2025), 137–172
-
Adaptive primal-dual methods with an inexact oracle for relatively smooth optimization problems and their applications to recovering low-rank matrices
Zh. Vychisl. Mat. Mat. Fiz., 65:7 (2025), 1156–1177
-
An adaptive variant of the Frank–Wolfe method for relatively smooth convex optimization problems
Zh. Vychisl. Mat. Mat. Fiz., 65:3 (2025), 364–375
-
Subgradient methods for weakly convex problems with a sharp minimum in the case of inexact information about the function or subgradient
Computer Research and Modeling, 16:7 (2024), 1765–1778
-
On some mirror descent methods for strongly convex programming problems with Lipschitz functional constraints
Computer Research and Modeling, 16:7 (2024), 1727–1746
-
Subgradient methods with B.T. Polyak-type step for quasiconvex minimization problems with inequality constraints and analogs of the sharp minimum
Computer Research and Modeling, 16:1 (2024), 105–122
-
On Quasi-Convex Smooth Optimization Problems by a Comparison Oracle
Rus. J. Nonlin. Dyn., 20:5 (2024), 813–825
-
Mirror Descent Methods with a Weighting Scheme for Outputs for Optimization Problems with Functional Constraints
Rus. J. Nonlin. Dyn., 20:5 (2024), 727–745
-
Highly smooth zeroth-order methods for solving optimization problems under the PL condition
Zh. Vychisl. Mat. Mat. Fiz., 64:4 (2024), 739–770
-
On some works of Boris Teodorovich Polyak on the convergence of gradient methods and their development
Zh. Vychisl. Mat. Mat. Fiz., 64:4 (2024), 587–626
-
Analogues of the relative strong convexity condition for relatively smooth problems and adaptive gradient-type methods
Computer Research and Modeling, 15:2 (2023), 413–432
-
Subgradient methods for weakly convex and relatively weakly convex problems with a sharp minimum
Computer Research and Modeling, 15:2 (2023), 393–412
-
Solving strongly convex-concave composite saddle-point problems with low dimension of one group of variables
Mat. Sb., 214:3 (2023), 3–53
-
Adaptive Subgradient Methods for Mathematical Programming Problems with Quasiconvex Functions
Trudy Inst. Mat. i Mekh. UrO RAN, 29:3 (2023), 7–25
-
Subgradient methods for non-smooth optimization problems with some relaxation of sharp minimum
Computer Research and Modeling, 14:2 (2022), 473–495
-
Adaptive first-order methods for relatively strongly convex optimization problems
Computer Research and Modeling, 14:2 (2022), 445–472
-
Numerical Methods for Some Classes of Variational Inequalities with Relatively Strongly Monotone Operators
Mat. Zametki, 112:6 (2022), 879–894
-
Adaptive gradient-type methods for optimization problems with relative error and sharp minimum
Trudy Inst. Mat. i Mekh. UrO RAN, 27:4 (2021), 175–188
-
Mirror descent for constrained optimization problems with large subgradient values of functional constraints
Computer Research and Modeling, 12:2 (2020), 301–317
-
On some algorithms for constrained optimization problems with relative accuracy with respect to the objective functional
Trudy Inst. Mat. i Mekh. UrO RAN, 26:3 (2020), 198–210
-
Accelerated methods for saddle-point problem
Zh. Vychisl. Mat. Mat. Fiz., 60:11 (2020), 1843–1866
-
One method for minimizing a convex Lipschitz-continuous function of two variables on a fixed square
Computer Research and Modeling, 11:3 (2019), 379–395
-
Adaptive mirror descent algorithms for convex and strongly convex optimization problems with functional constraints
Diskretn. Anal. Issled. Oper., 26:3 (2019), 88–114
-
Hahn–Banach type theorems on functional separation for convex ordered normed cones
Eurasian Math. J., 10:1 (2019), 59–79
-
An adaptive analog of Nesterov's method for variational inequalities with a strongly monotone operator
Sib. Zh. Vychisl. Mat., 22:2 (2019), 201–211
-
Adaptation to inexactness for some gradient-type optimization methods
Trudy Inst. Mat. i Mekh. UrO RAN, 25:4 (2019), 210–225
-
On the Adaptive Proximal Method for a Class of Variational Inequalities and Related Problems
Trudy Inst. Mat. i Mekh. UrO RAN, 25:2 (2019), 185–197
-
An adaptive proximal method for variational inequalities
Zh. Vychisl. Mat. Mat. Fiz., 59:5 (2019), 889–894
-
A Sublinear Analog of the Banach–Mazur Theorem in Separable Convex Cones with Norm
Mat. Zametki, 104:1 (2018), 118–130
-
On Sublinear Analogs of Weak Topologies in Normed Cones
Mat. Zametki, 103:5 (2018), 794–800
-
Adaptive mirror descent algorithms in convex programming problems with Lipschitz constraints
Trudy Inst. Mat. i Mekh. UrO RAN, 24:2 (2018), 266–279
-
On some problem of set-valued analysis in asymmetric normed spaces
Taurida Journal of Computer Science Theory and Mathematics, 2017, no. 1, 82–94
-
An analogue of the Hahn–Banach theorem for functionals on abstract convex cones
Eurasian Math. J., 7:3 (2016), 89–99
-
Analogs of the Schauder Theorem that Use Anticompacta
Mat. Zametki, 99:6 (2016), 950–953
-
Sequential analogues of the Lyapunov and Krein–Milman theorems in Fréchet spaces
CMFD, 57 (2015), 162–183
-
Applications of anticompact sets to analogs of Denjoy–Young–Saks and Lebesgue theorems
Eurasian Math. J., 6:1 (2015), 115–122
-
Anti-compacts and their applications to analogs of Lyapunov and Lebesgue theorems in Fréchet spaces
CMFD, 53 (2014), 155–176
-
The limiting form of the Radon–Nikodym property is true for all Fréchet spaces
CMFD, 37 (2010), 55–69
-
Compact subdifferentials: the formula of finite increments and related topics
CMFD, 34 (2009), 121–138