Abstract:
The paper deals with the general problem of convex stochastic optimization in a space of small dimension (e.g., 100 variables). For deterministic convex optimization problems in small dimensions, cutting-plane methods of center-of-gravity type (e.g., Vaidya's method) are known to achieve the best convergence rates. For stochastic optimization problems, the applicability of Vaidya's method reduces to the question of how the method accumulates inexactness in the subgradient. A recent result of the authors, showing that no accumulation of inexactness occurs over the iterations of Vaidya's method, makes it possible to propose an analogue of the method for solving stochastic optimization problems. The main technique is to replace the subgradient in Vaidya's method by its batched analogue, i.e., the arithmetic mean of stochastic subgradients. In the present paper this plan is carried out, yielding a method that is efficient for convex stochastic optimization in spaces of small dimension, provided that the batch computations can be parallelized. The performance of the algorithm is illustrated by a numerical experiment.
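To make the batching technique concrete, the following is a minimal sketch (not the authors' implementation) of the key step: the exact subgradient used inside a cutting-plane method such as Vaidya's is replaced by the arithmetic mean of a batch of stochastic subgradients, whose samples can be computed in parallel. The toy objective f(x, ξ) = ||x − ξ||₁ with ξ ~ N(μ, I), the batch size, and the function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_subgradient(x, xi):
    """Stochastic subgradient of the toy objective f(x, xi) = ||x - xi||_1;
    an unbiased estimate of a subgradient of E[f] for continuously
    distributed xi. Placeholder chosen for illustration only."""
    return np.sign(x - xi)

def batched_subgradient(x, batch_size=64, mu=1.0):
    """Arithmetic mean of `batch_size` i.i.d. stochastic subgradients.
    In the scheme described in the abstract, an average of this kind
    stands in for the exact subgradient at each iteration of Vaidya's
    method; the independent samples admit parallel computation."""
    xi = rng.normal(loc=mu, scale=1.0, size=(batch_size, x.size))
    return np.mean([stochastic_subgradient(x, s) for s in xi], axis=0)

# Example: the batched estimate at a query point would define the
# separating hyperplane (the cut) for the next cutting-plane step.
x = np.zeros(5)
g = batched_subgradient(x, batch_size=256)
# For this toy problem, g approximates the gradient of E[f],
# which is 2*Phi(x - mu) - 1 componentwise, i.e. about -0.68 at x = 0.
print(g)
```

A larger batch reduces the variance of the averaged estimate, which is what controls the inexactness of the subgradient fed to the cutting-plane method; the paper's point is that Vaidya's method does not amplify this inexactness across iterations.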