
Zh. Vychisl. Mat. Mat. Fiz., 2020 Volume 60, Number 7, Pages 1143–1150 (Mi zvmmf11101)

This article is cited in 3 papers

A heuristic adaptive fast gradient method in stochastic optimization problems

A. V. Ogal'tsov, A. I. Tyurin

State University – Higher School of Economics, Moscow, 101000 Russia

Abstract: A fast adaptive heuristic stochastic gradient descent method is proposed. It is shown that, on practical problems, this algorithm converges faster than currently popular optimization methods. A justification of the method is also given, and the difficulties that prevent obtaining optimal convergence estimates for the proposed algorithm are described.
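The abstract does not spell out the algorithm itself, but the combination it names (fast gradient steps, adaptivity, stochastic gradients) can be illustrated generically. The sketch below is NOT the authors' method: it combines Nesterov-style momentum with an AdaGrad-style per-coordinate adaptive step size, which is one common way to make a fast gradient scheme adaptive in the stochastic setting. All names and parameter values are illustrative assumptions.

```python
import numpy as np

def adaptive_fast_sgd(stoch_grad, x0, n_iter=500, eta=1.0, mu=0.9, eps=1e-8):
    """Illustrative sketch only (not the paper's algorithm):
    SGD with Nesterov look-ahead momentum and an AdaGrad-style
    per-coordinate adaptive step size."""
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)           # momentum buffer
    g2 = np.zeros_like(x)          # accumulated squared gradients
    for _ in range(n_iter):
        g = stoch_grad(x + mu * v)         # stochastic gradient at the look-ahead point
        g2 += g * g                        # adaptive scaling statistics
        step = eta / (np.sqrt(g2) + eps)   # per-coordinate step size
        v = mu * v - step * g
        x = x + v
    return x

# Usage: minimize a noisy quadratic f(x) = 0.5 * ||x - 1||^2
rng = np.random.default_rng(0)
grad = lambda x: (x - 1.0) + 0.01 * rng.standard_normal(x.shape)
x_star = adaptive_fast_sgd(grad, np.zeros(5))
```

The adaptive step removes the need to know the Lipschitz constant in advance, which is the usual motivation for such heuristics; proving optimal rates for this kind of combination is exactly the sort of difficulty the abstract alludes to.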

Key words: fast gradient descent, stochastic optimization, adaptive optimization.

UDC: 519.85

Received: 07.10.2019
Revised: 12.11.2019
Accepted: 10.03.2020

DOI: 10.31857/S004446692007008X


English version: Computational Mathematics and Mathematical Physics, 2020, 60:7, 1108–1115


