Abstract:
This paper proposes a new variant of the adaptive Frank–Wolfe algorithm for relatively smooth convex minimization problems. The key idea is to use a divergence other than half the squared Euclidean norm in the step-size adjustment formula. We prove convergence rate estimates for this algorithm on minimization problems with relatively smooth convex objectives satisfying the triangle scaling property, and we report computational experiments on the Poisson linear inverse problem and SVM models. We also identify the conditions under which the proposed algorithm shows a clear advantage over the adaptive Bregman proximal gradient method and its accelerated variants.