Abstract:
We propose accelerated methods for solving optimization problems under relative smoothness and relative Lipschitz continuity assumptions with an inexact oracle. We consider the problem of minimizing a convex, differentiable function that is relatively smooth with respect to a reference convex function. The first proposed method is based on the similar triangles method with an inexact oracle and relies on a special triangular scaling property of the underlying Bregman divergence. The other proposed methods are non-adaptive and adaptive (i.e., tuned to the relative smoothness parameter) accelerated Bregman proximal gradient methods with an inexact oracle. These methods are universal in the sense that they apply not only to relatively smooth but also to relatively Lipschitz continuous optimization problems. We also introduce an adaptive intermediate Bregman method that interpolates between slower but more robust non-accelerated algorithms and faster but less robust accelerated algorithms. We conclude the paper with the results of numerical experiments demonstrating the advantages of the proposed algorithms on the Poisson inverse problem.
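For orientation, the standard notions underlying the abstract can be stated as follows; this is the usual formulation from the relative smoothness literature, and the symbols $h$, $V$, and $L$ below are generic notation rather than necessarily the paper's own:

```latex
% Bregman divergence generated by a differentiable convex reference function h:
\[
  V(y, x) \;=\; h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]
% f is L-smooth relative to h if, for all x, y in the domain,
\[
  f(y) \;\le\; f(x) + \langle \nabla f(x),\, y - x \rangle + L\, V(y, x).
\]
```

Taking $h(x) = \tfrac{1}{2}\|x\|_2^2$ recovers the Euclidean Bregman divergence $V(y,x) = \tfrac{1}{2}\|y - x\|_2^2$ and hence the classical notion of $L$-smoothness.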