Abstract:
The possibility of accelerating an algorithm based on projection onto dominant singular subspaces for the problem of recovering a low-rank matrix from a small number of its entries is explored. The underlying idea is to replace the best-approximation step in the Frobenius norm with fast approximation algorithms. Two methods for computing such approximations are considered: (a) projection onto random subspaces and (b) the cross approximation method. Theorems on the geometric convergence of the algorithms with approximate projections are proved. Numerical experiments are described that demonstrate the efficiency of both variants compared with the exact projection.
Key words: low-rank matrices, matrix completion, singular value projection, cross approximation method, random subspaces.
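As an illustration of variant (a), the following minimal NumPy sketch shows a singular value projection iteration in which the exact truncated SVD is replaced by a randomized range finder. The function names, step size, oversampling parameter, and toy problem are illustrative assumptions, not the exact setup analyzed in the paper.

```python
import numpy as np

def randomized_rank_r(A, r, oversample=10):
    """Approximate rank-r approximation of A via a random range finder
    (sketch-based alternative to the exact truncated SVD)."""
    n = A.shape[1]
    Omega = np.random.randn(n, r + oversample)        # random test matrix
    Q, _ = np.linalg.qr(A @ Omega)                    # orthonormal basis of the sketch
    B = Q.T @ A                                       # small projected matrix
    U, s, Vt = np.linalg.svd(B, full_matrices=False)
    U = Q @ U[:, :r]
    return (U * s[:r]) @ Vt[:r, :]                    # rank-r approximation of A

def svp_completion(M_obs, mask, r, step=1.0, n_iter=200):
    """Singular value projection with an approximate (randomized) projection.
    M_obs holds the observed entries; mask is a boolean array of the same shape."""
    X = np.zeros_like(M_obs)
    for _ in range(n_iter):
        G = np.where(mask, X - M_obs, 0.0)            # residual on observed entries
        X = randomized_rank_r(X - step * G, r)        # approximate rank-r projection
    return X

# Toy example: recover a random rank-3 matrix from ~40% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((100, 3)) @ rng.standard_normal((3, 100))
mask = rng.random(M.shape) < 0.4
X = svp_completion(np.where(mask, M, 0.0), mask, r=3)
print(np.linalg.norm(X - M) / np.linalg.norm(M))      # relative recovery error
```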