Constrained optimization on manifolds @ SIAM MDS

Abstract

Many machine learning applications involve non-Euclidean data, such as graphs, strings, or matrices. In such cases, exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard nonlinear programming approaches. This observation has led to growing interest in Riemannian methods in the optimization and machine learning communities. In this talk, we study projection-free methods for constrained optimization on Riemannian manifolds; in particular, a class of Riemannian Frank-Wolfe (RFW) methods for geodesically convex tasks. At the heart of the algorithm lies a Riemannian “linear” oracle that determines the update subject to geodesically convex constraints. Solving this oracle efficiently is crucial to the competitiveness of the method. While the oracle is in general a nonconvex semidefinite problem, we discuss matrix-valued tasks for which its solution can be computed in closed form. Furthermore, we show that RFW converges at a non-asymptotic sublinear rate; under stronger assumptions, we even recover linear convergence. RFW extends naturally to stochastic and finite-sum settings, where we show that it converges at a sublinear rate even when the objective is nonconvex. Our non-asymptotic convergence rates recover the best-known rates of RFW’s Euclidean counterparts. Finally, we discuss a selection of benchmark tasks and compare against state-of-the-art methods, observing significant performance gains.
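
As a concrete illustration (not part of the talk itself), below is a minimal Python sketch of the generic RFW loop. The callables riem_grad, exp_map, log_map, and linear_oracle are hypothetical placeholders for the manifold’s Riemannian gradient, its exponential and logarithmic maps, and the “linear” oracle over the constraint set; the step size 2/(k+2) is the classical Frank-Wolfe choice.

    import numpy as np

    def riemannian_frank_wolfe(x0, riem_grad, exp_map, log_map, linear_oracle,
                               max_iter=100):
        # Sketch of Riemannian Frank-Wolfe under an assumed interface:
        #   riem_grad(x): Riemannian gradient of the objective at x
        #   exp_map(x, v), log_map(x, y): exponential/logarithmic maps
        #   linear_oracle(x, g): minimizer of <g, log_map(x, z)> over the
        #       geodesically convex constraint set
        x = x0
        for k in range(max_iter):
            g = riem_grad(x)
            z = linear_oracle(x, g)              # Riemannian "linear" oracle
            eta = 2.0 / (k + 2.0)                # classical Frank-Wolfe step size
            x = exp_map(x, eta * log_map(x, z))  # geodesic step toward z
        return x

    # Euclidean sanity check: minimize ||x - b||^2 over the unit ball.
    # Here exp/log reduce to vector addition/subtraction, the oracle has
    # the closed form z = -g / ||g||, and RFW reduces to classical FW.
    b = np.array([1.2, -0.9])
    x_star = riemannian_frank_wolfe(
        x0=np.zeros(2),
        riem_grad=lambda x: 2.0 * (x - b),
        exp_map=lambda x, v: x + v,
        log_map=lambda x, y: y - x,
        linear_oracle=lambda x, g: -g / (np.linalg.norm(g) + 1e-12),
        max_iter=200,
    )  # x_star approaches b / ||b|| = [0.8, -0.6]

In the Euclidean instantiation, the iterates converge at the familiar O(1/k) Frank-Wolfe rate, matching the sublinear rate mentioned in the abstract.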

Date
Sep 27, 2022
Location
San Diego, CA
Melanie Weber
Assistant Professor