IMA Conference on The Mathematical Challenges of Big Data

Exploiting geometric structure in (matrix-valued) optimization @ IMA Big Data

Abstract

Many applications involve non-Euclidean data, such as graphs, strings, or matrices. In such cases, exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard nonlinear programming approaches. This observation has led to growing interest in Riemannian methods in the optimization and machine learning communities. In this talk, we consider the problem of optimizing a convex function on a (Riemannian) manifold subject to convex constraints. Several classical problems that arise in machine learning applications can be phrased as constrained optimization tasks on manifolds. These include matrix-valued tasks, such as barycenter problems, as well as the computation of Brascamp-Lieb constants. The latter is of central importance in many areas of mathematics and computer science through connections to maximum likelihood estimators in Gaussian models, Tyler’s M-estimator of scatter matrices, and the computation of marginals in quantum information theory. We present algorithms for solving constrained optimization problems on manifolds and discuss applications to the two problems described above. Based on joint work with Suvrit Sra.
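The talk addresses the constrained setting; as a rough, unconstrained illustration of one of the matrix-valued tasks mentioned above (the barycenter, or Karcher mean, of symmetric positive definite matrices), the sketch below runs Riemannian gradient descent under the affine-invariant metric. This is a minimal NumPy sketch under stated assumptions, not code from the talk; the function names, step size, and initialization are illustrative choices.

```python
import numpy as np

def _sym_apply(M, fun):
    # Apply a scalar function to the eigenvalues of a symmetric matrix.
    w, V = np.linalg.eigh(M)
    return (V * fun(w)) @ V.T

def log_map(X, A):
    # Riemannian logarithm at X on the SPD manifold (affine-invariant metric):
    # Log_X(A) = X^{1/2} logm(X^{-1/2} A X^{-1/2}) X^{1/2}
    Xh = _sym_apply(X, np.sqrt)
    Xih = _sym_apply(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_apply(Xih @ A @ Xih, np.log) @ Xh

def exp_map(X, V):
    # Riemannian exponential at X:
    # Exp_X(V) = X^{1/2} expm(X^{-1/2} V X^{-1/2}) X^{1/2}
    Xh = _sym_apply(X, np.sqrt)
    Xih = _sym_apply(X, lambda w: 1.0 / np.sqrt(w))
    return Xh @ _sym_apply(Xih @ V @ Xih, np.exp) @ Xh

def spd_barycenter(As, steps=50, lr=1.0):
    # Riemannian gradient descent for the barycenter (Karcher mean) of SPD
    # matrices A_1, ..., A_n: the negative Riemannian gradient of the Frechet
    # variance at X is (1/n) * sum_i Log_X(A_i).
    X = np.mean(As, axis=0)  # initialize at the arithmetic mean (illustrative choice)
    for _ in range(steps):
        G = sum(log_map(X, A) for A in As) / len(As)
        X = exp_map(X, lr * G)  # step along the geodesic in the descent direction
    return X

# Example: barycenter of a few random 4x4 SPD matrices.
rng = np.random.default_rng(0)
Bs = rng.standard_normal((5, 4, 4))
As = Bs @ Bs.transpose(0, 2, 1) + 4 * np.eye(4)  # batch of random SPD matrices
print(spd_barycenter(As))
```

With step size 1 this reduces to the standard fixed-point iteration for the Karcher mean; the algorithms discussed in the talk additionally handle convex constraints, which this sketch does not.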

Date
Sep 19, 2022
Location
Oxford, UK
Melanie Weber
Assistant Professor