Many machine learning applications involve non-Euclidean data, such as graphs, strings, or matrices. In such cases, exploiting Riemannian geometry can deliver algorithms that are computationally superior to standard (Euclidean) approaches, which has led to growing interest in Riemannian methods in the machine learning community. In this talk, I will present two lines of work that apply Riemannian methods to machine learning. First, we consider the task of learning a robust classifier in hyperbolic space. Such spaces have received a surge of interest for representing large-scale, hierarchical data, because they achieve better representation accuracy with fewer dimensions. We consider an adversarial approach for learning a robust, large-margin classifier that is provably efficient, and we discuss conditions under which such hyperbolic methods are guaranteed to outperform their Euclidean counterparts. Second, we introduce Riemannian Frank-Wolfe (RFW) methods for constrained optimization on manifolds. Here, we discuss matrix-valued tasks on which RFW improves over classical Euclidean approaches, including the computation of Riemannian centroids and the synchronization of data matrices.
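To make the centroid task concrete: the Riemannian centroid (Karcher mean) of symmetric positive-definite matrices under the affine-invariant metric has no closed form in general and is typically computed iteratively. The sketch below is not the RFW algorithm from the talk; it is a standard fixed-point iteration for the same objective, included only to illustrate the kind of matrix-valued problem being solved (the function name `karcher_mean` and all parameters are illustrative).

```python
import numpy as np
from scipy.linalg import expm, logm, sqrtm, inv

def karcher_mean(mats, iters=100, tol=1e-10):
    """Riemannian centroid of SPD matrices under the affine-invariant
    metric, via a standard fixed-point (Riemannian gradient) iteration.
    Illustrative sketch; not the RFW method discussed in the talk."""
    X = sum(mats) / len(mats)          # initialize at the arithmetic mean
    for _ in range(iters):
        Xh = sqrtm(X)                  # X^{1/2}
        Xhi = inv(Xh)                  # X^{-1/2}
        # Average the log-maps of the data points at the current iterate
        T = sum(logm(Xhi @ A @ Xhi) for A in mats) / len(mats)
        X = Xh @ expm(T) @ Xh          # geodesic step back onto the manifold
        if np.linalg.norm(T) < tol:    # gradient norm small => converged
            break
    return np.real(X)
```

For commuting SPD matrices the Karcher mean reduces to the matrix geometric mean, e.g. the centroid of diag(1, 2) and diag(4, 8) is diag(2, 4), which gives a quick sanity check of the iteration.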