DeepMath

Learning a robust large-margin classifier in hyperbolic space

Abstract

Recently, there has been a surge of interest in representation learning in hyperbolic spaces, driven by their ability to represent hierarchical data with significantly fewer dimensions than standard Euclidean spaces. However, the viability and benefits of hyperbolic spaces for downstream machine learning tasks have received less attention. While hyperbolic equivalents of support vector machines, deep neural networks, and graph neural networks have been proposed, the performance of these algorithms in comparison with their Euclidean counterparts has, to the best of our knowledge, not been studied theoretically. Here, we present the first theoretical guarantees for learning a classifier in hyperbolic rather than Euclidean space. Specifically, we consider the problem of learning a large-margin classifier for data possessing a hierarchical structure. First, we introduce a hyperbolic perceptron algorithm, which provably converges to a separating hyperplane. We then provide an algorithm to efficiently learn a large-margin hyperplane, relying on the careful injection of adversarial examples. Finally, we prove that for hierarchical data that embeds well into hyperbolic space, the low embedding dimension ensures superior guarantees when learning the classifier directly in hyperbolic space.
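To make the perceptron step concrete, below is a minimal sketch in Python/NumPy of a perceptron-style update in the hyperboloid (Lorentz) model of hyperbolic space, where decision boundaries are level sets of the Minkowski inner product. The choice of model, the function names (minkowski_dot, hyperbolic_perceptron), and the toy data are illustrative assumptions for this sketch, not the exact algorithm presented in the talk.

import numpy as np

def minkowski_dot(u, v):
    # Minkowski (Lorentz) inner product: -u_0*v_0 + sum_{i>=1} u_i*v_i
    return -u[0] * v[0] + u[1:] @ v[1:]

def hyperbolic_perceptron(X, y, max_epochs=100):
    # X: points on the hyperboloid {x : <x, x>_L = -1, x_0 > 0}, shape (n, d+1)
    # y: labels in {-1, +1}
    # Classify by sign(<w, x>_L); on each mistake apply the classic additive
    # correction, and stop once an epoch passes with no mistakes.
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        mistakes = 0
        for xi, yi in zip(X, y):
            if yi * minkowski_dot(w, xi) <= 0:  # misclassified or on the boundary
                w += yi * xi
                mistakes += 1
        if mistakes == 0:  # separable data: a separating hyperplane was found
            break
    return w

# Toy usage: lift 2-D Euclidean points onto the hyperboloid via
# x_0 = sqrt(1 + ||x||^2), then label them by the sign of one coordinate.
pts = 0.5 * np.random.randn(40, 2)
X = np.column_stack([np.sqrt(1.0 + (pts ** 2).sum(axis=1)), pts])
y = np.where(pts[:, 0] >= 0, 1.0, -1.0)
w = hyperbolic_perceptron(X, y)

Note that a faithful hyperbolic implementation would also keep w in the set of valid (spacelike) normal vectors, i.e. those with <w, w>_L > 0; that projection step is omitted here for brevity.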

Date: Nov 5, 2020
Location: Virtual
Melanie Weber
Assistant Professor