Deep learning in hyperbolic space is quickly gaining traction in the fields of machine learning, multimedia, and computer vision. Deep networks commonly operate in Euclidean space, implicitly assuming that data lies on regular grids. Recent advances have shown that hyperbolic geometry provides a viable alternative foundation for deep learning, especially when data is hierarchical in nature and when working with few embedding dimensions. Currently, however, no accessible open-source library exists to build hyperbolic network modules akin to well-known deep learning libraries. We present HypLL, the Hyperbolic Learning Library, to bring the progress on hyperbolic deep learning together. HypLL is built on top of PyTorch, with an emphasis in its d...
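To make the geometric setting concrete, the Poincaré ball model mentioned across these works comes with a closed-form geodesic distance. The sketch below implements that standard formula (for curvature c = 1) in plain NumPy; it is a minimal illustration of the underlying geometry, not the API of HypLL or any other library named here.

```python
import numpy as np

def poincare_distance(x, y):
    """Geodesic distance between two points of the open unit Poincare ball
    (curvature c = 1), using the standard closed-form expression:
    d(x, y) = arccosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2))).
    """
    sq_dist = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

# Distances from the origin reduce to 2 * artanh(||x||), so points near the
# boundary are exponentially far apart -- the property that lets hyperbolic
# space embed trees and hierarchies with low distortion.
origin = np.zeros(2)
p = np.array([0.5, 0.0])
print(poincare_distance(origin, p))  # equals 2 * artanh(0.5) ~= 1.0986
```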
Representation learning over temporal networks has drawn considerable attention in recent years. Eff...
How to efficiently learn discriminative deep features is arguably one of the core problems in deep l...
We take a non-Euclidean view at three classical machine learning subjects: low-dimensional embedding...
We propose a new class of deep reinforcement learning (RL) algorithms that model latent representati...
Graph-structured data are widespread in real-world applications, such as social networks, recommende...
Due to its geometric properties, hyperbolic space can support high-fidelity embeddings of tree- and ...
This paper introduces an end-to-end residual network that operates entirely on the Poincar\'e ball m...
Hyperbolic space can naturally embed hierarchies, unlike Euclidean space. Hyperbolic Neural Networks...
Hyperbolic space can naturally embed hierarchies that often exist in real-world data and semantics. ...
Recently, hyperbolic spaces in the context of non-Euclidean deep learning have gained popularity bec...
Network science is driven by the question of which properties large real-world networks have and how we...
Recently, hyperbolic deep neural networks (HDNNs) have been gaining momentum as the deep r...
Graph neural networks generalize conventional neural networks to graph-structured data and have rece...
The choice of geometric space for knowledge graph (KG) embeddings can have significant effects on th...
We introduce Mercator, a reliable embedding method to map real complex networks into their hyperbo...