The MAtrix, TEnsor, and Deep-learning Optimized Routines (MATEDOR) project seeks to develop software technologies and standard APIs, along with a sustainable and portable library, for large-scale computations whose individual parts are very small matrix or tensor computations. The main target is the acceleration of applications from important fields that fit this profile, including deep learning, data mining, astrophysics, image and signal processing, hydrodynamics, and more.
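As a rough illustration of the workload profile this abstract describes (one large computation composed of many independent, very small dense problems), the following NumPy sketch batches thousands of tiny linear solves in a single call. The sizes, the random data, and the use of numpy.linalg are assumptions for illustration only; this is not MATEDOR's API.

```python
import numpy as np

# Workload profile described above: one large computation made of many
# independent, very small dense problems (here, 10,000 solves of size 8x8).
# Shapes and the use of NumPy's stacked linalg routines are illustrative only.
rng = np.random.default_rng(0)
batch, n = 10_000, 8

# Make each 8x8 system symmetric positive definite so it is well conditioned.
A = rng.standard_normal((batch, n, n))
A = A @ A.transpose(0, 2, 1) + n * np.eye(n)
b = rng.standard_normal((batch, n, 1))

# NumPy broadcasts linalg.solve over the leading batch dimension, factoring
# and solving all 10,000 tiny systems in one call -- the "batched" pattern
# that libraries targeting this profile implement on GPUs.
x = np.linalg.solve(A, b)
print(x.shape)  # (10000, 8, 1)
```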
Matrix factorizations have found two main applications in machine learning, namely for efficient dat...
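Where the entry above mentions matrix factorizations in machine learning, a minimal, generic example is a truncated SVD used as a low-rank compressor of a data matrix. The matrix shape and rank below are arbitrary assumptions and are not taken from the cited work.

```python
import numpy as np

# Truncated SVD as a low-rank compressor of a data matrix X.
# The matrix and rank are arbitrary choices for this sketch.
rng = np.random.default_rng(2)
X = rng.standard_normal((100, 40))

k = 5
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_k = U[:, :k] * s[:k] @ Vt[:k, :]  # best rank-k approximation in the Frobenius norm

# Storage drops from 100*40 values to k*(100 + 40 + 1).
err = np.linalg.norm(X - X_k) / np.linalg.norm(X)
print(f"rank-{k} relative error: {err:.3f}")
```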
To respond to the intense computational load of deep neural networks, a plethora of domain-specific ...
This report documents the program and the outcomes of Dagstuhl Seminar 22101 "Tensor Computations: A...
Tensors are higher-dimensional analogs of matrices, and represent a key data abstraction for many ap...
Tensors or multiway arrays are functions of three or more indices (i, j, k, ⋯), simila...
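To make the multiway-array view concrete, the sketch below builds a third-order tensor as a sum of rank-one outer products (the canonical polyadic form common in this literature). The sizes and rank are arbitrary assumptions, not values from the surveyed paper.

```python
import numpy as np

# A third-order tensor T[i, j, k] assembled as a sum of R rank-one outer
# products a_r ∘ b_r ∘ c_r (the canonical polyadic / CP form).
# Sizes and rank are arbitrary choices for the sketch.
I, J, K, R = 4, 5, 6, 3
rng = np.random.default_rng(1)
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)
print(T.shape)  # (4, 5, 6)
```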
Tensor algebra lives at the heart of big data applications. Where classical machine learnin...
The paper surveys the topic of tensor decompositions in modern machine learning applications. It foc...
Deep Learning (DL) has created a growing demand for simpler ways to develop complex models and effic...
We give an overview of recent developments in numerical optimization-based computation ...
Computationally intensive applications such as pattern recognition, and natural language processing, a...
Popular Machine Learning (ML) and High Performance Computing (HPC) workloads contribute to a signifi...
"The fundamental laws necessary for the mathematical treatment of large part of physics and the whol...
To respond to the need for efficient training and inference of deep neural networks, a plethora of d...