Sparse tensors arise in problems in science, engineering, machine learning, and data analytics. Programs that operate on such tensors can exploit sparsity to reduce both storage requirements and computational time. Developing and maintaining sparse software by hand, however, is a complex and error-prone task. Therefore, we propose treating sparsity as a property of tensors, not a tedious implementation task, and letting a sparse compiler generate sparse code automatically from a sparsity-agnostic definition of the computation. This paper discusses integrating this idea into MLIR.
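The core idea — sparsity as a property of the tensor rather than of the algorithm — can be illustrated outside MLIR with an analogous Python/SciPy sketch (an assumption for illustration, not the paper's compiler): the same sparsity-agnostic kernel definition runs against either a dense array or a compressed sparse storage format, and the format, not the code, carries the sparsity.

```python
import numpy as np
import scipy.sparse as sp

def matvec(A, x):
    # Sparsity-agnostic definition of the computation: the identical
    # expression is used whether A is stored densely or sparsely.
    return A @ x

dense_A = np.array([[0.0, 2.0, 0.0],
                    [0.0, 0.0, 3.0],
                    [1.0, 0.0, 0.0]])
x = np.array([1.0, 1.0, 1.0])

# Sparsity lives in the storage format: CSR keeps only the nonzeros.
sparse_A = sp.csr_matrix(dense_A)

y_dense = matvec(dense_A, x)
y_sparse = matvec(sparse_A, x)
assert np.allclose(y_dense, y_sparse)  # same result, different storage
```

In the MLIR setting described by the paper, the compiler plays the role that SciPy's hand-written sparse kernels play here: it generates the format-specific sparse code automatically from the dense-style definition.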