This thesis concerns the optimization and application of low-rank methods, with a special focus on tensor trains (TTs). In particular, we develop methods for computing TT approximations of a given tensor in a variety of low-rank formats, and we show how to solve the tensor completion problem for TTs using Riemannian methods. This is then applied to train a machine learning (ML) estimator based on discretized functions. We also study randomized methods for obtaining low-rank approximations of matrices and tensors. Finally, we consider how such randomized methods can be used to solve general linear matrix and tensor equations.
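To make the TT setting concrete, the following is a minimal sketch of the classical TT-SVD procedure (sequential truncated SVDs of tensor unfoldings), which compresses a full tensor into a list of TT cores. It is an illustration in NumPy under the simplifying assumption of a single uniform rank cap max_rank; it is not the specific algorithms developed in the thesis.

import numpy as np

def tt_svd(tensor, max_rank):
    # Decompose a full tensor into TT cores via successive truncated SVDs (TT-SVD).
    dims = tensor.shape
    d = len(dims)
    cores = []
    rank_prev = 1
    unfolding = tensor.reshape(rank_prev * dims[0], -1)
    for k in range(d - 1):
        U, s, Vt = np.linalg.svd(unfolding, full_matrices=False)
        rank_next = min(max_rank, len(s))          # truncate to the rank cap
        U, s, Vt = U[:, :rank_next], s[:rank_next], Vt[:rank_next, :]
        cores.append(U.reshape(rank_prev, dims[k], rank_next))
        # Carry the remaining factor to the next unfolding.
        unfolding = (np.diag(s) @ Vt).reshape(rank_next * dims[k + 1], -1)
        rank_prev = rank_next
    cores.append(unfolding.reshape(rank_prev, dims[-1], 1))
    return cores

Contracting the returned cores in order reproduces an approximation of the input tensor whose TT ranks are bounded by max_rank; the randomized and Riemannian methods summarized above address, respectively, accelerating such compressions and recovering a TT from partially observed entries.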