Abstract. Most of the existing analysis methods for tensors (or multi-way arrays) only assume that tensors to be completed are of low rank. However, for example, when they are applied to tensor completion problems, their prediction accuracy tends to be significantly worse when only limited entries are observed. In this paper, we propose to use relationships among data as auxiliary information in addition to the low-rank assumption to improve the quality of tensor decomposition. We introduce two regularization approaches using graph Laplacians induced from the relationships, and design iterative algorithms for approximate solutions. Numerical experiments on tensor completion using synthetic and benchmark datasets show that the use of auxi...
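The regularization idea in this abstract can be illustrated with a minimal sketch: complete a 3-way tensor by fitting CP factors to the observed entries while penalizing each factor with tr(U^T L U), where L is a graph Laplacian over that mode. This is not the paper's algorithm — the gradient-descent scheme, function names, and parameter choices below are illustrative assumptions.

```python
import numpy as np

def laplacian(adj):
    """Graph Laplacian L = D - A from a symmetric adjacency matrix A."""
    return np.diag(adj.sum(axis=1)) - adj

def cp_reconstruct(U1, U2, U3):
    """Rank-r CP reconstruction: X[i,j,k] = sum_r U1[i,r] * U2[j,r] * U3[k,r]."""
    return np.einsum('ir,jr,kr->ijk', U1, U2, U3)

def graph_reg_cp_complete(T, mask, Ls, rank=3, lam=0.1, lr=0.01, iters=2000, seed=0):
    """Sketch of graph-Laplacian-regularized CP completion by gradient descent.
    Minimizes 0.5 * ||mask * (X - T)||_F^2 + lam * sum_k tr(Uk^T Lk Uk),
    where X is the CP reconstruction; the factor 2 from the regularizer's
    gradient (2 * Lk @ Uk for symmetric Lk) is absorbed into lam here."""
    rng = np.random.default_rng(seed)
    U = [rng.standard_normal((n, rank)) * 0.1 for n in T.shape]
    for _ in range(iters):
        R = mask * (cp_reconstruct(*U) - T)  # residual on observed entries only
        G1 = np.einsum('ijk,jr,kr->ir', R, U[1], U[2]) + lam * Ls[0] @ U[0]
        G2 = np.einsum('ijk,ir,kr->jr', R, U[0], U[2]) + lam * Ls[1] @ U[1]
        G3 = np.einsum('ijk,ir,jr->kr', R, U[0], U[1]) + lam * Ls[2] @ U[2]
        U[0] -= lr * G1
        U[1] -= lr * G2
        U[2] -= lr * G3
    return cp_reconstruct(*U)
```

The Laplacian term encourages rows of each factor that are adjacent in the given graph to be similar, which is one way auxiliary relationships among data can compensate for sparsely observed entries.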
The goal of tensor completion is to recover a tensor from a subset of its entries, often by exploiti...
Matrix factorizations have found two main applications in machine learning, namely for efficient dat...
Abstract. While tensor factorizations have become increasingly popular for learning on various form...
Statistical learning for tensors has gained increasing attention over the recent years. We will pres...
Matrix and tensor completion arise in many different real-world applications related to the inferenc...
factors capturing the tensor’s rank is proposed in this paper, as the key enabler for completion of ...
Abstract—We consider factoring low-rank tensors in the presence of outlying slabs. This problem is ...
Low-rank tensor decomposition and completion have attracted significant interest from academia given...
In this thesis, we consider optimization problems that involve statistically estimating signals from...
Tensor completion is a fundamental tool to estimate unknown information from observed data, which is...
In tensor completion tasks, the traditional low-rank tensor decomposition models suffer from the lab...
How to handle large multi-dimensional datasets such as hyperspectral images and video information bo...
In this paper, we exploit the advantages of tensorial representations and propose several tensor lea...
Abstract—We present a new method for online prediction and learning of tensors (N-way arrays, N > 2...
One of the popular approaches for low-rank tensor completion is to use the latent trace norm regular...