Low-rank tensor recovery has been widely applied to computer vision and machine learning. Recently, tubal nuclear norm (TNN) based optimization was proposed with superior performance compared to other tensor nuclear norms. However, one major limitation is its orientation sensitivity: low-rankness is strictly defined along the tubal orientation, so it cannot simultaneously model spectral low-rankness in multiple orientations. To this end, we introduce two new tensor norms, called OITNN-O and OITNN-L, to exploit multi-orientational spectral low-rankness for arbitrary K-way (K ≥ 3) tensors. We further formulate two robust tensor decomposition models via the proposed norms and develop two algorithms as the solutions. Theoretically, we establi...
A framework for reliable separation of a low-rank subspace from grossly corrupted multi-dimensional ...
Abstract Tensor singular value decomposition (t‐SVD) provides a novel way to decompose a tensor. It ...
Nonconvex regularization has been popularly used in low-rank matrix learning. However, extending it ...
© 2019 IEEE. This work studies the low-rank tensor completion problem, which aims to exactly recover...
The tensor nuclear norm (TNN), defined as the sum of nuclear norms of frontal slices of the tensor i...
Many tasks in computer vision suffer from missing values in tensor data, i.e., multi-way data array....
The goal of tensor completion is to recover a tensor from a subset of its entries, often by exploiti...
Low-Rank Tensor Recovery (LRTR), the higher order generalization of Low-Rank Matrix Recovery (LRMR),...
In recent years, tensor completion problem has received a significant amount of attention in compute...
Higher-order tensors are becoming prevalent in many scientific areas such as computer vision, social...
In this paper, we exploit the advantages of tensorial representations and propose several tensor lea...
Inspired by the robustness and efficiency of the capped nuclear norm, in this paper, we apply it to ...