At present, Symmetric Positive Definite (SPD) matrix data are the most common non-Euclidean data in machine learning. Because SPD data do not form a linear space, most machine learning algorithms cannot be applied to them directly. The first purpose of this paper is to propose a new framework for machine learning on SPD data, in which SPD data are transformed into the tangent spaces of a Riemannian manifold, rather than into a Reproducing Kernel Hilbert Space (RKHS) as usual. Domain adaptation learning is a kind of machine learning. The second purpose of this paper is to apply the proposed framework to domain adaptation learning (DAL), in which a bi-subspace learning architecture is adopted. Compared with the commonly used one-subspace ...
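The tangent-space transformation mentioned in this abstract can be sketched with the standard Riemannian logarithm at the identity (the Log-Euclidean map), which sends an SPD matrix to a symmetric matrix via the matrix logarithm. This is a minimal illustration of the general idea under that assumption, not the paper's exact construction; the function name `log_map_spd` is hypothetical.

```python
import numpy as np

def log_map_spd(P):
    """Map an SPD matrix P to the tangent space at the identity via the
    matrix logarithm, computed through an eigendecomposition.
    The result is a symmetric matrix, i.e. an element of a linear space
    on which ordinary (Euclidean) learning algorithms can operate."""
    w, V = np.linalg.eigh(P)          # eigenvalues w > 0 for SPD input
    return V @ np.diag(np.log(w)) @ V.T

# Example: a 2x2 SPD (covariance-like) matrix mapped to the tangent space
P = np.array([[2.0, 0.5],
              [0.5, 1.0]])
T = log_map_spd(P)                    # symmetric tangent-space representation
```

The inverse map is the matrix exponential, so `expm(log_map_spd(P))` recovers `P`; this invertibility is what lets tangent-space representations stand in for the original SPD matrices.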
In this paper, we propose a kernel for nonlinear dimensionality reduction over the manifold of Symme...
Unsupervised domain adaptation is effective in leveraging the rich information from the source domai...
The regularization principles [31] lead to approximation schemes to deal with various learning problems...
Symmetric positive definite (SPD) data have become a hot topic in machine learning. Instead of a lin...
Domain adaptation learning is one of the fundamental research topics in pattern recognition and mach...
Covariance matrices, known as symmetric positive definite (SPD) matrices, are usually regarded as po...
The symmetric positive definite (SPD) matrices have been widely used in image and vision problems. R...
In this paper, we introduce a new domain adaptation (DA) algorithm where the source and target domai...
The manifold of Symmetric Positive Definite (SPD) matrices has been successfully used for data repre...
The domain adaptation (DA) problem on symmetric positive definite (SPD) manifolds has raised interes...
Symmetric Positive Definite (SPD) matrix learning methods have become popular in many image and vide...
Recent advances suggest that a wide range of computer vision problems can be addressed more appropri...