There has been a surge of interest in learning non-linear manifold models to approximate high-dimensional data. Both for computational complexity reasons and for generalization capability, sparsity is a desired feature in such models. This usually means dimensionality reduction, which naturally implies estimating the intrinsic dimension, but it can also mean selecting a subset of the data to use as landmarks, which is especially important because many existing algorithms have quadratic complexity in the number of observations. This paper presents an algorithm for selecting landmarks, based on LASSO regression, which is well known to favor sparse approximations because it uses regularization with an l1 norm. As an added benefit, a contin...
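The abstract above describes landmark selection via LASSO, i.e. l1-regularized least squares, whose penalty drives many coefficients exactly to zero. The following is only an illustrative sketch of that general idea, not the paper's actual algorithm: each data point is reconstructed from the remaining points under an l1 penalty (solved here with a simple ISTA iteration), and points that receive nonzero weight are kept as landmarks. All function names, the solver choice, and the usage-based selection rule are assumptions made for this example.

```python
import numpy as np

def soft_threshold(z, t):
    # Elementwise soft-thresholding, the proximal operator of the l1 norm.
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Solve min_w 0.5*||X w - y||^2 + lam*||w||_1 by ISTA
    (gradient step on the quadratic term, then soft-thresholding)."""
    _, d = X.shape
    w = np.zeros(d)
    # Lipschitz constant of the gradient: squared spectral norm of X.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)
        w = soft_threshold(w - grad / L, lam / L)
    return w

def select_landmarks(X, lam=0.1, n_iter=500):
    """Reconstruct each row of X from the other rows with an l1 penalty;
    rows that are used (nonzero weight) in some reconstruction become landmarks."""
    n = X.shape[0]
    usage = np.zeros(n)
    for i in range(n):
        others = np.delete(np.arange(n), i)
        # Dictionary columns are the other data points.
        w = lasso_ista(X[others].T, X[i], lam, n_iter)
        usage[others] += np.abs(w)
    return np.flatnonzero(usage > 1e-8)
```

Raising `lam` shrinks more weights to zero and therefore selects fewer landmarks; in practice the penalty would be tuned, e.g. by cross-validation on reconstruction error.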
High-dimensional data sets are often analyzed and explored via the construction of a latent low-dime...
In many problem settings, parameter vectors are not merely sparse, but dependent in such a way that...
We discuss an application of sparsity to manifold learning. We show that the activation patterns of ...
High computational costs of manifold learning prohibit its application for large datasets. A common ...
In estimating the head pose angles in 3D space by manifold learning, the results currently are not v...
The LASSO sparse regression method has recently received attention in a variety of applications from...
The fact that image data samples lie on a manifold has been successfully exploited in many learning ...
It is difficult to find the optimal sparse solution of a manifold learning based dimensionality redu...
The fact that image data samples lie on a manifold has been successfully exploited in many learning ...
Regression with L1-regularization, Lasso, is a popular algorithm for recovering the sparsity pattern...
As an important pre-processing stage in many machine learning and pattern recognition domains, featu...
With the increasing availability of high dimensional data and demand in sophisticated data analysis ...
The Lasso is an attractive technique for regularization and variable selection for high-dimensional ...
In system identification, the Akaike Information Criterion (AIC) is a well known method to balance t...
We study the problem of discovering a manifold that best preserves information relevant to a nonline...