High-dimensional data sets are often analyzed and explored via the construction of a latent low-dimensional space, which enables convenient visualization and efficient predictive modeling or clustering. For complex data structures, linear dimensionality reduction techniques like PCA may not be sufficiently flexible to enable low-dimensional representation. Non-linear dimension reduction techniques, like kernel PCA and autoencoders, suffer from a loss of interpretability, since each latent variable depends on all input dimensions. To address this limitation, we here present path lasso penalized autoencoders. This structured regularization enhances interpretability by penalizing each path through the encoder from an input to a latent variable...
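The path-penalty idea can be illustrated with a small sketch. The snippet below is a minimal, hypothetical PyTorch example in which the strength of the path from input i to latent variable k in a two-layer encoder is approximated by summing, over hidden units, the products of absolute weights along the path, and an L1-type penalty is applied to these strengths. The surrogate penalty, class name, and hyperparameters are illustrative assumptions, not the exact formulation of the path lasso paper.

```python
import torch
import torch.nn as nn

class PathPenalizedAE(nn.Module):
    """Toy autoencoder with a path-style sparsity penalty (illustrative sketch only)."""
    def __init__(self, d_in, d_hidden, d_latent):
        super().__init__()
        self.enc1 = nn.Linear(d_in, d_hidden)
        self.enc2 = nn.Linear(d_hidden, d_latent)
        self.dec = nn.Sequential(
            nn.Linear(d_latent, d_hidden), nn.ReLU(),
            nn.Linear(d_hidden, d_in),
        )

    def forward(self, x):
        z = self.enc2(torch.relu(self.enc1(x)))
        return self.dec(z), z

    def path_penalty(self):
        # Assumed surrogate: P[k, i] = sum_h |W2[k, h]| * |W1[h, i]|,
        # i.e. the total (absolute) weight of all paths from input i to latent k.
        W1 = self.enc1.weight.abs()   # shape (d_hidden, d_in)
        W2 = self.enc2.weight.abs()   # shape (d_latent, d_hidden)
        P = W2 @ W1                   # shape (d_latent, d_in)
        return P.sum()                # L1-type penalty over all input-to-latent paths

# Usage sketch: reconstruction loss plus the path penalty.
model = PathPenalizedAE(d_in=50, d_hidden=32, d_latent=2)
x = torch.randn(128, 50)
recon, z = model(x)
loss = nn.functional.mse_loss(recon, x) + 1e-3 * model.path_penalty()
loss.backward()
```

Driving a whole input-to-latent path toward zero (rather than individual weights) is what makes each latent variable depend on only a subset of the inputs, which is the interpretability gain described above.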
MOTIVATION: In some prediction analyses, predictors have a natural grouping structure and selecting ...
Various dimensionality reduction (DR) schemes have been developed for projecting high-dimensional da...
We study the problem of signal estimation from non-linear observations when the signal bel...
Since more and more data is collected every day, it becomes increasingly expensive to process. To r...
Dimensionality reduction is defined as the search for a low-dimensional space that captures the “ess...
The Lasso is an attractive technique for regularization and variable selection for high-dimensional ...
Generative dimensionality reduction methods play an important role in machine learning applications ...
There has been a surge of interest in learning non-linear manifold models to approximate high-dimens...
Recently the problem of dimensionality reduction (or, subspace learning) has received a lot of inter...
The autoencoder algorithm and its deep version as traditional dimensionality reduction methods have...
The typical scenario that arises in most “big data” problems is one where the ambient dimension of t...
In this article, we propose a method called sequential Lasso (SLasso) for feature selection in spars...
Dimensionality reduction is an important task in machine learning. It arises when the...
We present in this paper a novel approach for training deterministic auto-encoders. We show that by ...
We develop a Smooth Lasso for sparse, high dimensional, contingency tables and compare ...