Regression from high dimensional observation vectors is particularly difficult when training data is limited. More specifically, if the number of sample vectors n is less than the dimension of the sample vectors p, then accurate regression is difficult to perform without prior knowledge of the data covariance. In this paper, we propose a novel approach to high dimensional regression for application when n < p. The approach works by first decorrelating the high dimensional observation vector using the sparse matrix transform (SMT) estimate of the data covariance. Then the decorrelated observations are used in a regularized regression procedure such as Lasso or shrinkage. Numerical results demonstrate that the proposed regression approach c...
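The two-stage procedure described in this abstract (decorrelate, then regress with regularization) can be sketched as follows. This is a minimal illustration, not the paper's method: the SMT covariance estimator is replaced here by a simple shrinkage-toward-identity estimate, and the regression step uses ridge-style shrinkage rather than Lasso; the data, parameter values, and variable names are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 100  # n < p: fewer samples than dimensions

# Synthetic correlated data and a sparse linear response (illustrative only).
A = rng.standard_normal((p, p))
cov_true = A @ A.T / p + np.eye(p)
X = rng.multivariate_normal(np.zeros(p), cov_true, size=n)
beta_true = np.zeros(p)
beta_true[:5] = 1.0
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Step 1: regularized covariance estimate. A shrinkage-toward-identity
# estimate stands in for the paper's SMT estimate, which is not shown here.
S = np.cov(X, rowvar=False)
alpha = 0.5
S_hat = (1 - alpha) * S + alpha * (np.trace(S) / p) * np.eye(p)

# Step 2: decorrelate (whiten) the observations using the eigendecomposition
# of the covariance estimate.
w, V = np.linalg.eigh(S_hat)      # S_hat is positive definite after shrinkage
X_dec = (X @ V) / np.sqrt(w)      # decorrelated, unit-variance coordinates

# Step 3: regularized regression in the decorrelated domain
# (ridge-style shrinkage; Lasso could be substituted here).
lam = 1.0
b_dec = np.linalg.solve(X_dec.T @ X_dec + lam * np.eye(p), X_dec.T @ y)
beta_hat = (V / np.sqrt(w)) @ b_dec  # map coefficients back to original coordinates
```

The key point the sketch shows is that whitening with an estimated covariance makes the subsequent p-dimensional regularized regression better conditioned when n < p; the quality of the result hinges on the covariance estimator, which is where the SMT enters in the paper.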
Recently the problem of dimensionality reduction (or, subspace learning) has received a lot of inter...
For multiple index models, it has recently been shown that the sliced inverse regression (SIR) is co...
In high dimensional statistics, estimation and inference are often done by making use of the underly...
Due to the increasing availability of data sets with a large number of variables, sparse model estim...
Many problems in statistical pattern recognition and analysis require the classification and analysis...
In this work, a set of new tools is developed for modeling and processing of high dimensional signal...
High-dimensional datasets, where the number of measured variables is larger than the sample size, ar...
Recent years have seen active developments of various penalized regression methods, such as ...
The public defense on 14th May 2020 at 16:00 (4 p.m.) will be organized via remote technology. Li...
In this paper, we consider the Group Lasso estimator of the covariance matrix of a stochastic proces...
We propose covariance-regularized regression, a family of methods for prediction in high dimensional...
High-dimensional matrix data are common in modern data analysis. Simply applying Lasso after vectori...
Covariance matrix estimation plays an important role in statistical analysis in many fields, includi...
In high-dimensional regression problems, a key aim is to identify a sparse model that fits the data...
We review recent results for high-dimensional sparse linear regression in the practical case of unkn...