We propose a fast algorithm for ridge regression when the number of features is much larger than the number of observations (p ≫ n). The standard way to solve ridge regression in this setting works in the dual space and gives a running time of O(n²p). Our algorithm, Subsampled Randomized Hadamard Transform – Dual Ridge Regression (SRHT-DRR), runs in time O(np log(n)) and works by preconditioning the design matrix with a Randomized Walsh-Hadamard Transform followed by subsampling of features. We provide risk bounds for our SRHT-DRR algorithm in the fixed design setting and show experimental results on synthetic and real datasets.
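The preconditioning step described in the abstract above can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it applies the SRHT to the feature dimension of a design matrix by (1) flipping feature signs at random, (2) applying a fast Walsh-Hadamard transform, and (3) uniformly subsampling k of the p transformed features with rescaling. The function names `fwht` and `srht` and the requirement that p be a power of two are assumptions of this sketch.

```python
import numpy as np

def fwht(a):
    """Fast Walsh-Hadamard transform along the last axis (orthonormal
    scaling). The length of the last axis must be a power of two."""
    a = a.astype(float).copy()
    n = a.shape[-1]
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            x = a[..., i:i + h].copy()
            y = a[..., i + h:i + 2 * h].copy()
            a[..., i:i + h] = x + y          # butterfly: sum
            a[..., i + h:i + 2 * h] = x - y  # butterfly: difference
        h *= 2
    return a / np.sqrt(n)

def srht(X, k, seed=0):
    """Reduce X (n x p) to (n x k) features via a Subsampled Randomized
    Hadamard Transform; p must be a power of two. Hypothetical sketch."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    signs = rng.choice([-1.0, 1.0], size=p)      # D: random sign flips
    Xh = fwht(X * signs)                         # H: Hadamard mixing
    idx = rng.choice(p, size=k, replace=False)   # S: uniform subsampling
    return np.sqrt(p / k) * Xh[:, idx]
```

After this reduction, ridge regression can be fit on the n × k sketched matrix; the transform mixes information across all p features so that uniform subsampling loses little, which is the source of the algorithm's speedup.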
Ridge regression is a classical statistical technique that attempts to address the bias-variance tra...
This paper develops two orthogonal contributions to scalable sparse regression for competing risks t...
In several supervised learning applications, it happens that reconstruction methods have to be appli...
Firstly, we would like to state some lemmas and give some properties of Subsampled Randomized Hadama...
We address the problem of fast estimation of ordinary least squares (OLS) from large amounts of data...
We propose a new two stage algorithm LING for large scale regression problems. LING has the same ris...
Subsampled Randomized Hadamard Transform (SRHT), a popular random projection method that can efficie...
In this paper we propose mathematical optimizations to select the optimal regularization parameter f...
We analyze the prediction error of ridge regression in an asymptotic regime where the sample size ...
We propose LOCO, a distributed algorithm which solves large-scale ridge regression. LOCO randomly a...
One approach to improving the running time of kernel-based machine learning methods is to build a sm...
This paper presents an improved analysis of a structured dimension-reduction map called the subsampl...