We present a novel binary convex reformulation of the sparse regression problem that constitutes a new duality perspective. We devise a new cutting plane method and provide evidence that it can solve the sparse regression problem to provable optimality for sample sizes n and numbers of regressors p in the 100,000s, that is, two orders of magnitude beyond the current state of the art, in seconds. The ability to solve the problem for very high dimensions allows us to observe new phase transition phenomena. Contrary to traditional complexity theory, which suggests that the difficulty of a problem increases with problem size, the sparse regression problem has the property that as the number of samples n increases, the problem becomes easier i...
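The cutting plane approach described in this abstract can be illustrated concretely. The sketch below is a reconstruction under stated assumptions, not the authors' implementation: it assumes the binary convex reformulation is the ridge-regularized best-subset objective c(s) = 0.5 * y' (I_n + gamma * sum_j s_j x_j x_j')^{-1} y over binary support indicators s with sum_j s_j = k, and it solves the master problem by brute-force enumeration rather than a mixed-integer solver, so it only conveys the structure of the outer-approximation loop. The names gamma, k, and the helper functions are illustrative choices, not taken from the abstract.

```python
import itertools
import numpy as np

def c_and_grad(s, X, y, gamma):
    """Convex objective c(s) = 0.5 * y' (I + gamma * sum_j s_j x_j x_j')^{-1} y
    and its gradient with respect to the binary support indicators s."""
    n = X.shape[0]
    K = np.eye(n) + gamma * (X * s) @ X.T       # I_n + gamma * sum_j s_j x_j x_j'
    alpha = np.linalg.solve(K, y)               # K^{-1} y
    value = 0.5 * y @ alpha
    grad = -0.5 * gamma * (X.T @ alpha) ** 2    # partial derivative w.r.t. each s_j
    return value, grad

def cutting_plane_best_subset(X, y, k, gamma=1.0, max_iters=50, tol=1e-8):
    """Outer approximation over cardinality-k supports. The master problem is
    solved here by brute-force enumeration, so this is only viable for small p;
    at scale it would be a mixed-integer solver with lazy constraint cuts."""
    p = X.shape[1]
    candidates = [np.isin(np.arange(p), combo).astype(float)
                  for combo in itertools.combinations(range(p), k)]
    cuts, s = [], candidates[0]
    best_val, best_s = np.inf, s
    for _ in range(max_iters):
        value, grad = c_and_grad(s, X, y, gamma)
        if value < best_val:
            best_val, best_s = value, s
        cuts.append((value, grad, s))
        # Piecewise-linear lower bound built from all cuts accumulated so far.
        lb = lambda cand: max(v + g @ (cand - s0) for v, g, s0 in cuts)
        s = min(candidates, key=lb)
        if lb(s) >= best_val - tol:             # bound meets incumbent: stop
            break
    return best_s, best_val

# Tiny synthetic usage example.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 8))
beta = np.zeros(8)
beta[[1, 4]] = [2.0, -3.0]
y = X @ beta + 0.1 * rng.standard_normal(60)
support, obj = cutting_plane_best_subset(X, y, k=2)
print("selected features:", np.flatnonzero(support))
```

The design point the sketch tries to convey is that the objective is convex in the binary indicators, so each evaluation yields a valid linear cut and the master problem only ever sees linear constraints; the scalability reported in the abstract comes from delegating that master problem to an integer-programming solver rather than enumeration.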
We present a method for simultaneously performing bandwidth selection and variable selection in nonp...
Forward stagewise regression follows a very simple strategy for constructing a sequence of sparse re...
We investigate implicit regularization schemes for gradient descent methods applied to unpenalized l...
This paper proposes a new algorithm for multiple sparse regression in high dimensions, where the tas...
Due to the increasing availability of data sets with a large number of variables, sparse model estim...
We formulate the sparse classification problem of n samples with p features as a binary convex optim...
Recovery of an N-dimensional, K-sparse solution x from an M-dimensional vector of measurements y for...
In this paper, we address the challenging problem of selecting tuning parameters for high-dimensio...
In sparse signal recovery of compressive sensing, the phase transition determines the edge, which se...
Sufficient dimension reduction (SDR) is known to be a powerful tool for achieving data reduction ...
We study the problem of statistical estimation with a signal known to be spar...
Recent years have seen active developments of various penalized regression methods, such as ...
In this paper, we review state-of-the-art methods for feature selection in statistics with an applic...
Many modern problems in science and other areas involve extraction of useful information from so-cal...