The lasso procedure is an estimator-shrinkage and variable selection method. This paper shows that there always exists an interval of tuning parameter values such that the corresponding mean squared prediction error for the lasso estimator is smaller than for the ordinary least squares estimator. For an estimator satisfying some condition, such as unbiasedness, the paper defines the corresponding generalized lasso estimator. Its mean squared prediction error is shown to be smaller than that of the estimator for values of the tuning parameter in some interval. This implies that no unbiased estimator is admissible. Simulation results for five models support the theoretical results.
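The claim above — that some interval of tuning parameter values gives the lasso a smaller mean squared prediction error than ordinary least squares — can be illustrated numerically. The sketch below is not the paper's procedure; it is a minimal coordinate-descent lasso (with the standard soft-thresholding update) fit on simulated sparse data, comparing test-set MSPE across a grid of penalty values against the OLS fit. All names (`lasso_cd`, `soft_threshold`) and the simulation settings are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    # soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    # Coordinate descent for (1/(2n)) * ||y - X b||^2 + lam * ||b||_1.
    n, p = X.shape
    b = np.zeros(p)
    col_norm = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j removed
            r_j = y - X @ b + X[:, j] * b[j]
            rho = X[:, j] @ r_j / n
            b[j] = soft_threshold(rho, lam) / col_norm[j]
    return b

rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.normal(size=(n, p))
beta = np.concatenate([np.array([3.0, -2.0]), np.zeros(p - 2)])  # sparse truth
y_train = X @ beta + rng.normal(size=n)

# OLS benchmark
b_ols = np.linalg.lstsq(X, y_train, rcond=None)[0]

# independent test set for mean squared prediction error (MSPE)
X_test = rng.normal(size=(1000, p))
y_test = X_test @ beta + rng.normal(size=1000)
mspe_ols = np.mean((y_test - X_test @ b_ols) ** 2)

for lam in [0.0, 0.05, 0.1, 0.2, 0.5]:
    b = lasso_cd(X, y_train, lam)
    mspe = np.mean((y_test - X_test @ b) ** 2)
    print(f"lam={lam:.2f}  lasso MSPE={mspe:.3f}  OLS MSPE={mspe_ols:.3f}")
```

With `lam = 0` the coordinate-descent solution coincides with OLS, so scanning the grid shows directly whether moderate shrinkage improves test-set prediction on this sparse design, as the abstract's interval result suggests.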
Penalization methods have been shown to yield both consistent variable selection and oracle paramete...
We study the degrees of freedom of the Lasso in the framework of Stein's unbiased risk estimati...
The least absolute deviation (LAD) regression is a useful method for robust regression, and the leas...
The "least absolute shrinkage and selection operator" ('lasso') has been widely used in regression s...
We present upper and lower bounds for the prediction error of the Lasso. For the case of random Gaus...
The least absolute selection and shrinkage operator (LASSO) is a method of estimation for linear mod...
Variable selection is an important property of shrinkage methods. The adaptive lasso is an oracle pr...
Regression with the lasso penalty is a popular tool for performing dimension reduction wh...
We propose a new method to select the tuning parameter in lasso regression. Unlike the previous prop...
We propose a shrinkage procedure for simultaneous variable selection and estimation in generalized l...