We explore the validity of the 2-stage least squares estimator with l_{1}-regularization in both stages, for linear triangular models where the numbers of endogenous regressors in the main equation and of instruments in the first-stage equations can exceed the sample size, and the regression coefficients are sufficiently sparse. For this l_{1}-regularized 2-stage least squares estimator, we first establish finite-sample performance bounds and then provide a simple practical method (with asymptotic guarantees) for choosing the regularization parameter. We also sketch an inference strategy built upon this practical method.
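The abstract above can be illustrated with a minimal sketch of the estimator it describes: Lasso in the first stage to project each endogenous regressor onto the instruments, then Lasso again in the second stage on the fitted values. The simulated triangular model, the variable names, and the use of scikit-learn's `Lasso` with a fixed `alpha` are illustrative assumptions, not the paper's data-driven choice of regularization parameter.

```python
# Sketch (assumed setup, not the paper's implementation): l1-regularized
# 2SLS for a sparse linear triangular model  x = Z @ Pi + v,  y = x @ beta + u.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p_z, p_x = 200, 50, 30          # sample size, #instruments, #endogenous regressors

Z = rng.standard_normal((n, p_z))
Pi = np.zeros((p_z, p_x)); Pi[:3, :3] = 1.0    # sparse first-stage coefficients
beta = np.zeros(p_x); beta[:2] = [1.0, -0.5]   # sparse main-equation coefficients
v = rng.standard_normal((n, p_x))
u = 0.5 * v[:, 0] + rng.standard_normal(n)     # endogeneity: u correlated with v
X = Z @ Pi + v
y = X @ beta + u

# Stage 1: Lasso regression of each endogenous regressor on the instruments.
stage1 = Lasso(alpha=0.1).fit(Z, X)
X_hat = stage1.predict(Z)

# Stage 2: Lasso regression of the outcome on the first-stage fitted values.
stage2 = Lasso(alpha=0.1).fit(X_hat, y)
beta_hat = stage2.coef_            # l1-regularized 2SLS estimate of beta
```

In practice the two `alpha` values would be chosen by the paper's practical selection rule (or cross-validation) rather than fixed a priori.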
The linear coefficient in a partially linear model with confounding variables can be estimated using...
Regression with L1-regularization, Lasso, is a popular algorithm for recovering the sparsity pattern...
In this paper we develop inference for high dimensional linear models, with serially correlated erro...
This paper explores the validity of the two-stage estimation procedure for sparse linear models in h...
Econometric models based on observational data are often endogenous due to measurement error, autoco...
It is known that for a certain class of single index models (SIMs), support recovery is imposs...
Due to the increasing availability of data sets with a large number of variables, sparse model estim...
The use of many moment conditions improves the asymptotic efficiency of the instrumental variables e...