In this paper we develop inference for high-dimensional linear models with serially correlated errors. We examine the Lasso under the assumption of strong mixing in the covariates and the error process, allowing for heavier tails in their distributions. Since the Lasso estimator performs poorly under such circumstances, we estimate the parameters of interest via a GLS Lasso and extend the asymptotic properties of the Lasso to these more general conditions. Our theoretical results indicate that the non-asymptotic bounds for stationary dependent processes are sharper, while the rate of the Lasso under these general conditions appears slower as $T, p \to \infty$. Further, we employ the debiased Lasso to perform uniformly valid inference on the parameters of interest. Monte Carlo ...
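To make the two-step idea concrete, the following is a minimal illustrative sketch of a GLS Lasso in the spirit described above. It assumes, purely for illustration, that the errors follow an AR(1) process and uses scikit-learn's LassoCV; the function name gls_lasso, the Cochrane-Orcutt-style quasi-differencing, and the cross-validated penalty choice are our own assumptions for the sketch, not the authors' estimator or tuning procedure.

import numpy as np
from sklearn.linear_model import LassoCV

def gls_lasso(X, y, cv=5):
    """Illustrative two-step GLS Lasso assuming AR(1) errors (hypothetical sketch)."""
    # Step 1: preliminary Lasso on the untransformed data.
    prelim = LassoCV(cv=cv).fit(X, y)
    u = y - prelim.predict(X)  # residuals carrying the serial correlation
    # Step 2: estimate the AR(1) coefficient of the residuals.
    rho = float(u[1:] @ u[:-1]) / float(u[:-1] @ u[:-1])
    # Step 3: quasi-difference (Cochrane-Orcutt-style GLS transform) and refit the Lasso.
    X_gls = X[1:] - rho * X[:-1]
    y_gls = y[1:] - rho * y[:-1]
    final = LassoCV(cv=cv).fit(X_gls, y_gls)
    return final.coef_, rho

# Example usage on simulated data with a sparse coefficient vector and AR(1) errors.
rng = np.random.default_rng(0)
T, p, s = 200, 100, 3
X = rng.standard_normal((T, p))
beta = np.zeros(p)
beta[:s] = 1.0
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.7 * u[t - 1] + rng.standard_normal()
y = X @ beta + u
coef, rho_hat = gls_lasso(X, y)

Note that plain k-fold cross-validation ignores the serial dependence in the data; it is used here only to keep the sketch short and self-contained.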