© 2018 Curran Associates Inc. All rights reserved. We consider a high-dimensional linear regression problem where the goal is to efficiently recover an unknown vector β∗ from n noisy linear observations Y = Xβ∗ + W ∈ Rn, for known X ∈ Rn×p and unknown W ∈ Rn. Unlike most of the literature on this model, we make no sparsity assumption on β∗. Instead, we adopt a regularization based on assuming that the underlying vector β∗ has rational entries with the same denominator Q ∈ Z>0. We call this the Q-rationality assumption. We propose a new polynomial-time algorithm for this task based on the seminal Lenstra-Lenstra-Lovász (LLL) lattice basis reduction algorithm. We establish that under the Q-rationality assumption, our algorithm recovers ex...
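As a one-dimensional toy illustration of the Q-rationality idea (not the paper's LLL-based algorithm), a coefficient with a small known denominator bound can be snapped back exactly from a mildly noisy observation via continued-fraction rational reconstruction; all values below are hypothetical:

```python
from fractions import Fraction

beta_true = Fraction(3, 7)        # unknown coefficient; denominator Q = 7 (toy choice)
x = 2.0                           # single "design" value
y = x * float(beta_true) + 1e-9   # noisy linear observation Y = X*beta + W

# Snap the naive estimate y/x to the nearest rational with denominator <= Q.
beta_hat = Fraction(y / x).limit_denominator(7)
assert beta_hat == beta_true      # exact recovery despite the noise
```

This only illustrates why rationality acts as a regularizer; the paper's algorithm handles the genuinely high-dimensional setting via lattice basis reduction.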
Abstract. We consider the problem of estimating an unknown vector θ from the noisy data Y = Aθ + ε, ...
In this paper, we study the problem of recovering a sparse signal x ∈ Rn from highly corrupted linea...
Abstract The final step of some algebraic algorithms is to reconstruct the common denominator d of a...
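In a similar spirit, a hypothetical toy version of common-denominator reconstruction (using floating-point approximations rather than the modular images such algebraic algorithms typically work with) can be sketched as:

```python
from fractions import Fraction
from math import lcm

# Toy data: floating-point approximations of rationals sharing denominator d = 12.
approx = [1 / 12, 5 / 12, 7 / 12]

# Recover each fraction by continued fractions, then take the LCM of the denominators.
fracs = [Fraction(a).limit_denominator(1000) for a in approx]
d = lcm(*(f.denominator for f in fracs))
assert d == 12
```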
This electronic version was submitted by the student author. The certified thesis is available in th...
This thesis shows how we can exploit low-dimensional structure in high-dimensional statistics and ma...
Due to the increasing availability of data sets with a large number of variables, sparse model estim...
High-dimensional statistical inference deals with models in which the number of parameters $p$ is co...
We focus on the high-dimensional linear regression problem, where the algorithmic goal is to efficie...
© 2016 NIPS Foundation - All Rights Reserved. We address the problem of recovering a high-dimensiona...
We study piece-wise constant signals corrupted by additive Gaussian noise over a d-dimensional latti...
We analyze a class of estimators based on a convex relaxation for solving high-dimensional matrix de...
In many linear inverse problems, we want to estimate an unknown vector belongi...
We consider the problem of structurally constrained high-dimensional linear regression. This has at...
We propose and analyse a reduced-rank method for solving least-squares regress...
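A minimal sketch of a reduced-rank least-squares estimator, obtained by projecting the OLS fitted values onto their top singular directions (all dimensions and data below are hypothetical toy choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, q, r = 200, 5, 4, 2

# Toy multi-response data with a rank-r coefficient matrix B.
B = rng.standard_normal((p, r)) @ rng.standard_normal((r, q))
X = rng.standard_normal((n, p))
Y = X @ B + 0.01 * rng.standard_normal((n, q))

# Ordinary least squares, then project the fitted values onto their top-r
# right singular directions: the classical reduced-rank regression solution.
B_ols, *_ = np.linalg.lstsq(X, Y, rcond=None)
_, _, Vt = np.linalg.svd(X @ B_ols, full_matrices=False)
B_rr = B_ols @ Vt[:r].T @ Vt[:r]

assert np.linalg.matrix_rank(B_rr) <= r   # estimate respects the rank constraint
```

The rank-r projection shrinks the p·q free parameters of OLS down to roughly r·(p + q), which is the source of the method's statistical gains when the true coefficient matrix is low-rank.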