A common approach to compressing large-scale data is matrix sketching. In this work, we consider the problem of recovering low-rank matrices from two noisy linear sketches using the double sketching algorithm discussed in Fazel et al. (2008). Using tools from non-asymptotic random matrix theory, we provide the first theoretical guarantees characterizing the error between the output of the double sketch algorithm and the ground-truth low-rank matrix. We apply our result to the problems of low-rank matrix approximation and low-tubal-rank tensor recovery.

Comment: Major revision. 21 pages, 4 figures.
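The abstract above describes recovering a low-rank matrix from two linear sketches. The following is a minimal NumPy sketch of that general idea under simplifying assumptions: Gaussian sketching operators, no noise, and a pseudoinverse-based estimator. The operator names (`S1`, `S2`) and the particular estimator are illustrative choices, not necessarily the exact construction analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, k = 100, 5, 20            # ambient dimension, true rank, sketch size (k > r)

# Ground-truth rank-r matrix X = L @ R.T
L = rng.standard_normal((n, r))
R = rng.standard_normal((n, r))
X = L @ R.T

# Two independent Gaussian sketching operators (illustrative assumption)
S1 = rng.standard_normal((k, n))   # left sketch operator
S2 = rng.standard_normal((n, k))   # right sketch operator

Y1 = S1 @ X        # first sketch,  shape (k, n)
Y2 = X @ S2        # second sketch, shape (n, k)

# Pseudoinverse-based recovery: X_hat = Y2 (S1 Y2)^+ Y1.
# In the noiseless case with k >= r this recovers X exactly (generically),
# since Y2 spans the column space of X and (S1 Y2)^+ inverts S1 on it.
X_hat = Y2 @ np.linalg.pinv(S1 @ Y2) @ Y1

rel_err = np.linalg.norm(X_hat - X) / np.linalg.norm(X)
```

With noisy sketches (e.g. `Y1 += noise`), `rel_err` would no longer be at machine precision; bounding that error is the kind of guarantee the abstract refers to.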
Abstract. Higher-order low-rank tensors naturally arise in many applications including hyperspectral...
Linear sketches are powerful algorithmic tools that turn an n-dimensional input into a concise lower...
It is often desirable to reduce the dimensionality of a large dataset by projecting it onto a low-di...
We study low rank matrix and tensor completion and propose novel algorithms that employ adaptive sam...
This paper describes a suite of algorithms for constructing low-rank approximations of an input matr...
Low-rank matrix recovery problems arise naturally as mathematical formulations of various inverse pr...
Recovering a low-rank tensor from incomplete information is a recurring problem in signal processing...
In recent years, the intrinsic low rank structure of some datasets has been extensively exploited to...
Many problems encountered in machine learning and signal processing can be formulated as estimating ...
This paper considers the problem of recovering an unknown sparse p × p matrix X from an m ×m matrix ...
This paper develops a suite of algorithms for constructing low-rank approximations of an input matri...