Many problems encountered in machine learning and signal processing can be formulated as estimating a low-rank object from incomplete, and possibly corrupted, linear measurements; prominent examples include matrix completion and tensor completion. Through the lens of matrix and tensor factorization, one of the most popular approaches is to run simple iterative algorithms such as gradient descent on the low-rank factors directly, which keeps the memory and computation footprints small. However, the convergence rate of gradient descent depends linearly, and sometimes even quadratically, on the condition number of the low-rank object, and therefore slows down painfully when the problem is ill-conditioned. This thesis introduces...
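The abstract above contrasts plain factored gradient descent, whose rate degrades with the condition number, against (implicitly) a preconditioned variant. The sketch below illustrates that contrast on a small symmetric low-rank factorization problem; it is a minimal illustration, not the thesis's actual algorithm, and the names (`factored_gd`, `scaled`, `eta`) and the specific preconditioner `(U^T U)^{-1}` are assumptions in the style of ScaledGD-type methods.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3

# Ground-truth rank-r PSD matrix with condition number 100 (eigenvalues 100, 9, 1).
U_star = np.linalg.qr(rng.standard_normal((n, r)))[0] * np.array([10.0, 3.0, 1.0])
M_star = U_star @ U_star.T

def grad(U):
    # Gradient of f(U) = 0.25 * ||U U^T - M_star||_F^2
    return (U @ U.T - M_star) @ U

def factored_gd(U0, eta, iters, scaled=False):
    """Gradient descent on the low-rank factor; optionally preconditioned."""
    U = U0.copy()
    for _ in range(iters):
        G = grad(U)
        if scaled:
            # ScaledGD-style preconditioner: right-multiply by (U^T U)^{-1},
            # which removes the dependence on the factor's conditioning.
            G = G @ np.linalg.inv(U.T @ U)
        U -= eta * G
    return U

# Start near the truth (a stand-in for spectral initialization).
U0 = U_star + 0.01 * rng.standard_normal((n, r))
U_gd = factored_gd(U0, eta=1e-3, iters=500, scaled=False)
U_sc = factored_gd(U0, eta=0.5, iters=500, scaled=True)

err_plain = np.linalg.norm(U_gd @ U_gd.T - M_star) / np.linalg.norm(M_star)
err_scaled = np.linalg.norm(U_sc @ U_sc.T - M_star) / np.linalg.norm(M_star)
```

With the same iteration budget, the preconditioned run drives the relative error to numerical precision while plain gradient descent stalls in the ill-conditioned directions, which is exactly the condition-number dependence the abstract describes.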
Matrix completion involves recovering a matrix from a subset of its entries by utilizing interdepend...
In this thesis, we consider optimization problems that involve statistically estimating signals from...
We study a scalable alternative to robust gradient descent (RGD) techniques that can be used when lo...
© 2016 IEEE. The paper looks at a scaled variant of the stochastic gradient descent algorithm for th...
We study low rank matrix and tensor completion and propose novel algorithms that employ adaptive sam...
This thesis concerns the optimization and application of low-rank methods, with a special focus on t...
Abstract. Higher-order low-rank tensors naturally arise in many applications including hyperspectral...
Low rank decomposition of tensors is a powerful tool for learning generative models. The uniqueness ...
We present a new mixed precision algorithm to compute low-rank matrix and tensor approximations, a f...
International audienceAttempts of studying implicit regularization associated to gradient descent (G...
We study the robust matrix completion problem for the low-rank Hankel matrix, which detects the spar...
Robust tensor CP decomposition involves decomposing a tensor into low rank and sparse components. We...
Hierarchical tensor representation, e.g. Tucker tensor format (Hackbusch), Multi-layer TDMCH (Meyer...
Many interesting problems in statistics and machine learning can be written as \(\min_x F(x) = f(x) + g(x)\)...
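The composite objective \(\min_x F(x) = f(x) + g(x)\) with a smooth \(f\) and a nonsmooth \(g\) is the classic setting for proximal gradient methods. Since the abstract is truncated, here is a generic hedged sketch (not the paper's method) using the standard ISTA iteration, with \(f(x) = \tfrac{1}{2}\|Ax - b\|^2\) and \(g(x) = \lambda\|x\|_1\), whose proximal operator is soft-thresholding; all variable names and problem sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n = 40, 100
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = [3.0, -2.0, 1.5, -1.0, 2.0]  # sparse ground truth
b = A @ x_true
lam = 0.1

# Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
step = 1.0 / np.linalg.norm(A, 2) ** 2

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 applied elementwise.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
for _ in range(2000):
    # Gradient step on f, then prox step on g.
    x = soft_threshold(x - step * (A.T @ (A @ x - b)), step * lam)

obj = 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
```

The same two-step template (gradient step on \(f\), proximal step on \(g\)) covers many estimators in this family; only the prox changes, e.g. singular-value thresholding for nuclear-norm-regularized low-rank problems.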