Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis, and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher-order input moments. However, in many domains additional invariances, such as shift invariance, hold and are enforced via models such as convolutional dictionary learning. In this paper, we develop novel tensor decomposition algorithms for parameter estimation of convolutional models. Our algorithm is based on the popular alternating least squares method, but with efficient projections onto the space of stacked circulant matrices. Our method is embarrassingly parallel and consists of simple o...
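The key primitive in the abstract above is the projection onto circulant matrices inside each ALS step. Below is a minimal, hypothetical sketch (not the paper's implementation) of one such projection: the Frobenius-nearest circulant matrix to a given square matrix, obtained by averaging entries along each circulant diagonal. The function name `project_circulant` is ours for illustration.

```python
import numpy as np

def project_circulant(A):
    """Project a square matrix A onto the set of circulant matrices
    (Frobenius-nearest), by averaging along circulant diagonals.

    A circulant matrix C satisfies C[i, j] = c[(i - j) % n] for a
    single generating vector c, so the least-squares fit sets each
    c[k] to the mean of the entries with (i - j) % n == k.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # c[k] = average of A[(j + k) % n, j] over all columns j
    c = np.array([np.mean([A[(j + k) % n, j] for j in range(n)])
                  for k in range(n)])
    # Rebuild the circulant matrix from its first column c
    return np.array([[c[(i - j) % n] for j in range(n)]
                     for i in range(n)])
```

A circulant matrix is a fixed point of this projection, and for a general matrix the result is the closest circulant matrix in Frobenius norm; in an ALS loop, each least-squares factor update would be followed by one such projection per circulant block.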
Dictionary learning algorithms are typically derived for dealing with one or two dimensional signals...
A new dictionary learning model is introduced where the dictionary matrix is c...
This thesis focuses on some fundamental problems in machine learning that are posed as nonconvex mat...
Unsupervised learning aims at the discovery of hidden structure that drives the observations in the ...
Spectral methods have been the mainstay in several domains such as machine learning, applied mathema...
We present an alternating least squares type numerical optimization scheme to estimate conditionally...
This note is a short version of that in [1]. It is intended as a survey for the 2015 Algorithmic Lea...
We give a new approach to the dictionary learning (also known as “sparse coding”) problem of recover...
We provide guarantees for learning latent variable models, emphasizing the overcomplete regime, wh...
We present a novel analysis of the dynamics of tensor power iterations in the overcomplete regime wh...
This work considers a computationally and statistically efficient parameter estimation method for a ...