We propose an ℓ1 criterion for dictionary learning for sparse signal representation. Instead of directly searching for the dictionary vectors, our approach identifies vectors that are orthogonal to the subspaces in which the training data concentrate. We study conditions on the coefficients of the training data that guarantee that the ideal normal vectors deduced from the dictionary are local optima of the criterion. We illustrate the behavior of the criterion on a 2D example, showing that the local minima correspond to the ideal normal vectors when the number of training data is sufficient. We conclude by describing an algorithm that can be used to optimize the criterion in higher dimensions.
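To make the criterion concrete, here is a minimal sketch, not the paper's reference implementation: training vectors y_1, ..., y_N (the columns of Y) concentrate near a hyperplane, a candidate normal vector w on the unit sphere is scored by J(w) = Σ_n |⟨w, y_n⟩|, and a local minimizer of J over the sphere should recover the ideal normal vector. The optimizer below is one plausible choice, a projected subgradient descent on the sphere; the function names (`l1_criterion`, `find_normal_vector`), the diminishing step-size schedule, the random initialization, and the synthetic 2D data (200 samples, small Gaussian noise) are all assumptions for illustration.

```python
import numpy as np

def l1_criterion(w, Y):
    """J(w) = sum_n |<w, y_n>| for unit-norm w; columns of Y are training vectors."""
    return np.abs(w @ Y).sum()

def find_normal_vector(Y, n_iter=500, step0=0.1, seed=0):
    """Locally minimize J over the unit sphere by projected subgradient descent.

    A sketch under stated assumptions; the paper's algorithm may differ.
    """
    rng = np.random.default_rng(seed)
    d = Y.shape[0]
    w = rng.standard_normal(d)
    w /= np.linalg.norm(w)                 # random unit-norm initialization
    for k in range(n_iter):
        g = Y @ np.sign(Y.T @ w)           # subgradient of J at w
        g -= (g @ w) * w                   # project onto the tangent space at w
        w -= step0 / np.sqrt(k + 1) * g    # diminishing-step subgradient step
        w /= np.linalg.norm(w)             # retract back onto the unit sphere
    return w

# 2D illustration: data concentrated on the line spanned by (1, 1),
# so the ideal normal vector is +/- (1, -1)/sqrt(2).
rng = np.random.default_rng(1)
direction = np.array([1.0, 1.0]) / np.sqrt(2)
Y = np.outer(direction, rng.standard_normal(200))
Y += 0.01 * rng.standard_normal(Y.shape)   # small off-subspace noise
w = find_normal_vector(Y)
print(w, l1_criterion(w, Y))               # w should be close to +/- (1, -1)/sqrt(2)
```

In higher dimensions the criterion has several local minima, one per normal vector of the subspaces in which the data concentrate, so in this sketch one would restart `find_normal_vector` from multiple random initializations and collect the distinct minimizers.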