We present a novel approach to collaborative prediction, using low-norm instead of low-rank factorizations. The approach is inspired by, and has strong connections to, large-margin linear discrimination. We show how to learn low-norm factorizations by solving a semi-definite program, and present generalization error bounds based on analyzing the Rademacher complexity of low-norm factorizations.
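The low-norm idea summarized above rests on a standard identity: the trace (nuclear) norm of a matrix X equals the minimum of (‖A‖_F² + ‖B‖_F²)/2 over all factorizations X = AB, attained by the balanced factors built from the SVD. As a minimal numerical check of that identity (a sketch, not the paper's SDP formulation; variable names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((6, 4))

# Trace (nuclear) norm of X: the sum of its singular values.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
trace_norm = s.sum()

# Balanced factors A = U sqrt(S), B = sqrt(S) Vt satisfy X = A @ B and
# attain the minimum of (||A||_F^2 + ||B||_F^2) / 2, which equals the trace norm.
A = U * np.sqrt(s)            # scale columns of U
B = np.sqrt(s)[:, None] * Vt  # scale rows of Vt
assert np.allclose(X, A @ B)
balanced = 0.5 * (np.linalg.norm(A) ** 2 + np.linalg.norm(B) ** 2)
assert np.isclose(balanced, trace_norm)

# Any other factorization does no better, e.g. an unbalanced rescaling:
unbalanced = 0.5 * (np.linalg.norm(2.0 * A) ** 2 + np.linalg.norm(0.5 * B) ** 2)
assert unbalanced >= trace_norm
```

Minimizing this factor-norm penalty, rather than constraining rank, is what makes the learning problem convex and expressible as a semi-definite program.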
The development of randomized algorithms for numerical linear algebra, e.g. for computing approximat...
As Web 2.0 and enterprise-cloud applications have proliferated, data mining algorithms increasingly ...
This work introduces Divide-Factor-Combine (DFC), a parallel divide-and-conquer framework for noisy ...
Maximum Margin Matrix Factorization (MMMF) was recently suggested (Srebro et al., 2005) as a convex,...
Low rank matrix factorization is an important step in many high dimensional machine learning algorit...
In this paper, we consider collaborative filtering as a ranking problem. We present a method which u...
Matrices that can be factored into a product of two simpler matrices can serve as a useful and often ...
Collaborative prediction is a powerful technique, useful in domains from recommender systems to guid...
Collaborative filtering is a popular method for personalizing product recommendations. Maximum Margi...
Matrix factorization exploits the idea that, in complex high-dimensional data, the actual signal typ...
Low rank approximation is the problem of finding two low rank factors W and H such that rank(WH)...