Sparsity-inducing penalties are useful tools in variational methods for machine learning. In this paper, we propose two block-coordinate descent strategies for learning a sparse multiclass support vector machine. The first works by selecting a subset of features to be updated at each iteration, while the second performs the selection among the training samples. These algorithms can be implemented efficiently thanks to the flexibility offered by recent randomized primal-dual proximal methods. Experiments carried out for the supervised classification of handwritten digits demonstrate the interest of considering the primal-dual approach in the context of block-coordinate descent, and the efficiency of the proposed...
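The feature-block strategy described above can be illustrated with a minimal sketch: a proximal subgradient method for an l1-penalized multiclass SVM (Crammer-Singer hinge loss) that updates only a randomly chosen block of feature rows at each iteration. This is a simplified stand-in for the paper's randomized primal-dual proximal algorithm, not the authors' method; all function names and parameter choices here are hypothetical.

```python
import numpy as np

def soft_threshold(W, t):
    # Proximal operator of t * ||W||_1: entrywise soft-thresholding.
    return np.sign(W) * np.maximum(np.abs(W) - t, 0.0)

def sparse_multiclass_svm(X, y, n_classes, lam=0.01, step=0.1,
                          block_size=2, n_iter=2000, seed=0):
    """Hypothetical sketch: block-coordinate proximal subgradient descent
    for an l1-penalized multiclass SVM. At each iteration, only a random
    subset of feature rows of W is updated (the first strategy in the
    abstract); the actual paper uses a primal-dual proximal scheme."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = np.zeros((d, n_classes))
    for _ in range(n_iter):
        scores = X @ W                          # (n, n_classes)
        margins = scores + 1.0
        margins[np.arange(n), y] -= 1.0         # no +1 margin for true class
        worst = np.argmax(margins, axis=1)      # most violating class
        # Subgradient of the Crammer-Singer hinge loss, averaged over samples.
        G = np.zeros_like(W)
        viol = margins[np.arange(n), worst] > scores[np.arange(n), y]
        for i in np.where(viol)[0]:
            G[:, worst[i]] += X[i]
            G[:, y[i]] -= X[i]
        G /= n
        # Randomly select a block of features and update only those rows.
        block = rng.choice(d, size=block_size, replace=False)
        W[block] = soft_threshold(W[block] - step * G[block], step * lam)
    return W
```

On well-separated toy data, the learned `W` classifies the training set correctly while the soft-thresholding step drives irrelevant feature rows toward zero; the block selection means each iteration touches only `block_size` of the `d` feature coordinates.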
We propose a randomized algorithm for large scale SVM learning which solves the problem by i...
The recently proposed projection twin support vector machine (PTSVM) is an excellent nonparallel cla...
In this paper, we propose a Sparse Random Features algorithm, which learns a sparse non-linear predi...
Sparsity-inducing penalizations are useful tools in variational methods for ma...
Cascades of classifiers constitute an important architecture for fast object detection. Wh...
We propose a novel partial linearization based approach for optimizing the multi-class SVM learning ...
The increasing number of classification applications in large data sets demands that efficient class...
We consider convex-concave saddle point problems with a separable structure and non-strongly convex ...
We propose a randomized algorithm for large scale SVM learning which solves the problem by iterating...
We propose a direct approach to learning sparse Support Vector Machine (SVM) prediction models for M...
Structural Support Vector Machines (SSVMs) and Conditional Random Fields (CRFs) are popular discrimi...
Over the past decades, Linear Programming (LP) has been widely used in different areas and considere...
This paper introduces a new support vector machine (SVM) formulation to obtain sparse solutions in...