Various problems in nonnegative quadratic programming arise in the training of large margin classifiers. We derive multiplicative updates for these problems that converge monotonically to the desired solutions for hard and soft margin classifiers. The updates differ strikingly in form from other multiplicative updates used in machine learning. In this paper, we provide complete proofs of convergence for these updates and extend previous work to incorporate sum and box constraints in addition to nonnegativity.
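For concreteness, the sketch below illustrates the kind of multiplicative update the abstract refers to, applied to the generic nonnegative quadratic program: minimize F(v) = ½ vᵀAv + bᵀv subject to v ≥ 0, which is the form the dual SVM problem takes once labels and kernel are folded into A. The specific update rule, the split of A into positive and negative parts, and all variable names below follow the earlier multiplicative-update work that this paper extends; they are an illustrative assumption, not details stated in the abstract, and the sum and box constraints mentioned above are not handled here.

```python
import numpy as np

def multiplicative_nqp_update(v, A, b, eps=1e-12):
    """One multiplicative update for the nonnegative quadratic program

        minimize  F(v) = 0.5 * v @ A @ v + b @ v    subject to  v >= 0.

    Sketch of the update form from the earlier multiplicative-update work
    this paper builds on; the paper itself also treats sum and box
    constraints, which are omitted here.
    """
    A_plus = np.maximum(A, 0.0)    # positive part of A
    A_minus = np.maximum(-A, 0.0)  # magnitude of the negative part of A

    a = A_plus @ v                 # (A+ v)_i, appears in the denominator
    c = A_minus @ v                # (A- v)_i

    # Elementwise multiplicative factor: it is nonnegative, so v >= 0 is
    # preserved, and it equals 1 exactly where the partial derivative
    # (A v + b)_i vanishes, so stationary points are fixed points.
    factor = (-b + np.sqrt(b * b + 4.0 * a * c)) / (2.0 * a + eps)
    return v * factor


# Toy usage (made-up data): dual of a hard-margin SVM without bias, where
# A[i, j] = y[i] * y[j] * k(x[i], x[j]) and b = -1, so minimizing F
# recovers the usual dual objective.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
    K = X @ X.T                           # linear kernel
    A = (y[:, None] * y[None, :]) * K
    b = -np.ones(len(y))

    alpha = np.ones(len(y))               # any strictly positive start
    for _ in range(500):
        alpha = multiplicative_nqp_update(alpha, A, b)
    print("dual objective:", 0.5 * alpha @ A @ alpha + b @ alpha)
```

Because the factor multiplying each coordinate is nonnegative, the iterates never leave the feasible region, which is what distinguishes updates of this form from additive gradient steps that must be followed by a projection.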