Abstract—Solutions of least squares support vector machines (LS-SVMs) are typically nonsparse. Sparseness is imposed by subsequently omitting the data that introduce the smallest training errors and retraining on the remaining data. Iterative retraining requires more intensive computation than training a single nonsparse LS-SVM. In this paper, we propose a new pruning algorithm for sparse LS-SVMs: the sequential minimal optimization (SMO) method is introduced into the pruning process; in addition, instead of selecting the pruning points by their errors, we omit the data points that introduce the minimum changes to a dual objective function. This new criterion is computationally efficient. The effectiveness of the proposed method in terms of computa...
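As background for the pruning scheme the abstract contrasts against, here is a minimal sketch of the classical error-based LS-SVM pruning baseline (not the dual-objective criterion proposed in the paper). It relies on the standard LS-SVM identity that each dual coefficient is proportional to that point's training error (α_i = γ·e_i), so dropping the point with the smallest |α_i| removes the point with the smallest error. All function names (`train_ls_svm`, `prune_ls_svm`) and the RBF kernel choice are illustrative assumptions, not from the paper.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gram matrix of the Gaussian RBF kernel (illustrative kernel choice)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def train_ls_svm(X, y, gamma=10.0, sigma=1.0):
    # Standard (nonsparse) LS-SVM training: solve the linear system
    #   [ 0   1^T          ] [b]     [0]
    #   [ 1   K + I/gamma  ] [alpha] [y]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def prune_ls_svm(X, y, keep=0.5, gamma=10.0, sigma=1.0):
    # Error-based pruning baseline: since alpha_i = gamma * e_i,
    # repeatedly drop the point with the smallest |alpha_i| and retrain.
    idx = np.arange(len(y))
    target = max(2, int(keep * len(y)))
    while len(idx) > target:
        _, alpha = train_ls_svm(X[idx], y[idx], gamma, sigma)
        idx = np.delete(idx, np.argmin(np.abs(alpha)))
    return idx  # indices of the retained (sparse) support set

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])
support = prune_ls_svm(X, y, keep=0.5)
print(len(support))  # 20
```

The loop retrains a full LS-SVM after every removal, which is exactly the cost the abstract criticizes: iterative retraining is more expensive than a single nonsparse fit, motivating both the SMO-based retraining and the cheaper dual-objective pruning criterion the paper proposes.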
Starting from a reformulation of the Crammer & Singer Multiclass Kernel Machine, we propose a Sequentia...
Training a Support Vector Machine (SVM) requires the solution of a very large quadratic programming...
We propose in this work a nested version of the well-known Sequential Minimal Optimization (SMO...
The support vector machine (SVM) is a method for classification and for function approximation. This...
An improved iterative sparse algorithm is proposed to accelerate the execution of sparse least squar...
Suykens et al. [1] describe a form of kernel ridge regression known as the least-squares support ve...
In this letter, we comment on "Pruning Error Minimization in Least Squares Support Vector Machines" b...
In this paper, we propose an efficient Least Squares Support Vector Machine (LS-...
Abstract – Since the early 1990s, Support Vector Machines (SVMs) have been attracting more and more attentio...
This is an electronic version of the paper presented at the 19th European Symposium on Artificial Ne...
In the last decade, Support Vector Machines (SVMs), introduced by Vapnik, have been successfully app...
Suykens et al. (Neurocomputing (2002), in press) describe a weighted least-squares formulation of th...
The Least Squares Support Vector Machine (LS-SVM) is a modified SVM with a ridge regression cost fun...