Discriminative training for structured outputs has found increasing applications in areas such as natural language processing, bioinformatics, information retrieval, and computer vision. Focusing on large-margin methods, the most general (in terms of loss function and model structure) training algorithms known to date are based on cutting-plane approaches. While these algorithms are very efficient for linear models, their training complexity becomes quadratic in the number of examples when kernels are used. To overcome this bottleneck, we propose new training algorithms that use approximate cutting planes and random sampling to enable efficient training with kernels. We prove that these algorithms have improved time complexity while providi...
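The abstract above describes cutting-plane training for (structural) SVMs. As an illustration only, here is a minimal sketch of the classic cutting-plane idea for the simpler case of a linear binary SVM in the 1-slack style: at each step, build a linear lower bound (a "cut") on the hinge-loss risk at the current weights, then re-solve a small QP over the cuts collected so far. The function names (`cut_at`, `train_cutting_plane`), the toy data, and the use of `scipy.optimize.minimize` (SLSQP) as the inner QP solver are all my own assumptions, not the paper's method; the paper's contribution (approximate cuts plus random sampling for kernels) is not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize  # assumption: SLSQP as a stand-in QP solver

def hinge_risk(w, X, y):
    """Average hinge loss: (1/n) * sum_i max(0, 1 - y_i <w, x_i>)."""
    return np.maximum(0.0, 1.0 - y * (X @ w)).mean()

def cut_at(w, X, y):
    """Cutting plane: a linear lower bound R(v) >= a.v + b on the risk, tight at w."""
    viol = y * (X @ w) < 1.0                       # examples inside the margin
    if viol.any():
        a = -(y[viol, None] * X[viol]).sum(axis=0) / len(y)  # subgradient of the risk
    else:
        a = np.zeros(X.shape[1])
    b = hinge_risk(w, X, y) - a @ w
    return a, b

def train_cutting_plane(X, y, lam=0.01, eps=1e-3, max_iter=50):
    """1-slack cutting-plane training of a linear SVM (illustrative sketch)."""
    d = X.shape[1]
    cuts = []
    w = np.zeros(d)
    for _ in range(max_iter):
        a, b = cut_at(w, X, y)
        # current model's estimate of the risk: max over the cuts found so far
        xi = max([0.0] + [c @ w + e for c, e in cuts])
        if a @ w + b <= xi + eps:                  # new cut adds (almost) nothing: stop
            break
        cuts.append((a, b))
        # reduced problem: min_{w, xi} lam/2 ||w||^2 + xi  s.t.  xi >= a_t.w + b_t, xi >= 0
        def obj(z):
            return 0.5 * lam * (z[:d] @ z[:d]) + z[d]
        cons = [{'type': 'ineq', 'fun': (lambda z, c=c, e=e: z[d] - c @ z[:d] - e)}
                for c, e in cuts]
        cons.append({'type': 'ineq', 'fun': lambda z: z[d]})
        res = minimize(obj, np.zeros(d + 1), constraints=cons, method='SLSQP')
        w = res.x[:d]
    return w
```

Because the reduced QP has only as many constraints as cuts, each iteration stays cheap regardless of the data-set size; the quadratic-in-examples cost the abstract mentions arises only once kernels replace the explicit feature vectors, which this linear sketch does not cover.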
* Both first authors contributed equally. Abstract. We propose to learn the kernel of an SVM as the ...
Structural support vector machines (SSVMs) are amongst the best performing methods for structured co...
It has been shown that many kernel methods can be equivalently formulated as minimal enclosing ball ...
Abstract. In this paper, we present an extensive study of the cutting-plane algorithm (CPA) applied ...
Abstract. In this paper, we propose three important enhancements of the approximate cutting plane al...
Training of Structural SVMs involves solving a large Quadratic Program (QP). One popular method for ...
Structured output prediction in machine learning is the study of learning to predict complex objects...
This manuscript describes a method for training linear SVMs (including binary SVMs, SVM regression, ...
Training a support vector machine on a huge data set with thousands of classes is a challeng...
Typically, nonlinear Support Vector Machines (SVMs) produce significantly higher classification qual...
Recently, two kinds of reduction techniques aimed at saving training time for SVM problems with...
Structural support vector machines (SSVMs) are amongst the best performing models for structured com...
Thesis (Ph.D. (Computer Engineering))--North-West University, Potchefstroom Campus, 2012. As digital c...
Abstract. Feature engineering is one of the most complex aspects of the system design in machine lea...
Many computer vision problems involve building automatic systems by extracting complex high-level in...