Classification and supervised learning problems in general aim to choose a function that best describes a relation between a set of observed attributes and their corresponding outputs. We focus on binary classification, where the output is a binary response variable. In this dissertation, we seek motivation within statistical learning theory, which attempts to estimate how well a classification function generalizes with respect to unseen data in a probabilistic setting. We study linear programming formulations for finding a hyperplane that separates two sets of points. Such formulations were initially given by Mangasarian (1965) for the separable case, and more recently extended by "soft margin" formulations that maximize the margin of sep...
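As a rough illustration of the LP approach described above, the following sketch finds a soft-margin separating hyperplane via `scipy.optimize.linprog`, minimizing average misclassification slacks in the spirit of Mangasarian-style formulations. The variable layout, data, and exact objective here are illustrative assumptions, not the dissertation's formulation.

```python
import numpy as np
from scipy.optimize import linprog

def separating_hyperplane(A, B):
    """Soft-margin LP sketch: find w, gamma with
        A w >= gamma + 1 - y   (class +1, slacks y >= 0)
        B w <= gamma - 1 + z   (class -1, slacks z >= 0)
    minimizing the average slack (1/m) sum(y) + (1/k) sum(z)."""
    m, n = A.shape
    k = B.shape[0]
    # variable order: [w (n), gamma (1), y (m), z (k)]
    c = np.concatenate([np.zeros(n + 1),
                        np.full(m, 1.0 / m),
                        np.full(k, 1.0 / k)])
    # -A w + gamma e - y <= -e   (rewritten class +1 constraints)
    top = np.hstack([-A, np.ones((m, 1)), -np.eye(m), np.zeros((m, k))])
    #  B w - gamma e - z <= -e   (rewritten class -1 constraints)
    bot = np.hstack([B, -np.ones((k, 1)), np.zeros((k, m)), -np.eye(k)])
    A_ub = np.vstack([top, bot])
    b_ub = -np.ones(m + k)
    # w and gamma free, slacks nonnegative
    bounds = [(None, None)] * (n + 1) + [(0, None)] * (m + k)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
    w, gamma = res.x[:n], res.x[n]
    return w, gamma

# two linearly separable toy clusters (hypothetical data)
A = np.array([[2.0, 2.0], [3.0, 1.5], [2.5, 3.0]])        # class +1
B = np.array([[-2.0, -1.0], [-1.5, -2.5], [-3.0, -2.0]])  # class -1
w, gamma = separating_hyperplane(A, B)
# for separable data the optimal slacks are zero, so the
# hyperplane w'x = gamma strictly separates the two sets
assert np.all(A @ w > gamma) and np.all(B @ w < gamma)
```

For separable inputs the slacks vanish and the LP recovers a strict separator; for overlapping classes the objective trades off the two average slack terms, which is the "soft margin" idea the abstract refers to.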
We examine linear program (LP) approaches to boosting and demonstrate their efficient solution using...
In the context of learning theory many efforts have been devoted to developing classification algori...
Thesis (Ph.D.)--University of Washington, 2020. We present several novel results on computational prob...
In pattern recognition and machine learning, a classification problem refers to finding an algorithm...
We formulate the sparse classification problem of n samples with p features as a binary convex optim...
Boosting, as one of the state-of-the-art classification approaches, is widely used in the industry f...
Classifiers favoring sparse solutions, such as support vector machines, relevance vector machines, L...
Finding a hyperplane that separates two classes of data points with the minimum number of misclassif...
When constructing a classifier, the probability of correct classification of future data points shou...
We provide a new formulation for the problem of learning the optimal classification tree of a given ...
The goal of classifier combination can be briefly stated as combining the deci...
We propose an algorithm for a family of optimization problems where the objective can be decomposed ...