In this paper, we present an information-theoretic approach to learning a Mahalanobis distance function. We formulate the problem as that of minimizing the differential relative entropy between two multivariate Gaussians under constraints on the distance function. We express this problem as a particular Bregman optimization problem: that of minimizing the LogDet divergence subject to linear constraints. Our resulting algorithm has several advantages over existing methods. First, our method can handle a wide variety of constraints and can optionally incorporate a prior on the distance function. Second, it is fast and scalable; unlike most existing methods, no eigenvalue computations or semi-definite programming are required. We also present an online version of the algorithm and derive regret bounds for it.
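To make the optimization concrete, the minimal Python (NumPy) sketch below shows the squared Mahalanobis distance parameterized by a positive-definite matrix A, together with a single rank-one LogDet (Bregman) projection onto one pairwise distance constraint. The function names, the fixed distance targets u and l in the usage loop, and the simple project-when-violated cycle are illustrative assumptions; the full method described above additionally incorporates a prior on the distance function and handles constraint slack, which this sketch omits.

import numpy as np

def mahalanobis_sq(A, x, y):
    # Squared Mahalanobis distance d_A(x, y) = (x - y)^T A (x - y).
    diff = x - y
    return float(diff @ A @ diff)

def logdet_projection(A, x, y, target):
    # LogDet (Bregman) projection of A onto the single linear constraint
    # (x - y)^T A (x - y) = target.  The projection reduces to a rank-one
    # update of A, so no eigenvalue computation is needed, and A stays
    # positive definite whenever target > 0.
    p = mahalanobis_sq(A, x, y)          # current squared distance under A
    if p == 0.0:
        return A
    d = (x - y).reshape(-1, 1)           # column vector for the rank-one term
    beta = (target - p) / (p * p)        # chosen so the updated distance equals target
    return A + beta * (A @ d) @ (d.T @ A)

# Hypothetical usage: cycle over pairwise constraints, projecting whenever one is violated.
A = np.eye(2)                                                  # identity prior (Euclidean start)
similar    = [(np.array([0.0, 0.0]), np.array([1.0, 1.0]))]    # want d_A <= u
dissimilar = [(np.array([0.0, 0.0]), np.array([1.0, -1.0]))]   # want d_A >= l
u, l = 0.5, 4.0                                                # assumed distance thresholds
for _ in range(100):
    for x, y in similar:
        if mahalanobis_sq(A, x, y) > u:
            A = logdet_projection(A, x, y, u)
    for x, y in dissimilar:
        if mahalanobis_sq(A, x, y) < l:
            A = logdet_projection(A, x, y, l)

Cyclic projection onto violated inequality constraints, as sketched here, is a simplification; the full algorithm also tracks dual variables across passes.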
Many learning algorithms use a metric defined over the input space as a principal tool, and their pe...
For many machine learning algorithms such as k-nearest neighbor (k-NN) classifiers and k-means clus...
Vectored data frequently occur in a variety of fields and are easy to handle since they can be ma...
Over the past decades, distance metric learning has attracted a lot of interest in machine learning ...
Recent studies [1]-[5] have suggested using constraints in the form of relative distance comparisons...
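For concreteness, a relative distance comparison constraint over a triplet (x, y, z) asks that x be closer to y than to z under the learned metric, and violations are often scored with a margin-based hinge penalty. The short Python sketch below illustrates that reading; the function name, the margin value, and the hinge form are illustrative assumptions rather than the exact formulation of the cited studies.

import numpy as np

def triplet_hinge(A, x, y, z, margin=1.0):
    # Penalize violations of the relative comparison d_A(x, y) + margin <= d_A(x, z),
    # i.e. "x should be closer to y than to z" under the Mahalanobis metric induced by A.
    d_xy = float((x - y) @ A @ (x - y))
    d_xz = float((x - z) @ A @ (x - z))
    return max(0.0, d_xy - d_xz + margin)

# Hypothetical usage: count how many triplet constraints a candidate metric violates.
A = np.eye(2)
triplets = [(np.array([0.0, 0.0]), np.array([0.2, 0.1]), np.array([2.0, 2.0]))]
num_violated = sum(triplet_hinge(A, x, y, z) > 0 for x, y, z in triplets)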
We present a class of statistical learning algorithms formulated in terms of minimizing Bregman dist...
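As a reference point, a Bregman divergence generated by a strictly convex function phi is D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>, and the LogDet divergence mentioned earlier is its matrix instance with phi(X) = -log det X. The Python sketch below simply evaluates these two quantities; the function names are illustrative.

import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    # D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q> for vectors p and q.
    return phi(p) - phi(q) - grad_phi(q) @ (p - q)

def logdet_divergence(A, B):
    # Matrix Bregman divergence generated by phi(X) = -log det X:
    # D_ld(A, B) = tr(A B^{-1}) - log det(A B^{-1}) - n.
    n = A.shape[0]
    M = A @ np.linalg.inv(B)
    _, logdet = np.linalg.slogdet(M)
    return float(np.trace(M) - logdet - n)

# Sanity check: the Bregman divergence of phi(v) = ||v||^2 is the squared Euclidean distance.
phi = lambda v: float(v @ v)
grad_phi = lambda v: 2.0 * v
assert bregman_divergence(phi, grad_phi, np.array([1.0, 2.0]), np.array([0.0, 0.0])) == 5.0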
Learning distance functions with side information plays a key role in many data mining applications....
A distance metric that can accurately reflect the intrinsic characteristics of data is critical for ...
Supervised metric learning plays a substantial role in statistical classification. Conventional metr...
In this paper, we present a novel two-stage metric learning algorithm. We first map each learning in...