After a two-class kernel Fisher discriminant analysis (KFDA) classifier has been trained on the full dataset, matrix inverse updates allow for the direct calculation of out-of-sample predictions for different test sets. Here, this approach is extended to the multi-class case by casting KFDA in an Optimal Scoring framework. In simulations using 10-fold cross-validation and permutation tests, the approach is shown to be more than 1000x faster than retraining the classifier in each fold. Direct out-of-sample predictions can be useful on large datasets and in studies with many training-testing iterations.
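The core trick in this abstract can be sketched with a kernel ridge regression stand-in (regularised KFDA via optimal scoring reduces to a ridge-type dual system): after inverting the regularised Gram matrix once on the full data, held-out predictions for any fold follow from a small block solve, with no retraining. All variable names below (`holdout_predictions`, `retrain_predictions`, `lam`, etc.) are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 30
X = rng.normal(size=(n, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=n))  # two-class labels in {-1, +1}

# RBF Gram matrix on the full dataset
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq)

lam = 0.1
A = K + lam * np.eye(n)
C = np.linalg.inv(A)   # full-data inverse, computed once
alpha = C @ y          # full-data dual coefficients

def holdout_predictions(S):
    """Held-out predictions for fold S, using only the full-data inverse.

    Block-inverse identity: y_S - yhat_S = C[S,S]^{-1} (C y)_S,
    so the fold-wise work is a solve of size |S| x |S|.
    """
    e = np.linalg.solve(C[np.ix_(S, S)], alpha[S])
    return y[S] - e

def retrain_predictions(S):
    """Reference: retrain on the complement of S, then predict on S."""
    T = np.setdiff1d(np.arange(n), S)
    a = np.linalg.solve(K[np.ix_(T, T)] + lam * np.eye(len(T)), y[T])
    return K[np.ix_(S, T)] @ a

fold = np.arange(0, 10)
print(np.allclose(holdout_predictions(fold), retrain_predictions(fold)))  # True
```

The identity is exact, so the shortcut and explicit retraining agree to numerical precision; across many folds or permutations the per-iteration cost drops from a full matrix factorisation to a solve the size of the held-out set.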
By applying recent results in optimization theory variously known as optimization transfer or majori...
Sparsity-inducing multiple kernel Fisher discriminant analysis (MK-FDA) has been studied in the lite...
We introduce and explore an approach to estimating statistical significance of classification accura...
Given n training examples, the training of a Kernel Fisher Discriminant (KFD) classifier corresponds...
Mika et al. [1] apply the “kernel trick” to obtain a non-linear variant of Fisher’s linear discrimi...
By applying recent results in optimization transfer, a new algorithm for kernel Fisher Discriminant ...
Mika et al. (1999) introduce a non-linear formulation of Fisher's linear discriminant, based on the now...
Mika et al. (in: Neural Networks for Signal Processing, Vol. IX, IEEE Press, New York, 1999; pp. 41–4...
Mika et al. [1] introduce a non-linear formulation of the Fisher discriminant based on the well-known "...
Kernel Fisher discriminant analysis (KFDA) is a very popular learning method for the purpose of clas...
A computationally efficient approach has been developed to perform two-group linear discriminant ana...
We review a multiple kernel learning (MKL) technique called ℓp regularised multiple kernel Fisher di...
We propose a highly efficient framework for kernel multi-class models with a large and structured se...