Kernel logistic regression models, like their linear counterparts, can be trained using the efficient iteratively reweighted least-squares (IRWLS) algorithm. This approach suggests an approximate leave-one-out cross-validation estimator based on an existing method for exact leave-one-out cross-validation of least-squares models. Results compiled over seven benchmark datasets are presented for kernel logistic regression with model selection procedures based on both conventional k-fold and approximate leave-one-out cross-validation criteria, demonstrating that the proposed approach is viable.
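The IRWLS scheme mentioned above can be sketched as follows: each Newton step for regularized kernel logistic regression reduces to a weighted, ridge-regularized least-squares solve in the dual coefficients. This is a minimal illustrative sketch, not the paper's implementation; the RBF kernel choice, the function names, and all hyperparameter values here are assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Z (illustrative choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_klr_irwls(K, y, lam=1.0, n_iter=25):
    """Kernel logistic regression via IRWLS (a sketch).

    Minimizes the logistic loss plus (lam/2) * alpha^T K alpha.
    Each iteration forms the local quadratic approximation and solves
    the weighted least-squares subproblem
        (K + lam * W^{-1}) alpha = z,
    where W holds the IRWLS weights and z is the working response.
    """
    n = K.shape[0]
    alpha = np.zeros(n)
    for _ in range(n_iter):
        eta = K @ alpha
        p = 1.0 / (1.0 + np.exp(-eta))        # predicted probabilities
        w = np.clip(p * (1 - p), 1e-8, None)  # IRWLS weights, kept away from 0
        z = eta + (y - p) / w                 # working response
        alpha = np.linalg.solve(K + lam * np.diag(1.0 / w), z)
    return alpha

# Toy usage: fit on a small synthetic binary problem.
rng = np.random.RandomState(0)
X = rng.randn(40, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(float)
K = rbf_kernel(X, X, gamma=0.5)
alpha = fit_klr_irwls(K, y, lam=0.1)
train_acc = (((K @ alpha) > 0) == (y > 0.5)).mean()
```

Because each IRWLS step is a least-squares solve, the closed-form leave-one-out identities for least-squares models can be applied to the final weighted subproblem, which is what motivates the approximate leave-one-out estimator described in the abstract.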
We consider the problem of model (or variable) selection in the classical regression model based on ...
The paper proposes to combine an orthogonal least squares (OLS) model selection with local regularis...
The present manuscript mainly focuses on cross-validation procedures (and in particular on leave-p-out...
Kernel logistic regression (KLR) is the kernel learning method best suited to binary pattern recogni...
While the model parameters of many kernel learning methods are given by the solution of a convex opt...
The kernel regularized least squares (KRLS) method uses the kernel trick to perform non-linear regre...
While the model parameters of a kernel machine are typically given by the solution of a convex optim...
We propose an efficient algorithm for calculating hold-out and cross-validation (CV) type of estimat...
Mika et al. [1] introduce a non-linear formulation of the Fisher discriminant based on the well-known "...
This article gives a robust technique for model selection in regression models, an important aspect ...
This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out...
Mika et al. (1999) introduce a non-linear formulation of Fisher's linear discriminant, based on the now...
A fast cross-validation algorithm for model selection in kernel ridge regression problems is propose...
A long-standing problem in classification is the determination of the regularization parameter. Near...
We propose a model selection procedure in the context of matched case-control ...