The kernel regularized least squares (KRLS) method uses the kernel trick to perform non-linear regression estimation. Its performance depends on proper selection of both a kernel function and a regularization parameter. In practice, cross-validation together with the Gaussian RBF kernel has been widely used to carry out model selection for KRLS. However, when training data is scarce, this combination often leads to poor regression estimation. To mitigate this issue, we follow two lines of investigation in this paper. First, we explore a new type of kernel function that is less susceptible to overfitting than the RBF kernel. Then, we consider alternative parameter selection methods that have been shown to perform well for other re...
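The model-selection procedure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian RBF kernel, a ridge-style regularizer `(K + lam*I)`, and selection by exact leave-one-out error, which for regularized least squares has the well-known closed-form shortcut `e_i = alpha_i / G_ii` with `G = (K + lam*I)^{-1}` (so no model needs to be refit per fold). All function names and the grid values are illustrative.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krls_fit(K, y, lam):
    # KRLS dual coefficients: alpha = (K + lam * I)^{-1} y
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def loo_mse(X, y, gamma, lam):
    # Exact leave-one-out MSE via the closed-form identity for
    # regularized least squares: e_i = alpha_i / G_ii,
    # where G = (K + lam * I)^{-1} and alpha = G y.
    K = rbf_kernel(X, X, gamma)
    G = np.linalg.inv(K + lam * np.eye(len(y)))
    alpha = G @ y
    return float(np.mean((alpha / np.diag(G)) ** 2))

def select_model(X, y, gammas, lams):
    # Grid search: pick the (kernel width, regularization) pair
    # minimizing the leave-one-out error.
    return min(((g, l) for g in gammas for l in lams),
               key=lambda p: loo_mse(X, y, *p))
```

With very few training points, every candidate kernel width can achieve a small leave-one-out error by chance, which is exactly the small-sample failure mode the paper targets; the sketch only shows the standard baseline procedure.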
Proceedings of the International Conference on Science and Science Education August 2015, p. MA.8-12...
This paper introduces an automatic robust nonlinear identification algorithm using the leave-one-out...
The issues of estimation accuracy and statistical power in multiple regression with small samples ha...
The paper proposes to combine an orthogonal least squares (OLS) model selection with local regularis...
We propose the use of Kernel Regularized Least Squares (KRLS) for social science modeling and infer...
Kernel methods are a well-studied approach for addressing regression problems by implicitly mapping ...
Previously, weighted kernel regression (WKR) for solving the small-sample problem has been reported. In...
The note proposes an efficient nonlinear identification algorithm by combining a locally regularized...
Combining orthogonal least squares (OLS) model selection with local regularisation or smoothing lead...
The paper proposes to combine an orthogonal least squares (OLS) subset model selection with local re...
A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is ext...
RLScore is an open-source Python module for kernel-based machine learning. The library provides imple...
Previously, weighted kernel regression (WKR) for solving the small-sample problem has been reported. Th...
We propose an efficient algorithm for calculating hold-out and cross-validation (CV) type of estimat...