Abstract: Moving least-square (MLS) is an approximation method for data interpolation, numerical analysis, and statistics. In this paper we consider the MLS method in learning theory for the regression problem. Two essential differences between MLS and other common learning algorithms are pointed out: the lack of a natural uniform bound for the estimators, and the pointwise definition of the method. The sample error is estimated in terms of the weight function and the finite-dimensional hypothesis space. The approximation error is dealt with for two special cases, for which convergence rates for the total L2 error, measuring the global approximation on the whole domain, are provided.
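The pointwise character of MLS mentioned above can be made concrete with a minimal one-dimensional sketch: at each evaluation point a weighted least-squares problem is re-solved, with weights concentrated near that point. All names, the Gaussian weight function, and the bandwidth parameter below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def mls_estimate(x_eval, xs, ys, h=0.3):
    """Moving least-squares estimate at a single point x_eval.

    Fits a local linear polynomial by weighted least squares, where the
    weights w((x_eval - x_i)/h) come from a Gaussian window of width h.
    The fit is redone for every evaluation point, hence "moving".
    """
    # Gaussian weights centered at the evaluation point (an illustrative choice)
    w = np.exp(-((x_eval - xs) / h) ** 2)
    # Local polynomial basis {1, x - x_eval}; the intercept is the estimate
    A = np.stack([np.ones_like(xs), xs - x_eval], axis=1)
    # Weighted least squares via sqrt-weight scaling of the design matrix
    sw = np.sqrt(w)
    coef, *_ = np.linalg.lstsq(A * sw[:, None], ys * sw, rcond=None)
    return coef[0]

# Usage: samples from a linear target, which a local linear MLS fit
# reproduces exactly for any positive weight function.
xs = np.linspace(0.0, 1.0, 20)
ys = 2.0 * xs + 1.0
print(mls_estimate(0.5, xs, ys))
```

Note that nothing in this sketch bounds the estimator uniformly: for a heavy-tailed weight function or sparse data, the solved linear system can be ill-conditioned, which is one source of the difficulties the abstract refers to.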
Abstract: A standard assumption in theoretical study of learning algorithms for regression is uniform ...
Abstract: We study in detail the behavior of some known learning algorithms. We estimate the sum of th...
Abstract: Owing to the meshless and local characteristics, moving least squares (MLS) methods have ...
Abstract We consider the moving least-squares (MLS) method by the regression learning framework unde...
Abstract: In this paper we apply a concentration technique to improve the convergence rates for a movi...
Abstract We consider the moving least-square (MLS) method by the coefficient-based regression framew...
Abstract. We describe two experiments recently conducted with the approximate moving least squares (...
The concise review systematically summarises the state-of-the-art variants of Moving Least Squares (...
The Moving Least Square (MLS) Method is an approach which is used in meshfree solutions and data app...
Abstract: The author presents a new method for estimating the parameters of the linear learning mo...
Abstract: It is a common procedure for scattered data approximation to use local polynomial fitting in...
We introduce moving least squares approximation as an approximation scheme on the sphere. We prove e...
We follow a learning theory viewpoint to study a family of learning schemes for regression related t...