We study the asymptotics for jump-penalized least squares regression, aiming at approximating a regression function by piecewise constant functions. Besides conventional consistency and convergence rates of the estimates in L2([0, 1)), our results cover other metrics, such as the Skorokhod metric on the space of càdlàg functions and uniform metrics on C([0, 1]). We show that these estimators are, in an adaptive sense, rate optimal over certain classes of "approximation spaces." Special cases are the class of functions of bounded variation and (piecewise) Hölder continuous functions of order
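The jump-penalized least squares fit described in this abstract minimizes a residual sum of squares plus a penalty proportional to the number of jumps of the piecewise constant candidate (a Potts-type functional). A minimal sketch of such an estimator via an O(n²) dynamic program is given below; the function name `potts_estimate` and the fixed penalty `gamma` are illustrative assumptions, not the authors' implementation or tuning rule.

```python
import numpy as np

def potts_estimate(y, gamma):
    """Jump-penalized least squares fit (illustrative sketch).

    Minimizes  sum((y - f)**2) + gamma * (#jumps of f)  over piecewise
    constant f, by dynamic programming over the last-segment start.
    `gamma` is a user-chosen penalty; its data-driven choice is the
    subject of the theory, not of this sketch.
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Prefix sums give the O(1) cost of fitting y[l:r+1] by its mean:
    # cost(l, r) = sum of squared deviations from the segment mean.
    s1 = np.concatenate(([0.0], np.cumsum(y)))
    s2 = np.concatenate(([0.0], np.cumsum(y * y)))

    def seg_cost(l, r):  # inclusive indices
        m = r - l + 1
        tot = s1[r + 1] - s1[l]
        return (s2[r + 1] - s2[l]) - tot * tot / m

    # B[r] = optimal penalized cost for the first r observations;
    # last[r] = start index of the final segment in that optimum.
    B = np.full(n + 1, np.inf)
    B[0] = -gamma  # the first segment carries no jump penalty
    last = np.zeros(n + 1, dtype=int)
    for r in range(1, n + 1):
        for l in range(r):
            c = B[l] + gamma + seg_cost(l, r - 1)
            if c < B[r]:
                B[r], last[r] = c, l

    # Backtrack the segment boundaries; fill each segment with its mean.
    f = np.empty(n)
    r = n
    while r > 0:
        l = last[r]
        f[l:r] = y[l:r].mean()
        r = l
    return f
```

For data with one clear level change, a moderate `gamma` recovers the two segments, while a very large `gamma` collapses the fit to the global mean, which matches the trade-off the penalty encodes.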
We prove statistical rates of convergence for kernel-based least squares regre...
This paper is concerned with the construction and analysis of a universal estimator for the regressi...
In this thesis we study adaptive methods of estimation for two particular types of statistical prob...
We study the asymptotic behavior of piecewise constant least squares regression estimates, when the ...
We study the asymptotics in L2 for complexity penalized least squares regression for the discrete ap...
We study the least squares regression function estimator over the class of real-valued functions on ...
In this thesis we have considered a regression model of one dimensional noisy data with the regressi...
This paper looks at the strong consistency of the ordinary least squares (OLS) estimator in linear r...
We prove rates of convergence in the statistical sense for kernel-based least squares regression usi...