This work compares two mean estimators, MV and MKL, which incorporate information about a known quantile of the underlying distribution. MV minimizes variance and MKL minimizes Kullback-Leibler divergence. Both estimators are asymptotically equivalent and asymptotically normally distributed, but they differ at finite sample sizes. Monte Carlo simulation studies show that MV has higher mean squared error than MKL in the majority of simulated scenarios. The authors recommend using MKL when a quantile of the underlying distribution is known.
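To make the comparison concrete, here is a minimal Monte Carlo sketch in Python. The two estimators below are plausible constructions in the spirit of MV and MKL, not the paper's exact definitions: a variance-minimizing control-variate adjustment of the sample mean, and a minimum-KL reweighting of the sample subject to the known-quantile constraint. The function names (`mv_mean`, `mkl_mean`), the lognormal distribution, the sample size, and the use of the known median are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mkl_mean(x, q, p):
    """Minimum-KL-style estimator (hypothetical construction): reweight the
    sample so the empirical probability of {X <= q} equals p, using the
    weights that minimize KL divergence from the uniform empirical weights
    (constant within each side of q), then take the weighted mean."""
    below = x <= q
    n, k = len(x), below.sum()
    if k == 0 or k == n:                 # constraint cannot be met; fall back
        return x.mean()
    w = np.where(below, p / k, (1 - p) / (n - k))
    return np.sum(w * x)

def mv_mean(x, q, p):
    """Variance-minimizing-style estimator (hypothetical construction):
    control-variate adjustment of the sample mean by beta * (p_hat - p),
    where p_hat is the observed fraction below q and beta is the estimated
    regression coefficient of X on the indicator 1{X <= q}."""
    ind = (x <= q).astype(float)
    var_ind = ind.var()
    if var_ind == 0:                     # indicator is degenerate; fall back
        return x.mean()
    beta = np.cov(x, ind, bias=True)[0, 1] / var_ind
    return x.mean() - beta * (ind.mean() - p)

# Monte Carlo MSE comparison on a distribution whose median is known:
# for lognormal(0, 1), the median is exp(0) = 1 and the mean is exp(0.5).
n, reps, p = 30, 20000, 0.5
q, true_mean = 1.0, np.exp(0.5)

estimates = {"sample mean": [], "MV": [], "MKL": []}
for _ in range(reps):
    x = rng.lognormal(0.0, 1.0, n)
    estimates["sample mean"].append(x.mean())
    estimates["MV"].append(mv_mean(x, q, p))
    estimates["MKL"].append(mkl_mean(x, q, p))

for name, vals in estimates.items():
    mse = np.mean((np.array(vals) - true_mean) ** 2)
    print(f"{name:12s} MSE = {mse:.5f}")
```

Both quantile-aware estimators should beat the plain sample mean in this setup; any finite-sample gap between the two adjusted estimators here only illustrates the kind of comparison the abstract describes, since these are stand-in constructions rather than the paper's MV and MKL.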