In the presence of a nuisance parameter, the asymptotic deficiency of the discretized likelihood estimator (DLE) relative to the bias-adjusted maximum likelihood estimator is obtained under the assumed model. It consists of two parts: one is the loss of information associated with the DLE of the parameter to be estimated, and the other is that due to the "incorrectness" of the assumed model. Some examples for normal and Weibull-type distributions are given.
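The deficiency calculation itself is not reproduced here, but a minimal simulation sketch may help fix ideas. It assumes a normal model N(mu, sigma^2) in which sigma is the parameter of interest and the mean mu is the nuisance parameter, and it reads "discretized likelihood estimator" loosely as the root of a central-difference version of the profile likelihood equation; the names profile_loglik and dle_sigma, the step h, the root bracket, and the crude bias adjustment are all illustrative choices, not the paper's construction.

```python
import numpy as np
from scipy.optimize import brentq

rng = np.random.default_rng(0)

def profile_loglik(sigma, x):
    """Profile log-likelihood of sigma for N(mu, sigma^2) data,
    with the nuisance mean mu replaced by its MLE (the sample mean)."""
    n = len(x)
    mu_hat = x.mean()
    return -n * np.log(sigma) - np.sum((x - mu_hat) ** 2) / (2.0 * sigma ** 2)

def dle_sigma(x, h=0.2):
    """Hypothetical sketch of a 'discretized' likelihood estimator:
    the root of the central-difference analogue of the likelihood
    equation, l(sigma + h) - l(sigma - h) = 0, rather than l'(sigma) = 0."""
    g = lambda s: profile_loglik(s + h, x) - profile_loglik(s - h, x)
    return brentq(g, 0.3, 10.0)  # bracket chosen for this simulation setting

n, mu, sigma = 20, 1.0, 2.0
reps = 5000
mle, adj, dle = [], [], []
for _ in range(reps):
    x = rng.normal(mu, sigma, n)
    s2 = np.mean((x - x.mean()) ** 2)
    mle.append(np.sqrt(s2))                  # plain MLE of sigma
    adj.append(np.sqrt(s2 * n / (n - 1)))    # crude bias adjustment
    dle.append(dle_sigma(x))                 # discretized-likelihood sketch

for name, est in [("MLE", mle), ("adjusted MLE", adj), ("DLE sketch", dle)]:
    e = np.asarray(est)
    print(f"{name:13s} bias={e.mean() - sigma:+.4f}  var={e.var():.4f}")
```

The printed bias and variance only illustrate the kind of finite-sample comparison the abstract refers to; the paper's asymptotic deficiency additionally accounts for the loss arising when the assumed model is itself incorrect.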
Statistical analyses commonly make use of models that suffer from loss of identifiability. I...
This article develops a theory of maximum empirical likelihood estimation and empirical likelihood r...
We analyze optimality properties of maximum likelihood (ML) and other estimators when the problem do...
The problem of jackknifing estimators is investigated in the presence of nuisance parameters from the...
In practice, nuisance parameters in statistical models are often replaced by estimates based on an e...
For a truncated exponential family of distributions with a truncation parameter γ and a natural para...
In this study we check the asymptotic efficiency of empirical likelihood in the presence of nuisance...
Suppose that independent observations come from an unspecified unknown distribution. Then we...
Plug-in estimation and corresponding refinements involving penalisation have been considered in vari...
Maximum likelihood estimation is a standard approach when confronted with the task of finding estima...
Since statistical models are simplifications of reality, it is important in estimation theory to stu...
The problem of parameter estimation by the continuous time observations of a deterministic signal in...
For a truncated exponential family of distributions with a natural parameter θ and a truncation para...
This paper studies and compares the asymptotic bias of GMM and generalized empirical likelihood (GE...
More than thirty years ago Halbert White inaugurated a “model-robust” form of statistical inference b...