Likelihood-based methods for obtaining approximate confidence intervals for the slope in a simple linear regression, when both variables are measured with error, are discussed and compared. We also derive a Bartlett correction factor for the likelihood ratio statistic. The improvement of the chi-squared approximation is appreciable in the unknown-variance case. Further, the distribution of the maximum likelihood slope estimator is obtained from the joint distribution of sample variances and covariance.
For data consisting of cross sections of units observed over time, the Error Component Regression (E...
We present a new, efficient maximum empirical likelihood estimator for the slope in linear regressio...
Consider the problem of testing a linear hypothesis of regression coefficients in a general linear r...
M.Sc. In this study we consider the problem of estimating the slope in the simple linear errors-in-va...
Linear regression for a model with a known error variance is examined, from the point of view of the...
Nonparametric versions of Wilks' theorem are proved for empirical likelihood estimators of s...
When the errors are normally independently distributed with equal variance, the maximum likelihood e...
In this paper, different approaches to dealing with nuisance parameters in likelihood based inferenc...
Today, increasing amounts of data are available for analysis purposes and oftentimes for resource al...
In this paper we investigate the empirical likelihood method in a linear regression model when the o...
Summary: The estimation of the slope parameter of the linear regression model with normal error is co...