We identify and discuss common methodological errors in agreement studies and in the use of kappa indices, as found in publications in the medical and behavioural sciences. Our analysis is based on a proposed statistical model that is in line with the models typically employed in metrology and measurement theory. A first cluster of errors relates to nonrandom sampling, which can result in substantial bias in the estimated agreement. Second, when class prevalences are strongly nonuniform, the use of the kappa index becomes precarious: its large partial derivatives typically result in large standard errors of the estimates. In addition, in such cases the index rather one-sidedly reflects the consistency of the most prevalent class, ...
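For reference, the following is a minimal sketch, using the standard definition of Cohen's kappa (not reproduced from the paper itself), of why strongly nonuniform prevalences inflate the standard error of the estimated index. In LaTeX notation:

    \[
      \kappa = \frac{p_o - p_e}{1 - p_e},
      \qquad
      p_e = \sum_{k} p_{k\cdot}\, p_{\cdot k},
    \]

where \(p_o\) is the observed proportion of agreement and \(p_e\) the agreement expected by chance from the raters' marginal prevalences \(p_{k\cdot}\) and \(p_{\cdot k}\). Holding the marginals fixed, the delta method gives

    \[
      \frac{\partial \kappa}{\partial p_o} = \frac{1}{1 - p_e},
      \qquad
      \operatorname{Var}(\hat{\kappa}) \approx
      \left(\frac{1}{1 - p_e}\right)^{2} \operatorname{Var}(\hat{p}_o),
    \]

so when one class dominates, \(p_e\) approaches 1, the partial derivative grows without bound, and the sampling variance of \(\hat{\kappa}\) is magnified accordingly.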
Master's thesis in Mathematics and Physics. This thesis examines requirements of subject sample size w...
In order to quantify the degree of agreement between raters when classifying subjects into predefine...
We propose a coefficient of agreement to assess the degree of concordance between two i...
Some common errors of experimental design, interpretation and inference in agreement studies, TP Erdm...
In various fields of science, the classification of people into categories is required. An example is...
The kappa statistic is commonly used for quantifying inter-rater agreement on a nominal scale. In th...
Cohen’s Kappa and a number of related measures can all be criticized for their definition of correct...
Background: Cohen's Kappa is the most widely used agreement statistic in the literature. However, under certai...
Chance-corrected agreement coefficients such as the Cohen and Fleiss kappas are commonly used for th... (see the computational sketch after this list).
Agreement measures are used frequently in reliability studies that involve categorical data. ...
Cohen's kappa is the most widely used coefficient for assessing interobserver agreement on a nominal...
BACKGROUND: Accurate values are essential in medicine. An important parameter in determining the qualit...
Measurement of agreement between qualified experts is commonly used to determine the effectiveness of a ...
Agreement measures are useful tools both to compare different evaluations of the same diagnostic out...
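To make the prevalence effect flagged in several of the abstracts above concrete, here is a minimal, self-contained Python sketch; the cohens_kappa function and the two contingency tables are our own illustration, not code or data from any of the cited papers.

    # Illustration of Cohen's kappa and its sensitivity to class prevalence.

    def cohens_kappa(table):
        """Cohen's kappa for a square contingency table (list of rows)."""
        n = sum(sum(row) for row in table)
        k = len(table)
        p_o = sum(table[i][i] for i in range(k)) / n           # observed agreement
        row = [sum(table[i]) / n for i in range(k)]            # rater A marginals
        col = [sum(table[i][j] for i in range(k)) / n for j in range(k)]  # rater B marginals
        p_e = sum(row[i] * col[i] for i in range(k))           # chance agreement
        return (p_o - p_e) / (1 - p_e)

    # Balanced prevalences: 85% raw agreement.
    balanced = [[40, 5],
                [10, 45]]
    # Skewed prevalences: the same 85% raw agreement, but chance agreement
    # p_e is already close to p_o, so the chance correction eats the credit.
    skewed = [[80, 5],
              [10, 5]]

    print(f"balanced: kappa = {cohens_kappa(balanced):.2f}")  # 0.70
    print(f"skewed:   kappa = {cohens_kappa(skewed):.2f}")    # 0.32

Both invented tables show 85% raw agreement, yet kappa drops from 0.70 to roughly 0.32 in the skewed case, which is exactly the one-sided prevalence behaviour the abstracts above warn about.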