Legend: 95% CI, 95% confidence interval.
Kappa measurements of intra- and inter-observer agreement with regard to diastolic dysfunction.
We signal and discuss common methodological errors in agreement studies and the use of kappa indices...
Inter-observer agreement: values of Krippendorff's alpha coefficient of individual muscles with 95...
Reported values are percentage (95% confidence interval), with Cohen’s kappa (95% confidence interva...
R1, first reading; R2, second reading; CI, confidence interval; Max, maximum.
Kappa coeffic...
Fig 2 shows the values of kappa for intra-rater (dark blue) and for inter-rater (light blue) reli...
Abstract: The kappa statistic is used for the assessment of agreement between two or more raters when th...
The kappa statistic is frequently used to test interrater reliability. The importance of rater relia...
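For reference, the unweighted kappa reported in studies such as these is conventionally defined from the observed agreement p_o and the chance-expected agreement p_e (a standard textbook definition, not quoted from any of the excerpts above):
\[ \kappa = \frac{p_o - p_e}{1 - p_e} \]
so that \kappa = 1 indicates perfect agreement and \kappa = 0 indicates agreement no better than chance.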
Fig 3a and 3b show the values of kappa compared to the values obtained by calculating the Pre...
Intra- and inter-observer reproducibility of the histological scoring in skin and muscle. All fi...
Cohen's kappa and its 95% confidence intervals calculated for each RDT between duplicate devices ...
Inter-rater agreement and its kappa value on important figure text (95% confidence).
Percentage Agreement and Kappa (κ) Statistic for Each SDOCT Feature of DME for the Foveal Line Sc...
The kappa statistic is commonly used for quantifying inter-rater agreement on a nominal scale. In th...
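As a minimal sketch of how such a coefficient is computed for two raters scoring the same items on a nominal scale (the function name, labels, and data below are illustrative assumptions, not taken from any of the studies excerpted here):

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Unweighted Cohen's kappa for two raters labelling the same items on a nominal scale.
    n = len(rater_a)
    # Observed agreement: proportion of items on which the two raters give the same label.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Illustrative example: two observers grading the same ten cases into three categories.
r1 = ["normal", "mild", "mild", "severe", "normal", "mild", "severe", "normal", "mild", "mild"]
r2 = ["normal", "mild", "severe", "severe", "normal", "normal", "severe", "normal", "mild", "mild"]
print(round(cohens_kappa(r1, r2), 3))

In practice the same value can be obtained with sklearn.metrics.cohen_kappa_score, which also supports linear and quadratic weighting for weighted kappa coefficients like those reported below.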
Weighted kappa coefficients (95% CI) assessing agreement between the readers.