¹ p < 0.001. Inter-rater agreement of physician raters and C-AEP reviewers.
Intra-rater agreement for the 6-group geometric classification: 3 evaluators.
Cohen’s kappa coefficient for inter-rater agreement between data obtained from the IVR system and...
The left orbits of 10 patients were evaluated by either two raters or the same rater twice (n = 1...
¹ p < 0.001. Inter-rater agreements of C-AEP reviewers.
Agreement between observers (i.e., inter-rater agreement) can be quantified wi...
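Several of the fragments above refer to Cohen's kappa as the agreement statistic. The source does not include any computation, so the following is only a minimal sketch of how pairwise inter-rater agreement between two raters' categorical labels is commonly quantified; the rater arrays are hypothetical example data.

```python
# Minimal sketch (not from the source): Cohen's kappa for two raters'
# categorical ratings of the same items, using scikit-learn.
from sklearn.metrics import cohen_kappa_score

# Hypothetical data: two raters each assign one of three categories
# ("A", "B", "C") to the same ten items.
rater_1 = ["A", "A", "B", "C", "B", "A", "C", "B", "A", "C"]
rater_2 = ["A", "B", "B", "C", "B", "A", "C", "C", "A", "C"]

kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa: {kappa:.3f}")  # 1.0 = perfect agreement, 0 = chance level
```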
ᵃ S.E. = standard error. Inter-rater Agreement of Pairs of Index Screening Instru...
Inter-observer agreement between the two primary readers on pathological features and on radiolog...
Inter-rater agreement and its kappa value on important figure text (95% confidence).
Inter-rater agreement for the 6-group geometric classification: 3 evaluators at occasion 1 and oc...
Inter-rater and inter-method agreement (ICC) for the different subject groups (without discarded...
Inter-rater agreement between CDAI and DAS 28 at 3 time points of disease activity assessment.
Inter-observer agreement level for each finding in patients with esophageal achalasia.
Inter-rater agreement (Cronbach's alpha) for trait ratings of faces and bodies.
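The fragment above reports agreement as Cronbach's alpha. As a hedged illustration only (not from the source), the sketch below computes the standard formula α = k/(k−1) · (1 − Σσ²ᵢ / σ²_total), treating each rater as an "item" and each rated target as an observation; the ratings matrix is invented.

```python
# Minimal sketch (not from the source): Cronbach's alpha with raters as items.
import numpy as np

def cronbach_alpha(ratings: np.ndarray) -> float:
    """ratings: 2-D array of shape (n_targets, n_raters)."""
    n_targets, n_raters = ratings.shape
    item_vars = ratings.var(axis=0, ddof=1)      # variance of each rater's scores
    total_var = ratings.sum(axis=1).var(ddof=1)  # variance of the summed scores
    return (n_raters / (n_raters - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical data: 6 targets rated on a 1-7 scale by 3 raters.
ratings = np.array([
    [5, 6, 5],
    [3, 3, 4],
    [7, 6, 7],
    [2, 2, 3],
    [4, 5, 4],
    [6, 6, 5],
])
print(f"Cronbach's alpha: {cronbach_alpha(ratings):.3f}")
```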
Values show κ coefficients. A, B, and C refer to the reviewers. CC, corpus callosum; RC, rostral ...
Inter-rater reliability indices as assessed by percent agreement and Krippendorff’s alpha (n = 87).
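The last fragment pairs percent agreement with Krippendorff's alpha. The source gives no computation, so the following is only a sketch under assumptions: the rater arrays are hypothetical, and Krippendorff's alpha is computed with the third-party `krippendorff` package, which is not mentioned in the source.

```python
# Minimal sketch (not from the source): percent agreement for two raters,
# plus Krippendorff's alpha via the third-party `krippendorff` package.
import numpy as np
import krippendorff

# Hypothetical nominal codes from two raters; np.nan marks a missing rating.
rater_1 = np.array([1, 2, 2, 3, 1, 2, 3, 3, np.nan, 1], dtype=float)
rater_2 = np.array([1, 2, 3, 3, 1, 2, 3, 2, 1, 1], dtype=float)

# Percent agreement over the units both raters coded.
both_coded = ~np.isnan(rater_1) & ~np.isnan(rater_2)
percent_agreement = np.mean(rater_1[both_coded] == rater_2[both_coded])
print(f"Percent agreement: {percent_agreement:.2%}")

# reliability_data has shape (n_raters, n_units); missing values are allowed.
alpha = krippendorff.alpha(
    reliability_data=np.vstack([rater_1, rater_2]),
    level_of_measurement="nominal",
)
print(f"Krippendorff's alpha: {alpha:.3f}")
```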