¹ p < 0.001. Inter-rater agreements of C-AEP reviewers.
Agreement between observers (i.e., inter-rater agreement) can be quantified with ...
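The snippet above notes that inter-rater agreement can be quantified with a coefficient; Cohen's Kappa, which several of the captions below report, is the most common choice for two raters. Here is a minimal Python sketch, assuming two raters labelling the same items with nominal codes (the example ratings are invented, not data from any cited study):

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters assigning nominal labels to the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(rater_a) | set(rater_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical yes/no ratings of ten items by two observers.
r1 = ["yes", "yes", "no", "no", "yes", "no", "yes", "no", "no", "yes"]
r2 = ["yes", "no",  "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
print(f"kappa = {cohen_kappa(r1, r2):.3f}")  # observed 0.8, chance 0.5 -> kappa 0.6
```

Kappa corrects the raw agreement for the agreement expected by chance, which is why it is usually reported alongside, rather than instead of, percent agreement.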
Agreement (Cohen’s Kappa) between each pair of examination methods based on ROP-RT.
Agreement and corresponding kappa coefficient between readers for EORCT and PERCIST.
¹ p < 0.001. Inter-rater agreement of physician raters and C-AEP reviewers.
Inter-rater agreement and its kappa value for important figure text (95% confidence).
Inter-rater and inter-method agreement (ICC) for the different subject groups (without discarded ...
Inter-rater agreement (Cronbach's alpha) for trait ratings of faces and bodies.
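The Cronbach's alpha reported above treats each rater as an "item" and measures how consistently the raters' scores covary across subjects. A minimal sketch, assuming numeric ratings with no missing values; the 6 × 3 ratings matrix is invented for illustration:

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for an (n_subjects, n_raters) matrix of numeric ratings."""
    ratings = np.asarray(ratings, dtype=float)
    n_subjects, n_raters = ratings.shape
    # Variance of each rater's scores across subjects, and variance of the per-subject totals.
    rater_vars = ratings.var(axis=0, ddof=1)
    total_var = ratings.sum(axis=1).var(ddof=1)
    return (n_raters / (n_raters - 1)) * (1 - rater_vars.sum() / total_var)

# Hypothetical trait ratings: 6 faces scored 1-7 by 3 raters.
scores = [[5, 6, 5],
          [3, 3, 4],
          [6, 7, 6],
          [2, 2, 3],
          [4, 5, 4],
          [7, 6, 7]]
print(f"alpha = {cronbach_alpha(scores):.2f}")
```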
ᵃ S.E. = standard error. Inter-rater Agreement of Pairs of Index Screening Instru...
Figures are observed percent agreement and kappa statistic for independent rating of 202 codes by...
Intra-rater agreement for the 6-group geometric classification: 3 evaluators.
Inter-observer agreement between two senior graders and intra-observer agreement for senior grade...
The evaluation of the reviewers shows significantly better performance of the teams after the cour...
Inter-rater reliability indices as assessed by percent agreement and Krippendorff’s alpha (n = 87).
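Percent agreement and Krippendorff's alpha, as paired in the caption above, could be computed as in the sketch below. It assumes the simplest case (two raters, nominal codes, no missing values) and uses invented coder data; for ordinal or interval data, or missing entries, a full implementation such as the krippendorff package would be more appropriate.

```python
from collections import Counter
from itertools import permutations

def percent_agreement(r1, r2):
    """Share of units on which the two raters assign the same nominal code."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def krippendorff_alpha_nominal(r1, r2):
    """Krippendorff's alpha for two raters, nominal data, no missing values."""
    # Coincidence matrix: each unit contributes both ordered pairs of its two values.
    coincidences = Counter()
    for a, b in zip(r1, r2):
        coincidences[(a, b)] += 1
        coincidences[(b, a)] += 1
    values = set(r1) | set(r2)
    n_c = {c: sum(coincidences[(c, k)] for k in values) for c in values}
    n = sum(n_c.values())  # total number of pairable values (2 per unit)
    # Observed vs. expected disagreement under the nominal (0/1) distance metric.
    d_o = sum(coincidences[(c, k)] for c, k in permutations(values, 2))
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(values, 2)) / (n - 1)
    return 1.0 - d_o / d_e

# Hypothetical codes assigned by two coders to eight units.
coder_1 = ["A", "A", "B", "B", "B", "C", "A", "B"]
coder_2 = ["A", "B", "B", "B", "C", "C", "A", "B"]
print(f"percent agreement = {percent_agreement(coder_1, coder_2):.2f}")
print(f"alpha = {krippendorff_alpha_nominal(coder_1, coder_2):.3f}")
```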
Inter-rater agreement between CDAI and DAS 28 at three time points of disease activity assessment.
Inter-annotator agreement F1 measure of the top 1, 5, and 10 topics according t...