Kappa range interpretation: <0 = disagreement, 0–0.2 = poor agreement, 0.21–0.4 = fair agreement, 0.41–0.6 = moderate agreement, 0.61–0.8 = substantial agreement, 0.81–1 = almost perfect agreement.
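As a worked illustration of how this scale is applied, the following is a minimal Python sketch, not taken from any of the studies excerpted here: it computes Cohen's kappa for two raters' categorical labels and maps the result onto the bands above. The cohen_kappa and interpret functions and the example labels are purely illustrative assumptions.

from collections import Counter

def cohen_kappa(rater1, rater2):
    # Cohen's kappa for two raters' categorical labels of equal length.
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    counts1, counts2 = Counter(rater1), Counter(rater2)
    expected = sum((counts1[c] / n) * (counts2[c] / n) for c in categories)
    return (observed - expected) / (1 - expected)

def interpret(kappa):
    # Map a kappa value onto the interpretation bands listed above.
    if kappa < 0:
        return "disagreement"
    if kappa <= 0.20:
        return "poor agreement"
    if kappa <= 0.40:
        return "fair agreement"
    if kappa <= 0.60:
        return "moderate agreement"
    if kappa <= 0.80:
        return "substantial agreement"
    return "almost perfect agreement"

# Illustrative labels only, e.g. positive/negative readings by two readers.
r1 = ["pos", "pos", "neg", "neg", "pos", "neg", "pos", "neg"]
r2 = ["pos", "neg", "neg", "neg", "pos", "neg", "pos", "pos"]
k = cohen_kappa(r1, r2)
print(f"kappa = {k:.2f} ({interpret(k)})")  # kappa = 0.50 (moderate agreement)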
BACKGROUND: Several criteria have been used to assess agreement between replicate slide readings of...
Pairwise concordance values calculated using the kappa coefficient for parasitology, serology an...
Agreement statistics estimation (Kappa coefficient and Bowker symmetry test) comparing species-sp...
*k < 0 indicating no agreement, k = 0–0.2 indicating poor agreement, k = 0.21–0.4 indicating fair a...
Measure of agreement presented as Cohen's kappa coefficients between molecular and serological method...
Agreement (Kappa value) of diagnosis between the FECT and antigen detection methods.
All agreements p < 0.0001. Interobserver agreement (Kappa test) of benign, probably benign, p...
Comparison of the serological tests with the consensus of infection status (http://www.p...
Positive and negative agreement: Standard microbiology methods vs. xTAG (Benchmark).
Agreement of assay results upon comparative testing of field-collected mosquito samples.
All cases (n = 256). CI: confidence interval. *The strength of agreement between ea...
Overall test agreement (HR-HPV Positive vs. Negative), kappa value = 0.957.
The positive and negative agreement of the study assays in specimens with presumed B. burgdorferi po...
Scoring ranged from 1 = strongly disagree to 6 = strongly agree. Dots, triangles and squares repr...
Measuring agreement between qualified experts is commonly used to determine the effectiveness of a ...