<p>The interobserver reliability (κ statistics) and agreement (proportion of agreement) for BI-RADS MRI classifications.</p>
<p>Measurements from both the first and second MRI scan were included in this analysis. Bland-Altman...
Purpose: The Liver Imaging Reporting and Data System (LI-RADS) is a structured reporting system that...
<p>Association between volumetric breast density estimates per study and BI-RADS category.</p>
<p>Interobserver reliability for each radiologic finding of MAC lung disease assessed by intraclass ...
<p>ᴋ<sub>w</sub> = weighted kappa scores (Fleiss-Cohen, quadratic weights)</p><p><sup>a</sup> With t...
Agreement between observers (i.e., inter-rater agreement) can be quantified wi...
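The truncated passage above introduces quantifying inter-rater agreement. A minimal sketch of Cohen's kappa, the coefficient several of these captions report as κ, with illustrative rating labels (the data below are made up for demonstration):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items (nominal categories)."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance-expected agreement from each rater's marginal category frequencies
    counts_a = Counter(rater_a)
    counts_b = Counter(rater_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two readers classifying six lesions
a = ["benign", "benign", "malignant", "benign", "malignant", "benign"]
b = ["benign", "malignant", "malignant", "benign", "malignant", "benign"]
print(round(cohens_kappa(a, b), 3))  # 0.667
```

Kappa corrects the raw proportion of agreement (here 5/6) for the agreement expected by chance, which is why the captions report both quantities separately.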
Interobserver and intraobserver reliability for measurement of radiographic parameters.</p>
Inter-rater reliability indices as assessed by percent agreement and Krippendorff’s alpha (n = 87).<...
<p>ᴋ<sub>w</sub> = weighted kappa scores (Fleiss-Cohen, quadratic weights), CI = confidence interval...
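The footnotes above define κ<sub>w</sub> as weighted kappa with Fleiss-Cohen (quadratic) weights, which penalize disagreements by the squared distance between ordinal categories. A minimal sketch, assuming categories coded 0..K−1 (function name and data are illustrative):

```python
import numpy as np

def quadratic_weighted_kappa(rater_a, rater_b, n_cats):
    """Weighted kappa with quadratic (Fleiss-Cohen) weights for
    ordinal categories coded 0..n_cats-1."""
    a = np.asarray(rater_a)
    b = np.asarray(rater_b)
    # Observed joint distribution of the two raters' scores
    obs = np.zeros((n_cats, n_cats))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= len(a)
    # Chance-expected joint distribution from the marginals
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    # Quadratic disagreement weights: squared normalized category distance
    idx = np.arange(n_cats)
    d = ((idx[:, None] - idx[None, :]) / (n_cats - 1)) ** 2
    return 1 - (d * obs).sum() / (d * exp).sum()

# Hypothetical example: three cases scored on a 3-point ordinal scale
print(quadratic_weighted_kappa([0, 1, 2], [0, 2, 1], 3))  # 0.5
```

Quadratic weights make a one-category disagreement (e.g., BI-RADS 3 vs. 4) count far less than a two-category one, which is why κ<sub>w</sub> is preferred over unweighted kappa for ordinal rating scales.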
<p>CI, confidence interval.</p><p>Interobserver agreement concerning imaging analyses of rectal MRI ...
<p>The inter-observer differences for the main continuous radiological features are obtained through...
Intraclass correlation coefficient (ICC) inter-observer agreement for MRI- and US-derived prostate ...
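The line above reports ICC for continuous measurements. The caption does not say which ICC variant was used; a sketch of one common choice, ICC(2,1) (two-way random effects, absolute agreement, single measurement), built from the two-way ANOVA mean squares:

```python
import numpy as np

def icc2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single
    measurement. `ratings` is an (n subjects x k raters) array."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-subject means
    col_means = ratings.mean(axis=0)   # per-rater means
    # Mean squares from the two-way ANOVA decomposition
    ms_rows = k * ((row_means - grand) ** 2).sum() / (n - 1)
    ms_cols = n * ((col_means - grand) ** 2).sum() / (k - 1)
    resid = ratings - row_means[:, None] - col_means[None, :] + grand
    ms_err = (resid ** 2).sum() / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n
    )

# Hypothetical example: two observers measuring volume in four subjects
print(round(icc2_1([[1, 2], [2, 2], [3, 4], [4, 4]]), 3))
```

Unlike kappa, the ICC applies to continuous measurements and, in this absolute-agreement form, penalizes systematic offsets between observers as well as random scatter.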
<p>The level of agreement is represented by kappa (κ) value calculated using DWI, MTT, and D-M ASPEC...
<p>Frequency of studies scored with BI-RADS categories 1, 2, 3 and 4 for (a) the complete dataset (n...
<p>* The scale for κ values was as follows: 0.21–0.40, fair agreement; 0.41–0.60, moderate agreement...
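The footnote above quotes an interpretation scale for κ (fair and moderate bands shown; the caption is truncated after "moderate agreement"). A small helper mapping κ to that scale, assuming the remaining bands follow the conventional Landis-Koch scale:

```python
def interpret_kappa(k):
    """Map a kappa value to the agreement scale in the table footnote.
    Bands above 0.60 are assumed to follow the conventional Landis-Koch
    scale, since the source caption is truncated."""
    if k <= 0.20:
        return "slight agreement"
    if k <= 0.40:
        return "fair agreement"        # 0.21-0.40, per the footnote
    if k <= 0.60:
        return "moderate agreement"    # 0.41-0.60, per the footnote
    if k <= 0.80:
        return "substantial agreement"
    return "almost perfect agreement"

print(interpret_kappa(0.55))  # moderate agreement
```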