<p>The slice rejection criterion of the gold standard was changed to 1% (<b>a</b>), 1.5% (<b>b</b>), 2.0% (<b>c</b>), and 2.5% (<b>d</b>) of the FOV, and the performance of GMM (red), GMM-cISID (blue), and cISID (black) was evaluated.</p>
<p>Each point in the ROC curve is obtained by choosing a different threshold for calling differentia...
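The caption above describes the standard construction of an ROC curve: each point corresponds to one score threshold, below which calls are rejected. A minimal sketch of that threshold sweep, using synthetic scores and labels (not data from any of these figures):

```python
# Synthetic example: sweep a score threshold and record (FPR, TPR) at each step.
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.3]
labels = [1, 1, 0, 1, 0, 1, 0, 0]  # 1 = a true positive case

def roc_point(threshold):
    """One ROC point: call everything with score >= threshold positive."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    p = sum(labels)          # number of true positives in the data
    n = len(labels) - p      # number of true negatives
    return fp / n, tp / p    # (false-positive rate, true-positive rate)

# Sweeping thresholds from strict to lenient traces the curve left to right.
curve = [roc_point(t) for t in sorted(set(scores), reverse=True)]
```

Lowering the threshold can only add calls, so both FPR and TPR are non-decreasing along the sweep; that monotonicity is what makes the set of points a curve.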
<p>Following the notation in the paper: PBF (dark green), DS (orange), VAR (blue), DM (pink), TVM (l...
<p>(A) ROC curves for Multifind, Multifind trained without ensemble defect Z score, RNAz, LocaRNATE+...
<p>Each plot represents a ROC curve as we vary the global BF threshold for BRVD and BRI, and vary th...
Area under the curve (AUC) values are 0.7679 (95% CI 0.64768 to 0.88812), 0.864 (95% CI 0...
(A) Performance of the model in the training set, which showed an AUC value of 0.768, an optimal cut...
<p>The areas under the ROC curves (AUC) are 0.93, 0.96, 0.90, and 0.94 for iMcRNA-PseSSC, iMcRNA-E...
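AUC values like those quoted in these captions are typically computed as the trapezoidal area under the (FPR, TPR) points. A minimal sketch, assuming the points are sorted by increasing FPR (the ROC points below are illustrative, not taken from any figure):

```python
def auc(points):
    """Trapezoidal area under a list of (fpr, tpr) points sorted by fpr."""
    area = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0  # trapezoid for this segment
    return area

# Illustrative curve; a random classifier's diagonal would give AUC = 0.5.
roc = [(0.0, 0.0), (0.1, 0.6), (0.4, 0.9), (1.0, 1.0)]
estimated_auc = auc(roc)
```

An AUC of 1.0 means perfect separation, while 0.5 matches random guessing, which is why the values around 0.90-0.96 quoted above indicate strong classifiers.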
<p>The ROC curves for evaluating the quality of the algorithms over the entire test datasets.</p>
<p>This figure plots the ROC curve for the comparison between GGA-method and MDS-method using the CS...
<p>ROC curves for different classification methods (SVM only, GA+SVM, GA+SVM+Post Spike Matching).</p>
<p>ROC curves showing true-positive rates (sensitivity) plotted against the false-positive rate for ...
<p>In each figure, the solid (<i>blue</i>) curve corresponds to the cross validation test on the sam...
<p>For each threshold, we build a linear formula using the corresponding features and the training dataset. Th...