<p>The F1, F2, and F3 scores were computed at every point on the ROC curve obtained from the training dataset, and Plot <b>(a)</b> marks the thresholds at which each score was maximal. Because the random forest (RF) fit the training data perfectly (AUC 1.00), its training ROC curve was instead derived from the out-of-bag samples. Plot <b>(b)</b> shows the ROC curve for the test data, with the thresholds chosen from the training data marked.</p>
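<p>Below is a minimal sketch of this threshold-selection procedure, not the authors' code, assuming scikit-learn and pre-split arrays <code>X_train</code>, <code>y_train</code>, <code>X_test</code>, and <code>y_test</code>: the out-of-bag probabilities supply the training ROC curve, the F1-, F2-, and F3-optimal thresholds are found there, and those thresholds are then applied to the test-set scores.</p>
<pre><code>
# Sketch only: choose F-beta-optimal thresholds from a random forest's
# out-of-bag ROC curve, then evaluate them on held-out test data.
# X_train, y_train, X_test, y_test are assumed to exist (binary labels).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_curve, roc_auc_score, fbeta_score

clf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
clf.fit(X_train, y_train)

# Out-of-bag probabilities avoid the perfect (AUC 1.00) in-bag fit.
oob_prob = clf.oob_decision_function_[:, 1]
fpr, tpr, thresholds = roc_curve(y_train, oob_prob)

# Compute the F-beta score at every ROC threshold and keep the maximiser.
best = {}
for beta in (1, 2, 3):
    scores = [fbeta_score(y_train, oob_prob >= t, beta=beta, zero_division=0)
              for t in thresholds]
    best[beta] = thresholds[int(np.argmax(scores))]

# Apply the training-derived thresholds to the test data (plot (b)).
test_prob = clf.predict_proba(X_test)[:, 1]
print("Test AUC:", roc_auc_score(y_test, test_prob))
for beta, t in best.items():
    f = fbeta_score(y_test, test_prob >= t, beta=beta, zero_division=0)
    print(f"F{beta}-optimal threshold {t:.3f}: test F{beta} = {f:.3f}")
</code></pre>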