<p>In each figure, the solid (<i>blue</i>) curve corresponds to the cross-validation test on the same dataset and the dotted (<i>red</i>) curve corresponds to the independent test. The AUC values given in each figure correspond to the values in <a href="http://www.plosone.org/article/info:doi/10.1371/journal.pone.0141551#pone.0141551.t006" target="_blank">Table 6</a>. The x-axis and y-axis show the Specificity and Sensitivity, respectively.</p>
<p>The blue line indicates the prediction scenario using only clinical variables (hippocampal sclero...
<p>The model gives a high area under the receiver operating characteristic curve (AUC) of 0.9277 for the training set and 0.9900 f...
We compute the area under the curve (AUC) for each model and each cohort, where a perfect classifier...
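<p>For readers who want to reproduce an AUC calculation of this kind, the following is a minimal sketch using scikit-learn's roc_auc_score; the labels and scores are invented for illustration and do not come from any of the cohorts described above. Scoring the true labels against themselves shows that a perfect classifier reaches an AUC of 1.0.</p>
<pre><code>
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical binary outcomes (1 = case, 0 = control) and predicted scores
y_true = np.array([1, 1, 0, 1, 0, 0, 1, 0])
y_score = np.array([0.92, 0.85, 0.40, 0.77, 0.15, 0.55, 0.68, 0.30])

auc_model = roc_auc_score(y_true, y_score)      # AUC of the example model
auc_perfect = roc_auc_score(y_true, y_true)     # a perfect classifier reaches 1.0
print(f"model AUC = {auc_model:.3f}, perfect AUC = {auc_perfect:.1f}")
</code></pre>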
<p>The solid curve is the ROC curve for the prediction model established on the basis of the trainin...
<p>A) Discovery data (32,587 SNPs) B) Combined data (32,375 SNPs). Ten sets of training and testing ...
<p>A ROC curve plots the true positive rate (i.e., sensitivity) against the false positive rate (i.e...
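<p>The sketch below illustrates how such a curve can be drawn from predicted scores, assuming scikit-learn and matplotlib are available; the labels and scores are hypothetical stand-ins for one cohort, and the diagonal line marks chance-level performance.</p>
<pre><code>
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve, auc

# Hypothetical labels and predicted scores standing in for one cohort
y_true = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]
y_score = [0.1, 0.35, 0.8, 0.65, 0.2, 0.9, 0.45, 0.7, 0.55, 0.3]

fpr, tpr, thresholds = roc_curve(y_true, y_score)
plt.plot(fpr, tpr, label=f"AUC = {auc(fpr, tpr):.2f}")
plt.plot([0, 1], [0, 1], linestyle="--", label="chance")  # diagonal reference line
plt.xlabel("False positive rate (1 - Specificity)")
plt.ylabel("True positive rate (Sensitivity)")
plt.legend()
plt.show()
</code></pre>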
<p>Training (blue), verification (purple), and validation (red) study ROC curves are plotted with co...
<p>(A) ROC curve of the 349-gene predictive model in the training set (200 samples, AUC = 0.826; <i>p<</...
<p>ROC plot depicts a relative trade-off between true positive rate and false positive rate of the p...
(A) Performance of the model in the training set, which showed an AUC value of 0.768, an optimal cut...
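<p>One common way to select such an optimal cutoff is to maximise Youden's J statistic (sensitivity + specificity - 1) along the ROC curve; whether this particular study used that criterion is not stated in the excerpt. The sketch below, with hypothetical labels and probabilities, shows the idea.</p>
<pre><code>
import numpy as np
from sklearn.metrics import roc_curve

# Hypothetical labels and predicted probabilities
y_true = np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 0])
y_prob = np.array([0.81, 0.43, 0.66, 0.90, 0.22, 0.58, 0.74, 0.12, 0.49, 0.35])

fpr, tpr, thresholds = roc_curve(y_true, y_prob)
j = tpr - fpr                 # Youden's J statistic at each candidate threshold
best = np.argmax(j)           # index of the cutoff that maximises J
print("optimal cutoff:", thresholds[best])
print("sensitivity:", tpr[best], "specificity:", 1 - fpr[best])
</code></pre>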
<p>The AUC values shown in the figure correspond to the values in <a href="http://www.plosone.org/ar...
<p>ROC curves of cross-validated prediction for the super learner (SL) and the logistic step-wise re...
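<p>As a rough illustration of how cross-validated ROC comparisons of this kind can be produced (not the authors' super learner implementation), the sketch below uses scikit-learn's cross_val_predict to obtain out-of-fold probabilities for a logistic regression and, as a purely hypothetical stand-in for the SL ensemble, a random forest; the synthetic dataset and 10-fold setup are assumptions.</p>
<pre><code>
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import roc_auc_score

# Synthetic data standing in for the study cohort
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

models = [("logistic regression", LogisticRegression(max_iter=1000)),
          ("random forest", RandomForestClassifier(random_state=0))]

for name, model in models:
    # Out-of-fold predicted probabilities from 10-fold cross-validation
    prob = cross_val_predict(model, X, y, cv=10, method="predict_proba")[:, 1]
    print(name, "cross-validated AUC:", round(roc_auc_score(y, prob), 3))
</code></pre>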
<p>ROC curves (a plot of true positive rate (Sensitivity) against false positive rate (1-Specificity...
The ROC curves in the training (A), internal validation (B) and external validation (C) groups. The ...
<p><b>(a)</b> The ensemble-based prediction model based on all five combined patterns has an area un...