<p><i>Model<sub>ND</sub></i> is trained on <i>Positive<sub>H</sub></i>, <i>Positive<sub>C</sub></i>, and <i>Negative<sub>ND</sub></i>. <i>Model<sub>NDNN</sub></i> is trained on the same positive set and a negative set that includes <i>Negative<sub>ND</sub></i> together with <i>Negative<sub>NN</sub></i>. Accuracy (Acc), precision (Pre), and recall (Rec) were calculated for both kernels, RBF and Polynomial.<...
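<p>As a minimal sketch of this kind of setup, the code below trains two SVM classifiers (RBF and polynomial kernels) and reports accuracy, precision, and recall. The synthetic data and variable names are placeholders, not the positive/negative sets described in the caption.</p>
<pre><code>
# Sketch: SVMs with RBF and polynomial kernels, evaluated by Acc/Pre/Rec.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("rbf", "poly"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    y_pred = clf.predict(X_test)
    print(kernel,
          "Acc=%.3f" % accuracy_score(y_test, y_pred),
          "Pre=%.3f" % precision_score(y_test, y_pred),
          "Rec=%.3f" % recall_score(y_test, y_pred))
</code></pre>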
The group of red solid lines represents the performance of models trained on the augmented training ...
<p>The data set is partitioned into 10 parts (folds) in the outer loop. One fold of the data set is ...
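<p>A minimal sketch of the nested cross-validation partitioning described above, assuming a scikit-learn workflow: the outer 10-fold loop holds out one fold for testing while the inner loop tunes hyperparameters on the remaining folds. The estimator and parameter grid are illustrative placeholders.</p>
<pre><code>
# Sketch: nested 10-fold cross-validation (outer loop estimates error,
# inner loop selects hyperparameters).
from sklearn.datasets import make_classification
from sklearn.model_selection import KFold, GridSearchCV, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
inner = KFold(n_splits=10, shuffle=True, random_state=1)
outer = KFold(n_splits=10, shuffle=True, random_state=2)

search = GridSearchCV(SVC(kernel="rbf"),
                      {"C": [1, 10], "gamma": ["scale", 0.01]},
                      cv=inner)
scores = cross_val_score(search, X, y, cv=outer)
print("outer-fold accuracies:", scores)
</code></pre>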
Abstract. Background: Cross-validation (CV) is an effective method for estimating the prediction error...
<p>A total of 1783 cysteine sequences were used in the positive and negative data sets. Sn, sensitivity; Sp, sp...
<p>The experiment was conducted 10 times using 10-fold cross-validation performed on the training se...
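<p>One way to reproduce such a protocol (10 repetitions of 10-fold cross-validation on the training set) is scikit-learn's RepeatedStratifiedKFold; the classifier and data below are placeholders, not those of the cited experiment.</p>
<pre><code>
# Sketch: 10 x 10-fold cross-validation, averaging accuracy over 100 folds.
from sklearn.datasets import make_classification
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
cv = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
scores = cross_val_score(SVC(), X, y, cv=cv)
print("mean accuracy over 100 folds: %.3f" % scores.mean())
</code></pre>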
<p>Different conditions are considered in each row. The first row is the standard cross-validation (...
<p>(a) Values for TPR, TNR, PPV, and NPV. (b) Values for MCC, BACC, AUC, and ACC.</p>
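<p>For reference, these quantities follow directly from the confusion-matrix counts (TP, TN, FP, FN); the sketch below derives them for an arbitrary pair of label vectors (AUC additionally needs continuous scores). The label and score vectors are placeholders.</p>
<pre><code>
# Sketch: TPR, TNR, PPV, NPV, ACC, BACC, MCC and AUC from a confusion matrix.
from sklearn.metrics import confusion_matrix, matthews_corrcoef, roc_auc_score

y_true  = [1, 1, 1, 0, 0, 0, 1, 0]                   # placeholder labels
y_pred  = [1, 0, 1, 0, 0, 1, 1, 0]                   # placeholder predictions
y_score = [0.9, 0.4, 0.8, 0.2, 0.1, 0.6, 0.7, 0.3]   # placeholder scores for AUC

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
tpr  = tp / (tp + fn)            # sensitivity / recall
tnr  = tn / (tn + fp)            # specificity
ppv  = tp / (tp + fp)            # precision
npv  = tn / (tn + fn)
acc  = (tp + tn) / (tp + tn + fp + fn)
bacc = (tpr + tnr) / 2
mcc  = matthews_corrcoef(y_true, y_pred)
auc  = roc_auc_score(y_true, y_score)
print(tpr, tnr, ppv, npv, acc, bacc, mcc, auc)
</code></pre>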
<p>(<b>a</b>) The results obtained for the 1<sup>st</sup>-level prediction. (<b>b</b>) The results obtained...
<p>Accuracy comparisons of ten experiments with SVM, KNN, BPNN, CNN, ResNet, FA+ResNet models.</p>
<p>The performance of SVM models on the validation dataset with experimentally derived binding affinity ...
<p>Different conditions are considered in each row. The first row is the standard cross-validation (...
<p>Using binary patterns and AA (amino acid) composition [γ <b>(g)</b> (in RBF kernel), c: parameter...
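<p>The γ (g) and c values referred to here are the usual hyperparameters of an RBF-kernel SVM; a typical way to choose them is a cross-validated grid search, sketched below under assumed parameter ranges rather than the grid used in the cited work.</p>
<pre><code>
# Sketch: selecting gamma and C for an RBF-kernel SVM by grid search.
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"gamma": [2**-7, 2**-5, 2**-3], "C": [1, 8, 32]},
                    cv=5)
grid.fit(X, y)
print("best gamma/C:", grid.best_params_)
</code></pre>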
<p>(a) TPR, (b) TNR, (c) PPV, and (d) NPV of five SVM models trained on 5 different data sets (train...
Numerous functions were available in the construction of Multi-Layer Perceptron Neural Network algor...
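<p>As an illustration of how such functions plug into a Multi-Layer Perceptron, the sketch below fits scikit-learn's MLPClassifier with several commonly available activation choices; the data, layer sizes, and settings are placeholders, not those of the cited construction.</p>
<pre><code>
# Sketch: MLP classifiers built with different activation functions.
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=20, random_state=0)
for activation in ("relu", "logistic", "tanh"):
    mlp = MLPClassifier(hidden_layer_sizes=(32,), activation=activation,
                        max_iter=500, random_state=0)
    print(activation, cross_val_score(mlp, X, y, cv=5).mean())
</code></pre>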
<p>Evaluation of multiple classification models including Support Vector Machine (SVM), Random Fores...