
Table 2 Performance evaluation of selected algorithms

From: Which are best for successful aging prediction? Bagging, boosting, or simple machine learning algorithms?

| Algorithm | Hyper-parameters | Sensitivity | Specificity | Accuracy | F-score |
|---|---|---|---|---|---|
| RF | Maximum depth = 6; maximum number of iterations = 50; number of execution slots = 1; bag size percent = 100; break ties randomly = true | 0.95 ± 0.01 | 0.94 ± 0.01 | 0.94 ± 0.005 | 0.94 ± 0.003 |
| Bagging | Bag size percent = 100; classifier = REPTree; maximum number of iterations = 15; number of decimal places = 2; number of execution slots = 1 | 0.84 ± 0.02 | 0.84 ± 0.01 | 0.84 ± 0.01 | 0.84 ± 0.02 |
| AdaBoost | Batch size = 100; classifier = Decision Stump; maximum number of iterations = 20; weight threshold = 50; minimum number of instances per leaf = 1 | 0.88 ± 0.01 | 0.86 ± 0.02 | 0.88 ± 0.01 | 0.86 ± 0.01 |
| XGBoost | Maximum depth = 8; classifier = Decision Stump; base score = 4; min child weight = 1; booster = gbtree | 0.90 ± 0.01 | 0.88 ± 0.02 | 0.89 ± 0.015 | 0.88 ± 0.01 |
| MLP | Number of hidden layers = 10; learning rate = 0.25; momentum = 0.2; validation threshold = 20; maximum number of iterations = 20; normalize numeric values and attributes = true | 0.77 ± 0.03 | 0.75 ± 0.04 | 0.76 ± 0.035 | 0.76 ± 0.035 |
| SVM | Kernel type = RBF; regularization parameter (C) = 10; gamma = 10; RBF gamma = 0.1; degree for increasing dimensions = 3 | 0.80 ± 0.03 | 0.79 ± 0.03 | 0.79 ± 0.03 | 0.79 ± 0.03 |
| J48 | Confidence factor = 0.2; minimum number of objects = 2; number of folds = 3; binary split = false; reduced-error pruning = true | 0.72 ± 0.04 | 0.72 ± 0.04 | 0.70 ± 0.04 | 0.72 ± 0.04 |
| NB | Use kernel estimator = true; use supervised discretization = true; batch size = 100; number of decimal places = 100 | 0.68 ± 0.04 | 0.65 ± 0.05 | 0.69 ± 0.045 | 0.66 ± 0.04 |
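The hyper-parameter names in the table (REPTree, J48, Decision Stump, bag size percent, execution slots) read as WEKA settings. As a rough illustration only, the sketch below approximates the best-performing row (RF: 50 trees, maximum depth 6, one execution slot) in scikit-learn and computes the table's four metrics from a confusion matrix. This is not the authors' code: `make_classification` is a placeholder for the study's successful-aging dataset, and the parameter mapping (`n_estimators`, `max_depth`, `n_jobs`) is an assumed equivalence between WEKA and scikit-learn options.

```python
# Minimal sketch, assuming a binary successful-aging label (0/1).
# Synthetic data stands in for the study's dataset, which is not public here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import confusion_matrix, f1_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Table 2's RF row, mapped to assumed scikit-learn equivalents:
# 50 iterations -> n_estimators, maximum depth 6 -> max_depth,
# one execution slot -> n_jobs=1.
rf = RandomForestClassifier(n_estimators=50, max_depth=6, n_jobs=1,
                            random_state=0)
rf.fit(X_tr, y_tr)
y_pred = rf.predict(X_te)

# The four reported metrics, derived from the binary confusion matrix.
tn, fp, fn, tp = confusion_matrix(y_te, y_pred).ravel()
sensitivity = tp / (tp + fn)          # true-positive rate (recall)
specificity = tn / (tn + fp)          # true-negative rate
accuracy = (tp + tn) / (tp + tn + fp + fn)
f_score = f1_score(y_te, y_pred)      # harmonic mean of precision and recall
print(f"Sens={sensitivity:.2f} Spec={specificity:.2f} "
      f"Acc={accuracy:.2f} F={f_score:.2f}")
```

The table's ± terms suggest the paper reports these metrics as an average with a spread over repeated evaluation (e.g., cross-validation folds); reproducing that would mean running the block above once per fold and aggregating, rather than the single split shown here.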