Algorithm 4: Extreme Gradient Boosting (XGBoost) algorithm
Input: Extracted feature data $x = (x_1, x_2, \ldots, x_n)$ with labels $y$ and test data $x_{\mathrm{test}}$
Output: Class labels (glaucoma (GA), diabetic retinopathy (DR), cataract (CT), and normal (NL)).
Process:
Step 1. Initialize the model with a constant prediction, $\hat{y}_i^{(0)} = f_0(x_i) = 0$, and define the parameters of the classifier.
Step 2. Construct an XGBoost tree by minimizing the loss function of Equation (2). The XGBoost classifier is trained on the feature samples $x = (x_1, x_2, \ldots, x_n)$.
Step 3. Repeat Step 2 until the model reaches the stopping condition.
Step 4. Test samples $x_{\mathrm{test}}$ are assigned class labels using the decision function of Equation (2).
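For concreteness, the following is a minimal sketch of this training-and-prediction flow using the xgboost Python library's scikit-learn interface. The file names, label encoding, and hyperparameter values are illustrative assumptions, not the paper's reported settings, and the regularized loss of Equation (2) is the library's built-in multi-class objective here.

```python
# Minimal sketch of Algorithm 4, assuming pre-extracted feature arrays.
import numpy as np
from xgboost import XGBClassifier

# Extracted feature data x = (x_1, ..., x_n) with labels y (hypothetical files).
X = np.load("features_train.npy")      # shape: (n_samples, n_features)
y = np.load("labels_train.npy")        # 0=GA, 1=DR, 2=CT, 3=NL (assumed encoding)
X_test = np.load("features_test.npy")  # test data x_test

# Step 1: the booster starts from a constant base prediction.
# Steps 2-3: trees are added round by round, each fitted to minimize the
# regularized loss, until n_estimators boosting rounds are completed
# (an illustrative stopping condition).
clf = XGBClassifier(
    objective="multi:softprob",  # multi-class probability output
    n_estimators=200,
    learning_rate=0.1,
    max_depth=6,
)
clf.fit(X, y)

# Step 4: assign class labels to the test samples via the learned
# decision function (argmax over the per-class probabilities).
labels = {0: "glaucoma (GA)", 1: "diabetic retinopathy (DR)",
          2: "cataract (CT)", 3: "normal (NL)"}
y_pred = clf.predict(X_test)
print([labels[int(c)] for c in y_pred[:5]])
```

With "multi:softprob" the model exposes per-class probabilities via predict_proba; "multi:softmax" would return hard labels directly, which is equivalent for the final class assignment above.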