Algorithm 4: Extreme Gradient Boosting (XGBoost) algorithm
Input: Extracted feature data with labels y and test data
Output: Class labels (glaucoma (GA), diabetic retinopathy (DR), cataract (CT), and normal (NL)).
Process:
Step 1. Initialize the model with a constant prediction and define the classifier parameters to be optimized.
Step 2. Construct an XGBoost tree by minimizing the loss function in Equation (2); the XGBoost classifier is trained on the extracted feature samples.
Step 3. Repeat Step 2, adding trees until the model reaches the stopping condition.
Step 4. Assign each test sample a class label using the decision function of Equation (2).