AoB Plants. 2016 Feb 12;8:plw009. doi: 10.1093/aobpla/plw009

Table 2.

Optimal parameter settings used in calibrating the BRTs that produced the best-performing introduction–naturalization–invasion models. To reduce overfitting, we used cross-validation, splitting the data into 75 % for training the model and 25 % for testing. We tested a range of learning rates (0.0005–0.1), bag fractions (0.1–0.8) and levels of tree complexity (1–5). The most effective algorithm parameters for our dataset, determined by trial and error, are shown below.

|                 | Introduction model | Naturalization model | Invasion model |
|-----------------|--------------------|----------------------|----------------|
| Sample size (n) |                    |                      |                |
| Full dataset    | 3494               | 514                  | 46             |
| Training data   | 2621               | 386                  | –              |
| Test data       | 873                | 128                  | –              |
| Parameters      |                    |                      |                |
| Learning rate   | 0.001              | 0.001                | 0.001          |
| Tree complexity | 3                  | 3                    | 3              |
| Bag fraction    | 0.5                | 0.5                  | 0.75           |
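The paper does not include code, but the calibration described in the caption maps directly onto standard boosting libraries. The sketch below uses scikit-learn's GradientBoostingClassifier in Python as a stand-in for the BRTs (ecological BRT analyses of this kind are more often fitted with R's gbm/dismo packages); the synthetic data, feature count and number of trees are illustrative assumptions, not values from the study.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in data sized like the introduction model's full dataset
# (n = 3494); the real study used species-level predictors, not these features.
X, y = make_classification(n_samples=3494, n_features=10, random_state=0)

# 75 % of the data for training, 25 % held out for testing, as in the caption.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=0.75, random_state=0
)

# Best settings reported for the introduction and naturalization models:
# learning rate 0.001, tree complexity 3, bag fraction 0.5
# (the invasion model used a bag fraction of 0.75 instead).
brt = GradientBoostingClassifier(
    learning_rate=0.001,  # shrinkage applied to each tree's contribution
    max_depth=3,          # "tree complexity": depth of the individual trees
    subsample=0.5,        # "bag fraction": share of rows sampled per tree
    n_estimators=1000,    # tree count is not reported in the table; a low
                          # learning rate generally requires many trees
    random_state=0,
)
brt.fit(X_train, y_train)
print(f"held-out accuracy: {brt.score(X_test, y_test):.3f}")
```

In this mapping, scikit-learn's subsample argument plays the role of the table's bag fraction and max_depth that of tree complexity; fitting the invasion model as reported would mean setting subsample=0.75.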