Algorithm 1 Decision tree generation algorithm
else:
    Select m features at random from the full feature set. Among these features, let A denote the feature with the maximum information gain.
    If the values of feature A are discrete:
        For each value a of A, let D_a be the set of samples whose feature A takes the value a, and let Y_a be its corresponding classes.
        Return a decision tree node with one branch (D_a, Y_a) per value a.
    If the values of feature A are continuous:
        Let t be the best split threshold.
        Let D_1 be the set of samples whose value of feature A is less than t, and let Y_1 be its corresponding classes.
        Let D_2 be the set of samples whose value of feature A is greater than or equal to t, and let Y_2 be its corresponding classes.
        Return a decision tree node with branches (D_1, Y_1) and (D_2, Y_2).
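The branch above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the function names (`grow_node`, `info_gain_discrete`, `best_threshold`), the dictionary-based node representation, and the `types` map marking each feature as discrete or continuous are all assumptions introduced here. It samples m candidate features, picks the one with the maximum information gain, and splits either per discrete value or at the best continuous threshold.

```python
import math
import random
from collections import Counter

def entropy(labels):
    # Shannon entropy of a class-label list, in bits
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain_discrete(X, y, f):
    # Information gain of splitting on discrete feature f (one branch per value)
    base, n, rem = entropy(y), len(y), 0.0
    for v in set(row[f] for row in X):
        idx = [i for i, row in enumerate(X) if row[f] == v]
        rem += len(idx) / n * entropy([y[i] for i in idx])
    return base - rem

def best_threshold(X, y, f):
    # Best binary split threshold t for continuous feature f; returns (gain, t).
    # Candidate thresholds are midpoints between consecutive distinct values.
    vals = sorted(set(row[f] for row in X))
    base, n, best = entropy(y), len(y), (-1.0, None)
    for a, b in zip(vals, vals[1:]):
        t = (a + b) / 2
        left = [y[i] for i, row in enumerate(X) if row[f] < t]
        right = [y[i] for i, row in enumerate(X) if row[f] >= t]
        gain = base - (len(left) / n * entropy(left) + len(right) / n * entropy(right))
        if gain > best[0]:
            best = (gain, t)
    return best

def grow_node(X, y, feats, types, m, rng):
    # Recursively build a decision tree node, as in the algorithm's else-branch
    if len(set(y)) == 1:            # pure node: stop with a leaf
        return {"leaf": y[0]}
    cand = rng.sample(feats, min(m, len(feats)))  # m random candidate features
    scored = []
    for f in cand:
        if types[f] == "discrete":
            scored.append((info_gain_discrete(X, y, f), f, None))
        else:
            g, t = best_threshold(X, y, f)
            scored.append((g, f, t))
    gain, f, t = max(scored)        # feature with maximum information gain
    if gain <= 0:                   # no useful split: majority-class leaf
        return {"leaf": Counter(y).most_common(1)[0][0]}
    if t is None:                   # discrete feature: one child per value a
        node = {"feature": f, "children": {}}
        for v in set(row[f] for row in X):
            idx = [i for i, r in enumerate(X) if r[f] == v]
            node["children"][v] = grow_node(
                [X[i] for i in idx], [y[i] for i in idx], feats, types, m, rng)
        return node
    # continuous feature: split into D_1 (values < t) and D_2 (values >= t)
    node = {"feature": f, "threshold": t}
    li = [i for i, r in enumerate(X) if r[f] < t]
    ri = [i for i, r in enumerate(X) if r[f] >= t]
    node["lt"] = grow_node([X[i] for i in li], [y[i] for i in li], feats, types, m, rng)
    node["ge"] = grow_node([X[i] for i in ri], [y[i] for i in ri], feats, types, m, rng)
    return node
```

For example, on the one-feature dataset `X = [[1.0], [2.0], [8.0], [9.0]]`, `y = [0, 0, 1, 1]` with feature 0 marked continuous, the root node splits at t = 5.0 and both children are pure leaves.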