Algorithm 1: General learning algorithm for decision trees with two classes (A and B)
1. Create a root node v for the decision tree. Let D[v] denote the learning data sample.
2. If D[v] contains examples of class A only, mark the node v as terminal and assign the class label A. Return.
3. If D[v] contains examples of class B only, mark the node v as terminal and assign the class label B. Return.
4. Try to find the most discriminant attribute, and a split condition on it, for the data sample D[v]; if no such split can be found, mark the node v as terminal and assign the class label of the majority of D[v]. Return.
5. Add two edges from the node v to two new nodes, v1 and v2. Split the data sample D[v] into two new samples, D[v1] and D[v2], according to the condition found.
6. Run the algorithm for v1 and D[v1].
7. Run the algorithm for v2 and D[v2].

A minimal code sketch of this procedure is given below.
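The following Python sketch illustrates Algorithm 1; it is not the authors' implementation. It assumes numeric attributes, binary labels "A"/"B", split conditions of the form x[attr] <= threshold, and information gain as the criterion for the "most discriminant attribute and condition" in step 4; all names (Node, best_split, build_tree) are illustrative.

```python
# Illustrative sketch of Algorithm 1 (not the authors' code).
# Assumptions: numeric attributes, labels "A"/"B", threshold splits, information gain.

import math
from dataclasses import dataclass
from typing import List, Optional, Tuple


@dataclass
class Node:
    label: Optional[str] = None          # set for terminal (leaf) nodes
    attribute: Optional[int] = None      # index of the splitting attribute
    threshold: Optional[float] = None    # split condition: x[attribute] <= threshold
    left: Optional["Node"] = None        # child for examples satisfying the condition
    right: Optional["Node"] = None       # child for the remaining examples


def entropy(labels: List[str]) -> float:
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    result = 0.0
    for cls in set(labels):
        p = labels.count(cls) / total
        result -= p * math.log2(p)
    return result


def best_split(data: List[Tuple[List[float], str]]) -> Optional[Tuple[int, float]]:
    """Step 4: search for the attribute/threshold pair with the highest information gain."""
    labels = [y for _, y in data]
    base = entropy(labels)
    best, best_gain = None, 0.0
    n_attributes = len(data[0][0])
    for attr in range(n_attributes):
        for thr in sorted({x[attr] for x, _ in data}):
            left = [y for x, y in data if x[attr] <= thr]
            right = [y for x, y in data if x[attr] > thr]
            if not left or not right:
                continue  # the condition must actually split the sample
            gain = base - (len(left) / len(data)) * entropy(left) \
                        - (len(right) / len(data)) * entropy(right)
            if gain > best_gain:
                best, best_gain = (attr, thr), gain
    return best


def build_tree(data: List[Tuple[List[float], str]]) -> Node:
    """Recursive learning procedure following steps 1-7 of Algorithm 1."""
    labels = [y for _, y in data]
    # Steps 2-3: the sample is pure, so the node becomes terminal.
    if all(y == "A" for y in labels):
        return Node(label="A")
    if all(y == "B" for y in labels):
        return Node(label="B")
    # Step 4: try to find the most discriminant attribute and condition.
    split = best_split(data)
    if split is None:
        majority = max(set(labels), key=labels.count)
        return Node(label=majority)
    attr, thr = split
    # Step 5: split D[v] into D[v1] and D[v2] according to the condition found.
    d1 = [(x, y) for x, y in data if x[attr] <= thr]
    d2 = [(x, y) for x, y in data if x[attr] > thr]
    # Steps 6-7: run the algorithm for v1 with D[v1] and for v2 with D[v2].
    return Node(attribute=attr, threshold=thr,
                left=build_tree(d1), right=build_tree(d2))
```

For example, calling build_tree([([1.0], "A"), ([3.0], "B")]) returns a root node that tests x[0] <= 1.0 and has two terminal children labeled A and B, mirroring steps 5-7 after a single split.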