Sensors. 2019 Mar 8;19(5):1193. doi: 10.3390/s19051193
Algorithm 1 Decision tree generation algorithm
  1. Construct the training set from sensor data

    X is an R×M matrix, where Xij denotes the j-th feature of the i-th sample.

    Y is an R×1 matrix, where Yi denotes the class label of the i-th sample.

  2. Build a decision tree

   If all samples in X are identical, or all class labels in Y are identical, or R < 2, generate a leaf node whose class is the majority class in Y.
   else:
    Randomly select m of the M features.
    Among these m features, let p be the feature with the maximum information gain.
    If the value of feature p is discrete:
     For each value V of feature p:
      Let XV be the set of samples whose feature p equals V, and YV the corresponding class labels.
      Childv = Generate(XV, YV)
     Return a decision tree node
    If the value of feature p is continuous:
     Let t be the best split threshold.
     Let XLO be the set of samples whose feature p is less than t, and YLO the corresponding class labels.
     Childlo = Generate(XLO, YLO)
     Let XHI be the set of samples whose feature p is greater than or equal to t, and YHI the corresponding class labels.
     Childhi = Generate(XHI, YHI)
     Return a decision tree node
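The recursion above can be sketched in Python. This is a minimal illustration, not the authors' implementation: it covers only the continuous-feature branch of Algorithm 1, uses Shannon entropy for the information gain, and the names `generate`, `predict`, `child_lo`, and `child_hi` are illustrative choices rather than identifiers from the paper.

```python
import numpy as np
from collections import Counter

def entropy(y):
    """Shannon entropy of a label vector."""
    counts = np.array(list(Counter(y.ravel()).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def generate(X, Y, m, rng):
    """Recursively build a decision tree node, mirroring the
    continuous-feature branch of Algorithm 1."""
    R, M = X.shape
    # Stopping criteria: identical samples, pure labels, or R < 2
    # -> leaf node labelled with the majority class
    if (R < 2 or len(np.unique(Y)) == 1
            or all(len(np.unique(X[:, j])) == 1 for j in range(M))):
        return {"leaf": True, "class": Counter(Y.ravel()).most_common(1)[0][0]}

    # Randomly select m of the M features, then pick the feature p
    # and threshold t with the maximum information gain
    features = rng.choice(M, size=min(m, M), replace=False)
    base = entropy(Y)
    best = None  # (gain, feature p, threshold t)
    for p in features:
        for t in np.unique(X[:, p])[1:]:  # candidate split thresholds
            lo = X[:, p] < t
            if lo.all() or (~lo).all():
                continue
            gain = base - (lo.mean() * entropy(Y[lo])
                           + (~lo).mean() * entropy(Y[~lo]))
            if best is None or gain > best[0]:
                best = (gain, p, t)
    if best is None:  # no usable split found -> leaf
        return {"leaf": True, "class": Counter(Y.ravel()).most_common(1)[0][0]}

    _, p, t = best
    lo = X[:, p] < t
    return {
        "leaf": False, "feature": p, "threshold": t,
        "child_lo": generate(X[lo], Y[lo], m, rng),    # samples with value < t
        "child_hi": generate(X[~lo], Y[~lo], m, rng),  # samples with value >= t
    }

def predict(node, x):
    """Follow the tree down to a leaf and return its class."""
    while not node["leaf"]:
        node = node["child_lo"] if x[node["feature"]] < node["threshold"] else node["child_hi"]
    return node["class"]
```

Using it on a toy R×M matrix (here R = 4, M = 1) built as in step 1, `generate(X, Y, m=1, rng=np.random.default_rng(0))` returns a root node splitting on the single feature, and `predict` routes new samples to the leaf on the matching side of the threshold.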