Entropy. 2019 Sep 16;21(9):897. doi: 10.3390/e21090897
Algorithm 1 Constructing a PCC-Tree
  • Require: A root node X = {x_i}_{i=1}^N, where x_i is the i-th instance, with n condition attributes {A_k}_{k=1}^n and one decision attribute D; a stopping criterion ε.

  • Ensure: A PCC-Tree.

  • if all the samples in X belong to the same class then

  •   Mark X as a leaf node and assign the class as its label.

  •   return.

  • end if

  • for each attribute A_k, k = 1, 2, …, n, in X do

  •   for each value c_j in A_k do

  •    Compute the Pearson correlation coefficient of two vectors: P_{c_j}(A_k) = P(V(A_k, c_j), V(D)),

  •    where P(·,·) denotes Pearson's correlation coefficient and V(·) denotes a vector.

  •   end for

  •   c*_{jk} = argmax_{c_j} P_{c_j}(A_k).

  • end for

  •  Get the best attribute A_{k*} and the splitting point c_{k*} = c*_{jk*}, where k* = argmax_k P_{c*_{jk}}(A_k).

  •  Let p(X) denote the proportion of samples covered by X.

  • if p(X)<ε then

  •   Mark X as a leaf node.

  •   Assign the majority class of the samples in X to this leaf node.

  •   return.

  • else

  •   Split X into two subsets X_1 and X_2 based on A_{k*} and c_{k*}.

  •   if p(X_1) == 0 or p(X_2) == 0 then

  •    Mark X as a leaf node.

  •    Assign the majority class of the samples in X to this leaf node.

  •    return.

  •   end if

  •   Recursively grow new tree nodes from X_1 and X_2 by applying Algorithm 1 to each subset.

  • end if
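The procedure above can be sketched in Python. This is a minimal sketch under stated assumptions, not the authors' implementation: it assumes V(A_k, c_j) is the binary indicator of whether a sample's value on A_k is at most c_j, that the absolute value of the correlation is used for scoring, and that a degenerate split falls back to a majority-class leaf — the excerpt does not fully specify V or these edge cases.

```python
import numpy as np

def pearson(u, v):
    # Pearson's correlation coefficient of two vectors; returns 0 when
    # either vector is constant (no linear relationship is measurable).
    u = u - u.mean()
    v = v - v.mean()
    denom = np.sqrt((u ** 2).sum() * (v ** 2).sum())
    return 0.0 if denom == 0 else float((u * v).sum() / denom)

def majority_leaf(y):
    # Leaf labelled with the majority class of the samples it covers.
    vals, counts = np.unique(y, return_counts=True)
    return {"leaf": True, "label": vals[counts.argmax()]}

def build_pcc_tree(X, y, eps, n_total=None):
    # X: (N, n) array of condition attributes; y: (N,) decision attribute.
    if n_total is None:
        n_total = len(y)           # size of the root sample set
    if len(np.unique(y)) == 1:     # all samples share one class -> leaf
        return {"leaf": True, "label": y[0]}
    # Score every candidate split point on every attribute by the
    # (absolute) correlation between the split indicator and decisions.
    best_score, best_k, best_c = -1.0, None, None
    for k in range(X.shape[1]):
        for c in np.unique(X[:, k]):
            ind = (X[:, k] <= c).astype(float)  # assumed form of V(A_k, c_j)
            score = abs(pearson(ind, y.astype(float)))
            if score > best_score:
                best_score, best_k, best_c = score, k, c
    if len(y) / n_total < eps:     # p(X) < eps: node covers too few samples
        return majority_leaf(y)
    mask = X[:, best_k] <= best_c
    if mask.all() or not mask.any():  # one side empty -> leaf
        return majority_leaf(y)
    return {"leaf": False, "attr": best_k, "split": best_c,
            "left": build_pcc_tree(X[mask], y[mask], eps, n_total),
            "right": build_pcc_tree(X[~mask], y[~mask], eps, n_total)}
```

On a toy one-attribute data set where the classes separate perfectly at a threshold, the sketch produces a single split with two pure leaves, mirroring the first stopping rule of the pseudocode.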