Algorithm 1 Constructing a PCC-Tree
Require: A root node X = {x_1, x_2, ..., x_m}, where x_i is the i-th instance with n condition attributes and one decision attribute D; the stopping criterion ε.
Ensure: A PCC-Tree.
if the samples in X all belong to the same class then
Mark X as a leaf node and assign the class as its label.
return.
end if
for each condition attribute A_j in X do
for each candidate value v of A_j do
Compute the Pearson's correlation coefficient P(V_{A_j, v}, V_D) of the two vectors V_{A_j, v} and V_D, where P denotes Pearson's correlation coefficient and V denotes one vector (V_{A_j, v} is induced by splitting A_j at v, and V_D by the decision attribute D).
end for
Record the maximum |P| over all candidate values of A_j.
end for
Get the best attribute A* and the splitting point v*, where (A*, v*) = argmax_{A_j, v} |P(V_{A_j, v}, V_D)|.
Suppose r is the proportion of samples covered by X.
if r < ε then
Mark X as leaf node.
Assign the majority class of the samples in X to this leaf node.
return
else
Split X into two subsets X_L and X_R, based on A* and v*.
if X_L = ∅ or X_R = ∅ then
Mark X as a leaf node.
Assign the majority class of the samples in X to this leaf node.
return
end if
Recursively grow the tree from X_L and X_R by applying Algorithm 1 to each.
end if
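Algorithm 1 can be sketched in Python roughly as follows. This is a minimal illustration, not the authors' implementation: the names (`build_pcc_tree`, `epsilon`) are invented, and it assumes that the two correlated vectors are the binary indicator of the candidate split (attribute value ≤ threshold) and the decision attribute D, and that the stopping criterion compares the fraction of samples a node covers against ε.

```python
import numpy as np

def pcc_split_score(feature, labels, value):
    # Assumed scoring: |Pearson correlation| between the binary split
    # indicator (feature <= value) and the decision attribute.
    indicator = (feature <= value).astype(float)
    if indicator.std() == 0 or np.asarray(labels, dtype=float).std() == 0:
        return 0.0  # correlation is undefined for a constant vector
    return abs(np.corrcoef(indicator, labels)[0, 1])

def build_pcc_tree(X, y, epsilon, n_total=None):
    """Grow a PCC-Tree as nested dicts; `epsilon` is the stopping
    criterion on the proportion of samples a node covers (an
    assumption in this sketch)."""
    n_total = len(y) if n_total is None else n_total
    classes, counts = np.unique(y, return_counts=True)
    majority = classes[np.argmax(counts)]
    # Leaf: pure node, or node covering too small a fraction of the data.
    if len(classes) == 1 or len(y) / n_total < epsilon:
        return {"leaf": majority}
    # Exhaustive search for the attribute/value pair with the largest |PCC|.
    best_score, best_attr, best_val = 0.0, None, None
    for j in range(X.shape[1]):
        for v in np.unique(X[:, j]):
            s = pcc_split_score(X[:, j], y, v)
            if s > best_score:
                best_score, best_attr, best_val = s, j, v
    if best_attr is None:
        return {"leaf": majority}
    left = X[:, best_attr] <= best_val
    # Guard against a degenerate split that leaves one subset empty.
    if left.all() or not left.any():
        return {"leaf": majority}
    return {
        "attr": best_attr,
        "value": best_val,
        "left": build_pcc_tree(X[left], y[left], epsilon, n_total),
        "right": build_pcc_tree(X[~left], y[~left], epsilon, n_total),
    }
```

For binary 0/1 labels the indicator–label correlation is well defined as written; for multi-class problems the decision vector V_D would need a numeric encoding, which the paper's definition of V_D would determine.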